Primary Skills: Terraform, Python Spark, SQL, Databricks, Privacera

Secondary Skills: Java/J2EE, VB.Net, C#, Python

Scope of Work

  • The team will be working in one of the following areas:
    • Multi-cloud data exploration
      • Terraform infrastructure-as-code for managing AWS infrastructure and deep integration between enterprise tools (Starburst, Privacera, and Databricks) and client's services (LDAP, data decryption)
      • Testing user flows for data analysis, processing, and visualization with Python Spark notebooks and SQL running on distributed compute to join data between AWS S3 and GCP BigQuery
      • Developing data pipelines in Python Spark or SQL to push structured enterprise tool telemetry to our data lake
    • Fine-grained access control for data exploration
      • Terraform infrastructure-as-code for managing AWS infrastructure and deep integration between enterprise tools (Databricks and Privacera)
      • Evaluating Databricks capabilities to sync Hive, Glue, and Unity Catalogs
      • Evaluating Privacera capabilities or building new capabilities (AWS Lambda with Python) to sync client's access policies with Unity Catalog
      • Testing user flows for data analysis, processing, and visualization with Python Spark notebooks on distributed compute or Databricks’ serverless SQL runtime
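As a rough illustration of the policy-sync work described above, the sketch below maps a simplified Privacera-style access policy to Unity Catalog GRANT statements, roughly as the AWS Lambda mentioned above might. The policy shape, function names, and statement syntax are all illustrative assumptions, not the actual integration.

```python
# Hypothetical sketch only: the policy format and GRANT syntax below are
# simplified assumptions for illustration, not Privacera's or Databricks'
# actual APIs.

def policy_to_grants(policy):
    """Map one simplified access policy to Unity Catalog GRANT statements."""
    resource = f"{policy['catalog']}.{policy['schema']}.{policy['table']}"
    statements = []
    for user in policy["users"]:
        for privilege in policy["privileges"]:
            statements.append(
                f"GRANT {privilege} ON TABLE {resource} TO `{user}`"
            )
    return statements

def lambda_handler(event, context):
    """Illustrative Lambda entry point: convert each policy in the event."""
    grants = []
    for policy in event.get("policies", []):
        grants.extend(policy_to_grants(policy))
    # A real sync would execute each statement against a Databricks SQL
    # endpoint; here the statements are simply returned.
    return {"grants": grants}
```

In practice this logic would run on a schedule or in response to policy-change events, with the GRANT statements executed against Databricks rather than returned.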

Responsibilities

  • Develop and implement operational capabilities, tools, and processes that enable highly available, scalable, and reliable customer experiences
  • Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
  • Work cross-functionally with various Intuit teams, including product management, analysts, data scientists, and data infrastructure
  • Work with external enterprise support engineers from Databricks, Starburst, and Privacera to resolve integration questions and issues
  • Work within Agile Development, SCRUM, or Extreme Programming methodologies

Qualifications

  • 8+ years of experience designing and developing web, software, or mobile applications.
  • 3+ years of experience building and operating cloud infrastructure solutions.
  • BS/MS in computer science or equivalent work experience.
  • Expertise with any of the following object-oriented languages: Java/J2EE, C#, VB.NET, Python, or occasionally C++; Java and Python preferred.
  • Expertise with AWS (IAM, VPC), Spark, and Terraform is preferred. Expertise with Databricks is a strong bonus.
  • Expertise with the entire Software Development Life Cycle (SDLC), including system design, code review, unit/integration/performance testing, and build and deploy automation
  • Operational excellence: a demonstrated focus on minimizing costs and maximizing uptime
  • Excellent communication skills: demonstrated ability to explain complex technical topics in an engaging way to both technical and non-technical audiences, in writing and verbally

Senior Data Engineer
