
Optum

Senior Data Engineer - Remote

Posted Yesterday
In-Office or Remote
Hiring Remotely in Minnetonka, MN
92K-164K Annually
Senior level
Design and maintain data pipelines and models, optimize data processing on cloud platforms, and enforce data governance while collaborating with various stakeholders.
The summary above was generated by AI
Requisition Number: 2351293
For those who want to invent the future of health care, here's your opportunity. We're going beyond basic care to health programs integrated across the entire continuum of care. Join us to start Caring. Connecting. Growing together.
You'll enjoy the flexibility to work remotely* from anywhere within the U.S. as you take on some tough challenges. For all hires in the Minneapolis or Washington, D.C. area, you will be required to work in the office a minimum of four days per week.
Primary Responsibilities:
  • Design, develop, and maintain scalable data pipelines using Python, PySpark, and other modern programming languages to support both batch and streaming workloads
  • Build and optimize data processing frameworks on cloud platforms such as Databricks or Snowflake, ensuring performance, reliability, and cost efficiency
  • Design and implement robust data models, including transactional (OLTP) and dimensional (OLAP) schemas, to support analytics, reporting, and application integration
  • Develop high-quality SQL code, including complex queries, stored procedures, and views, with a focus on performance tuning and efficient data access patterns
  • Create and manage workflow orchestration using Apache Airflow or similar tools, ensuring reliable scheduling, dependency management, and monitoring
  • Implement and enforce data governance and metadata standards through tools such as Microsoft Purview, including data lineage, classification, cataloging, and security policies
  • Build automated data quality and validation frameworks to ensure accuracy, completeness, and reliability of production datasets
  • Collaborate with cross-functional teams, including data architects, analysts, scientists, and business stakeholders, to understand requirements and deliver scalable, well-designed data solutions
  • Lead technical design sessions and code reviews, promoting engineering best practices, reusability, and maintainability
  • Support cloud infrastructure and DevOps practices, including CI/CD pipelines, version control, testing automation, and environment management
  • Monitor and troubleshoot production data pipelines, proactively addressing issues, performance bottlenecks, and system failures
  • Contribute to the evolution of the enterprise data platform, recommending tools, frameworks, and architectures to improve scalability and efficiency

You'll be rewarded and recognized for your performance in an environment that will challenge you, give you clear direction on what it takes to succeed in your role, and provide development for other roles you may be interested in.
Required Qualifications:
  • 7+ years of experience in data engineering, software engineering, or similar disciplines
  • Hands-on experience with Databricks or Snowflake
  • Experience with orchestration tools such as Apache Airflow
  • Experience working with cloud ecosystems (Azure preferred; AWS/GCP acceptable)
  • Advanced SQL skills and experience with OLTP and OLAP data modeling
  • Solid understanding of modern data warehousing, data lake, and ELT/ETL design patterns
  • Solid programming expertise in Python, PySpark, or similar languages
  • If you are offered this position, you will be required to provide extensive personal information to obtain and maintain a suitability or determination of eligibility for a Confidential/Secret or Top Secret security clearance as a condition of your employment
  • United States Citizenship

Preferred Qualifications:
  • Healthcare industry experience, including claims, clinical, FHIR, HL7, or provider data
  • Experience with containerization (Docker, Kubernetes) for data workloads
  • Experience supporting machine learning workflows or analytical data science pipelines
  • Familiarity with data governance tools, especially Microsoft Purview
  • Knowledge of distributed computing concepts and performance tuning

*All employees working remotely will be required to adhere to UnitedHealth Group's Telecommuter Policy
Pay is based on several factors, including but not limited to local labor markets, education, work experience, and certifications. In addition to your salary, we offer a comprehensive benefits package, incentive and recognition programs, an equity stock purchase plan, and 401(k) contributions (all benefits are subject to eligibility requirements). No matter where or when you begin a career with us, you'll find a far-reaching choice of benefits and incentives. The salary for this role will range from $91,700 to $163,700 annually based on full-time employment. We comply with all minimum wage laws as applicable.
Application Deadline: This posting will remain open for a minimum of 2 business days or until a sufficient candidate pool has been collected; it may come down early due to the volume of applicants.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
OptumCare is an Equal Employment Opportunity employer under applicable law and qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status, or any other characteristic protected by local, state, or federal laws, rules, or regulations.
OptumCare is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.

Top Skills

Apache Airflow
AWS
Azure
Databricks
Docker
GCP
Kubernetes
Microsoft Purview
PySpark
Python
Snowflake
SQL
