JOB DETAILS

Senior Manager Data Engineering

Company: UnitedHealth Group
Location: Chennai
Work Mode: On Site
Posted: April 28, 2026
About the Company

UnitedHealth Group is a health care and well-being company with a mission to help people live healthier lives and help make the health system work better for everyone. We are 340,000 colleagues in two distinct and complementary businesses working to help build a modern, high-performing health system through improved access, affordability, outcomes and experiences. Optum delivers care aided by technology and data, empowering people, partners and providers with the guidance and tools they need to achieve better health. UnitedHealthcare offers a full range of health benefits, enabling affordable coverage, simplifying the health care experience and delivering access to high-quality care. We work with governments, employers, partners and providers to care for 147 million people and share a vision of a value-based system of care that provides compassionate and equitable care. At UnitedHealth Group, our mission calls us, our values guide us and our diverse culture connects us as we seek to improve care for the consumers we are privileged to serve and their communities.

Click below to search careers or join our social communities:

  • Search & apply for careers at careers.unitedhealthgroup.com/
  • Follow us on Twitter at twitter.com/UnitedHealthGrp
  • Follow and like us on Facebook at facebook.com/unitedhealthgroup
  • Follow us on Instagram at instagram.com/unitedhealthgroup

More about UnitedHealth Group can be found at unitedhealthgroup.com/
About the Role

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

 

Primary Responsibilities:

  • Own the design, build, and operation of reliable data pipelines and curated data products that power analytics, reporting, and downstream applications
  • Set engineering standards for data quality, observability, security, and cost/performance across the data platform
  • Design and implement scalable ETL/ELT pipelines (batch and/or streaming) with strong guarantees: idempotency, incremental processing/CDC, backfills, and schema evolution
  • Model and publish well-defined datasets in a cloud data warehouse/lakehouse (dimensional modeling, partitioning/clustering, performance tuning)
  • Build orchestration workflows (Airflow/Dagster/Prefect) with robust dependency management, retries, SLAs, and operational runbooks
  • Establish data quality and testing practices (unit/integration tests for transformations, validation rules, anomaly detection, and data contract checks)
  • Implement end-to-end observability (structured logging, metrics, lineage/metadata where available, alerting on freshness/volume/drift/failure modes)
  • Partner with Analytics, Product, and Engineering to translate requirements into maintainable data products and clear dataset contracts/documentation (e.g., data dictionaries)
  • Apply security and governance controls for sensitive data (PII/PHI), including least-privilege access, auditing, and retention policies
  • Drive CI/CD and infrastructure-as-code for data workloads (Git-based workflows, automated checks, environment promotion)
  • Mentor engineers, review designs/code, and lead incident response and postmortems for data reliability issues
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
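Several of the pipeline properties named above (idempotency, incremental processing/CDC, backfills, late-arriving data handling) can be illustrated with a minimal sketch. This is a hedged, pure-Python toy, assuming records keyed by `id` with an `updated_at` timestamp; all names are hypothetical, not part of the role's actual stack:

```python
from datetime import datetime

def merge_cdc_batch(target: dict, batch: list) -> dict:
    """Idempotently merge a CDC batch into a keyed target store.

    Keeps the latest version of each record by `updated_at`, so replaying
    the same batch (e.g. during a backfill) leaves the target unchanged,
    and a late-arriving older version never overwrites newer data.
    """
    for record in batch:
        key = record["id"]
        current = target.get(key)
        # Upsert only when the incoming record is newer than what we hold.
        if current is None or record["updated_at"] > current["updated_at"]:
            target[key] = record
    return target

# Usage: replaying the batch is a no-op (idempotency), and a
# late-arriving older record is ignored.
store = {}
batch = [
    {"id": 1, "updated_at": datetime(2026, 4, 1), "status": "new"},
    {"id": 1, "updated_at": datetime(2026, 4, 2), "status": "paid"},
]
merge_cdc_batch(store, batch)
merge_cdc_batch(store, batch)          # replay: no change
late = [{"id": 1, "updated_at": datetime(2026, 3, 30), "status": "draft"}]
merge_cdc_batch(store, late)           # older late arrival is discarded
```

In a real warehouse the same pattern is typically expressed as a `MERGE`/upsert keyed on a business key and ordered by a change timestamp.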


Required Qualifications:

  • Proven experience building and operating data pipelines at scale (incremental loads, CDC patterns, backfills, late-arriving data handling)
  • Experience with workflow orchestration (Airflow/Dagster/Prefect) and operational ownership (SLAs, on-call readiness, runbooks)
  • Hands-on experience with a modern cloud data stack (AWS/GCP/Azure) and a warehouse/lakehouse (e.g., Snowflake/BigQuery/Redshift/Databricks)
  • Experience with CI/CD and IaC (Terraform/CloudFormation) for reproducible, auditable deployments
  • Familiarity with distributed processing (Spark or equivalent) and performance debugging (partitioning, shuffles, skew)
  • Solid understanding of data modeling (star/snowflake, SCDs) and dataset lifecycle management
  • Solid fundamentals in security/compliance for data (encryption, access control, auditing; handling PII/PHI)
  • Advanced SQL: complex joins, window functions, query optimization, and warehouse performance tuning
  • Proven programming skills in Python (preferred) with production engineering practices (packaging, testing, linting, code review)
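The data-quality and observability expectations above (freshness/volume alerting, validation rules) can be sketched as a minimal check. A hedged example, assuming a dataset is summarized by a row count and a last-loaded timestamp; the thresholds and names are illustrative only:

```python
from datetime import datetime, timedelta

def check_dataset(row_count: int, loaded_at: datetime,
                  min_rows: int, max_age: timedelta,
                  now: datetime) -> list:
    """Return a list of violated data-quality rules (empty means healthy)."""
    failures = []
    # Volume rule: a sudden drop below the floor usually signals a broken load.
    if row_count < min_rows:
        failures.append(f"volume: {row_count} rows < expected minimum {min_rows}")
    # Freshness rule: data older than the SLA window should page an owner.
    if now - loaded_at > max_age:
        failures.append(f"freshness: last load at {loaded_at:%Y-%m-%d %H:%M} breaches SLA")
    return failures

# Usage: a healthy table passes; a stale one trips the freshness rule.
now = datetime(2026, 4, 28, 12, 0)
ok = check_dataset(120_000, now - timedelta(hours=2),
                   min_rows=100_000, max_age=timedelta(hours=6), now=now)
stale = check_dataset(120_000, now - timedelta(hours=12),
                      min_rows=100_000, max_age=timedelta(hours=6), now=now)
```

In practice checks like these run inside the orchestrator (Airflow/Dagster/Prefect) after each load and feed the alerting described in the responsibilities.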

 

Preferred Qualifications:

  • Domain experience in healthcare/claims/EHR data (or other regulated environments) and associated compliance practices
  • Streaming/event-driven architectures (Kafka/Kinesis/Pub/Sub), exactly-once/at-least-once tradeoffs, watermarking concepts
  • Data governance/metadata tooling (catalogs, lineage, ownership) and data contract frameworks
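The watermarking concept mentioned above can be shown with a toy event-time watermark: the stream tracks the maximum event time seen minus an allowed lateness, and anything older than that watermark is treated as late. A hedged sketch with hypothetical names, not tied to any specific streaming engine:

```python
from datetime import datetime, timedelta

class Watermark:
    """Toy event-time watermark: max event time seen minus allowed lateness."""

    def __init__(self, allowed_lateness: timedelta):
        self.allowed_lateness = allowed_lateness
        self.max_event_time = None

    def observe(self, event_time: datetime) -> bool:
        """Advance the watermark; return False if the event is behind it."""
        if self.max_event_time is None or event_time > self.max_event_time:
            self.max_event_time = event_time
        watermark = self.max_event_time - self.allowed_lateness
        return event_time >= watermark

# Usage: events within the lateness window are accepted,
# events behind the watermark are flagged as late.
wm = Watermark(allowed_lateness=timedelta(minutes=10))
on_time = wm.observe(datetime(2026, 4, 28, 12, 0))        # advances watermark to 11:50
slightly_late = wm.observe(datetime(2026, 4, 28, 11, 55))  # within lateness window
too_late = wm.observe(datetime(2026, 4, 28, 11, 40))       # behind the watermark
```

The exactly-once/at-least-once tradeoff sits on top of this: late or replayed events must either be deduplicated downstream or routed to a side path for reconciliation.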

 

What success looks like (first 3–6 months)

  • Critical pipelines are reliable, observable, and meet freshness SLAs with clear ownership and runbooks
  • New/updated datasets ship with quality checks and documentation, and changes are managed via versioned contracts
  • Platform cost/performance improves measurably through optimization and better operational discipline

 

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Key Skills
Data engineering, Python, SQL, Cloud data stack, ETL/ELT pipelines, Data modeling, Workflow orchestration, Airflow, Dagster, Prefect, CI/CD, Infrastructure-as-code, Data quality, Distributed processing, Spark, Data governance
Categories
Technology, Data & Analytics, Management & Leadership, Software, Healthcare
Benefits
Comprehensive benefits, Career development opportunities
Job Information
📋 Core Responsibilities
The Senior Manager will design, build, and operate reliable data pipelines and curated data products to support analytics and reporting. They will also establish engineering standards for data quality, security, and performance while mentoring engineering staff.
📋 Job Type
Full time
📊 Experience Level
5–10 years
💼 Company Size
90,823
📊 Visa Sponsorship
No
💼 Language
English
🏢 Working Hours
40 hours