JOB DETAILS

Senior Software Engineer (Data Platform)

Company: Databricks
Location: Bengaluru
Work Mode: On Site
Posted: April 26, 2026
About The Company
Databricks is the Data and AI company. More than 20,000 organizations worldwide — including adidas, AT&T, Bayer, Block, Mastercard, Rivian, Unilever, and over 60% of the Fortune 500 — rely on Databricks to build and scale data and AI apps, analytics and agents. Headquartered in San Francisco with 30+ offices around the globe, Databricks offers a unified Data Intelligence Platform that includes Agent Bricks, Lakeflow, Lakehouse, Lakebase and Unity Catalog.

Note to applicants: Please apply through our official Careers page at databricks.com/company/careers. All official communication from Databricks will come from email addresses ending in @databricks.com or @goodtime.io (our meeting tool).
About the Role

P-1348

At Databricks, we are passionate about enabling data teams to solve the world's toughest problems, from security threat detection to cancer drug development. We do this by building and running the world's best data and AI infrastructure platform, so our customers can focus on the high-value challenges that are central to their own missions. Our engineering teams build technical products that fulfill real, important needs in the world. We constantly push the boundaries of data and AI technology while operating with the resilience, security and scale that are essential to making customers successful on our platform.

We develop and operate one of the largest-scale software platforms in the world. The fleet consists of millions of virtual machines, generating terabytes of logs and processing exabytes of data per day. At this scale we routinely observe cloud hardware, network, and operating system faults, and our software must gracefully shield our customers from all of them.

As a Senior Software Engineer on the Data Platform team, you will help build the Data Intelligence Platform for Databricks, which will allow us to automate decision-making across the entire company. You will achieve this in collaboration with Databricks product teams, Data Science, Applied AI and many others. You will develop a variety of tools spanning logging, orchestration, data transformation, the metrics store, governance platforms, and data consumption layers. You will do this using the latest Databricks products and other tools in the data ecosystem; the team also functions as a large, in-house production customer that dogfoods Databricks and guides the future direction of the product.

The impact you will have:

  • Design and run the Databricks metrics store that enables all business units and engineering teams to bring their detailed metrics into a common platform for sharing and aggregation, with high quality, introspection ability and query performance.
  • Design and run the cross-company Data Intelligence Platform, which contains every business and product metric used to run Databricks. You’ll play a key role in developing the right balance of data protections and ease of shareability for the Data Intelligence Platform as we transition to a public company.
  • Develop tooling and infrastructure to efficiently manage and run Databricks on Databricks at scale, across multiple clouds, geographies and deployment types. This includes CI/CD processes, test frameworks for pipelines and data quality, and infrastructure-as-code tooling.
  • Design the base ETL framework used by all pipelines developed at the company.
  • Partner with our engineering teams to provide leadership in developing the long-term vision and requirements for the Databricks product.
  • Build reliable data pipelines and solve data problems using Databricks, our partner’s products and other OSS tools. Provide early feedback on the design and operations of these products.
  • Establish conventions and create new APIs for telemetry, debug, feature and audit event log data, and evolve them as the product and underlying services change.
  • Represent Databricks at academic and industrial conferences & events.

What we look for:

  • 6+ years of industry experience 
  • 4+ years of experience providing technical leadership on large projects similar to those described above: ETL frameworks, metrics stores, infrastructure management, data security.
  • Experience building, shipping and operating reliable multi-geo data pipelines at scale.
  • Experience working with and operating workflow or orchestration frameworks, including open-source tools like Airflow and dbt or commercial enterprise tools.
  • Experience with large-scale messaging systems such as Kafka or RabbitMQ, or comparable commercial systems.
  • Excellent cross-functional and communication skills, consensus builder.
  • Passion for data infrastructure and for enabling others by making their data easier to access.

Benefits

At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please refer to the Databricks Careers page.

Our Commitment to Diversity and Inclusion

At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance

If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.

Key Skills
Data Infrastructure, ETL Frameworks, Metrics Stores, Infrastructure Management, Data Security, Data Pipelines, Workflow Frameworks, Orchestration Frameworks, Messaging Systems, Cross-Functional Communication, Data Access, CI/CD Processes, Test Frameworks, Infrastructure-as-Code, Telemetry, Debugging
Categories
Technology, Software, Data & Analytics, Engineering
Job Information
📋 Core Responsibilities
Design and run the Databricks metrics store and the cross-company Data Intelligence Platform. Develop tooling and infrastructure to manage Databricks at scale and partner with engineering teams on the long-term vision.
📋 Job Type
Full time
📊 Experience Level
5–10 years
💼 Company Size
15,144
📊 Visa Sponsorship
No
💼 Language
English
🏢 Working Hours
40 hours