JOB DETAILS
Data Engineer
Company: Qode
Location: Vietnam
Work Mode: Remote
Posted: May 5, 2026

About The Company
Software development outsourcing expert and trusted software vendor, offering custom application development, web and mobile app development, and strategic technology outsourcing solutions.
Innovate & Dominate with our #1 Software Development Agency.
👉 CodeArray is one of the leading software development companies, having worked with some of the most innovative ideas and brands in the world across industries.
✅ Custom Software Development: We specialize in creating custom software solutions tailored to your business requirements, ensuring efficiency and productivity.
✅ Web and Mobile App Development: Our expertise extends to building responsive, feature-rich web and mobile applications that engage users and deliver a strong ROI.
✅ Quality Assurance: Our rigorous testing processes ensure the functionality and reliability of our software. This guarantees a seamless user experience.
✅ Maintenance and Support: We provide ongoing support and maintenance services, ensuring your software remains up-to-date and secure.
Whether you’re a startup aiming to disrupt the market or an established enterprise seeking digital transformation, CodeArray is here to turn your vision into impactful software solutions.
Join hands with us and experience the difference of working with a leading software development agency. Together, let’s shape the future of technology for your business.
Top Review
👉 CodeArray developed a sophisticated social networking mobile application for us. The project was complex and involved lots of moving parts and minute details. CodeArray demonstrated a high degree of customer-centricity and flexibility during the project with very quick response times and an accommodative approach. We highly recommend the company for complex applications.
About the Role
Our client is looking for a Data Engineer to join their Data Platform team, focusing on building scalable data pipelines and enabling analytics across the organization. In this role, you will work with modern data stack tools like Databricks, AWS, Airflow, Airbyte, and dbt to design and maintain data workflows that support reporting, analytics, and data-driven decisions.
Responsibilities
- Design and build scalable ETL/ELT pipelines using both batch and streaming approaches.
- Develop ingestion workflows from multiple sources such as databases, APIs, and event streams.
- Implement ingestion strategies including full load, incremental load, and CDC.
- Orchestrate data workflows using Apache Airflow (a minimal sketch follows this list).
- Manage data connectors using Airbyte.
- Work with Databricks Lakehouse to build and optimize data processing pipelines.
- Write and optimize complex SQL queries for analytics and transformation.
- Build modular and testable data models using dbt (staging → intermediate → marts).
- Maintain data quality, observability, and reliability across the platform.
- Work with AWS services such as S3, Lambda, EC2, IAM.
- Containerize data services using Docker and Kubernetes (EKS) when needed.
- Document pipelines, data models, and data dictionaries for long-term maintainability.
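For illustration, below is a minimal sketch of the kind of orchestration this role involves: an Airflow DAG that runs an ingestion step and then builds and tests dbt models. It assumes a recent Airflow 2.x; the DAG id, task names, ingestion logic, and dbt project path are all hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_orders() -> None:
    # Placeholder for an ingestion step, e.g. triggering an Airbyte
    # connection sync or pulling an incremental slice from a source API.
    print("ingesting orders")


with DAG(
    dag_id="orders_daily",  # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_orders", python_callable=ingest_orders)

    # dbt builds the staging → intermediate → marts layers described above;
    # the project directory is a made-up example.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    ingest >> transform >> test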
Requirements
- At least 5 years of experience in Data Engineering.
- Strong understanding of data architectures such as Data Lake, Data Warehouse, and Lakehouse.
- Hands-on experience with ETL/ELT pipelines, including batch and streaming processing.
- Familiar with ingestion patterns: full load, incremental, CDC, event-driven.
- Experience working with Databricks (Delta Live Tables, Jobs, Notebooks).
- Strong skills in PySpark or Spark SQL for large-scale data processing (see the sketch after this list).
- Solid understanding of Delta Lake (ACID, time travel, schema evolution).
- Experience with Apache Airflow (DAGs, scheduling, monitoring).
- Experience with Airbyte or similar ingestion tools.
- Strong SQL skills (CTEs, joins, window functions, query optimization).
- Experience with dbt for transformation, testing, and documentation.
- Hands-on experience with AWS (S3, Lambda, IAM, etc.).
- Experience with Docker, Kubernetes (EKS).
- Experience running Airflow or Airbyte on Kubernetes.
- Familiar with data quality tools such as Great Expectations or Soda.
- Experience with Terraform or Infrastructure as Code.
- Exposure to data governance or catalog tools (e.g., Databricks Catalog).
- Experience with CI/CD pipelines (e.g., GitHub Actions).
- Strong Python skills for automation and pipeline scripting.
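As a small illustration of several skills above (PySpark, Delta Lake, window functions, CDC-style deduplication), the sketch below keeps only the latest record per key when promoting a table between lake layers. It assumes a Spark session with the Delta Lake extensions available; the bucket, paths, and column names are invented for the example.

from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("orders-dedup")
    # Delta Lake extensions; on Databricks these come preconfigured.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
    .getOrCreate()
)

# Hypothetical bronze-layer table holding raw change events.
orders = spark.read.format("delta").load("s3://example-lake/bronze/orders")

# Keep only the most recent version of each order, a common
# CDC-style dedup step before publishing a silver table.
latest_first = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())

deduped = (
    orders.withColumn("rn", F.row_number().over(latest_first))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Overwrite for simplicity; a production pipeline would more likely
# use a Delta MERGE for true incremental upserts.
deduped.write.format("delta").mode("overwrite").save("s3://example-lake/silver/orders")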
Benefit Packages:
- Attractive salary range, open to negotiation if you're a strong fit.
- Hybrid/remote-friendly culture: work where you grow best.
- Flexible hours, async teamwork (we respect your focus time).
- Work equipment support.
- Allowance for Certification & Skill Development.
- Year-end bonus & performance-based rewards.
- 15 days of paid leave a year.
- Career growth with personal coaching sessions.
- Open, collaborative team culture - no micromanagement, only trust.
- Tools & AI-powered workflows that make remote work easier.
Key Skills
Databricks, AWS, Apache Airflow, Airbyte, dbt, PySpark, SQL, ETL/ELT, Delta Lake, Docker, Kubernetes, Python, Data Modeling, CDC, Data Lakehouse, Terraform
Categories
Data & Analytics, Technology, Software, Engineering
Benefits
Attractive salary range, Hybrid/remote-friendly culture, Flexible hours, Work equipment support, Allowance for Certification & Skill Development, Year-end bonus, Performance-based rewards, 15 days of paid leave a year, Career growth with personal coaching sessions, Open and collaborative team culture, AI-powered workflows
Job Information
📋 Core Responsibilities
Design and build scalable ETL/ELT pipelines using batch and streaming approaches to support organizational analytics. Maintain data quality and reliability using tools like Databricks, Airflow, and dbt within an AWS environment.
📋 Job Type
Full-time
📊 Experience Level
5-10 years
💼 Company Size
26 employees
📊 Visa Sponsorship
No
💼 Language
English
🏢 Working Hours
40 hours/week
Apply Now →
You'll be redirected to the company's application page.