JOB DETAILS
Data Engineer - Systematic Commodities Hedge Fund
Company: Moreton Capital Partners
Location: Ciudad de México
Work Mode: On Site
Posted: February 6, 2026

About The Company
We extract alpha by combining deep fundamental trading experience with cutting-edge quantitative research.
Machines handle complex pattern recognition with speed and precision, while human oversight guides strategic decision-making and adapts to shifting market conditions and regime changes. This combination lets us capture real-time opportunities and manage risk effectively.
About the Role
Moreton Capital Partners is a systematic commodities hedge fund preparing to launch live trading across global futures markets. Our research and trading systems rely on robust, scalable data infrastructure. We are looking for Data Engineers to help us design, build, and optimize that infrastructure alongside senior engineers and the CIO.
Key Responsibilities
You’ll work on projects such as:
- Designing and maintaining data pipelines to collect, clean, and transform market and alternative datasets (e.g., futures, options, weather, satellite, fundamentals).
- Building ETL workflows using Python (pandas/polars) and orchestration tools such as Airflow or Prefect.
- Structuring data warehouses and APIs (SQL, Snowflake, or similar) for efficient query and analysis.
- Developing data quality and monitoring systems for latency, completeness, and integrity.
- Assisting in cloud deployments (AWS, Docker) and automation for data ingestion and versioning.
- Collaborating with Quant Researchers to make research datasets reproducible and production-ready.
- Contributing to internal documentation and code standards to ensure long-term maintainability.
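The pipeline and data-quality duties above can be sketched in miniature as follows. This is an illustrative example only, not Moreton Capital's actual codebase; the column names, schema, and checks are hypothetical stand-ins for a futures ingestion step:

```python
import pandas as pd

def load_futures(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and transform a raw futures snapshot: parse timestamps,
    drop duplicate (symbol, timestamp) rows, and sort for analysis."""
    df = raw.copy()
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)
    df = df.drop_duplicates(subset=["symbol", "timestamp"])
    return df.sort_values(["symbol", "timestamp"]).reset_index(drop=True)

def quality_report(df: pd.DataFrame) -> dict:
    """Basic completeness/integrity checks of the kind a monitoring
    system would run on each ingested batch."""
    return {
        "rows": len(df),
        "null_prices": int(df["price"].isna().sum()),
        "negative_prices": int((df["price"] < 0).sum()),
        "symbols": df["symbol"].nunique(),
    }

# Hypothetical raw batch: one duplicate CL row and one missing price.
raw = pd.DataFrame({
    "symbol": ["CL", "CL", "CL", "NG"],
    "timestamp": ["2026-02-06T10:00Z", "2026-02-06T10:00Z",
                  "2026-02-06T10:01Z", "2026-02-06T10:00Z"],
    "price": [72.5, 72.5, None, 3.1],
})
clean = load_futures(raw)
print(quality_report(clean))  # duplicate CL row removed; one null price flagged
```

In production the same cleaning and checking steps would run inside an orchestrator task (Airflow or Prefect) with the report feeding an alerting system, rather than a `print`.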
Qualifications
- Strong programming skills in Python and familiarity with SQL.
- Understanding of data structures, algorithms, and software engineering best practices.
- Interest in large-scale data systems, cloud computing, or distributed processing.
- Self-starter with curiosity and attention to detail.
Bonus points for:
- Experience with Airflow, Docker, or AWS.
- Familiarity with Snowflake, Polars, or Pandas workflows.
- Exposure to financial or time-series data.
- Understanding of CI/CD, version control, or testing frameworks.
What We Offer
- Real-world impact: Help build data systems that directly feed institutional-grade trading research and live execution.
- Technical depth: Gain hands-on experience with distributed data pipelines, cloud infrastructure, and production data engineering.
- Mentorship: Work closely with senior engineers, the CIO, and Quant Researchers on live projects.
- Collaborative culture: Inclusive, high-trust team that values initiative and learning.
- Compensation: Competitive stipend/salary based on experience.
Key Skills
Python, SQL, Data Pipelines, ETL, Airflow, Cloud Computing, Data Warehousing, Data Quality, Monitoring Systems, Automation, Version Control, Distributed Processing, Financial Data, Time-Series Data, Documentation, Collaboration
Categories
Technology, Finance & Accounting, Data & Analytics, Software
Job Information
📋 Core Responsibilities
You will design and maintain data pipelines to collect and transform various datasets, and build ETL workflows using Python and orchestration tools. Additionally, you will collaborate with Quant Researchers to ensure datasets are production-ready.
📋 Job Type
Contractor
📊 Experience Level
2-5 years
💼 Company Size
8
📊 Visa Sponsorship
No
💼 Language
English
🏢 Working Hours
40 hours