We are looking for a skilled Data Engineer (Python / Apache Airflow) to support our logistics client in building and maintaining large-scale data pipelines. This position involves developing ETL processes with Python and Airflow, writing and optimizing complex SQL queries, and integrating data with ERP systems such as Odoo. The ideal candidate has strong technical expertise in data engineering and is comfortable working with high-frequency transactional data in a fast-paced logistics environment.
Responsibilities:
- Design, develop, and maintain data pipelines and ETL workflows using Apache Airflow and Python (see the illustrative sketch after this list).
- Process and integrate large-scale transactional data from multiple platforms (e.g., Genesis and Odoo).
- Write and optimize raw SQL queries to handle high-volume data efficiently (hundreds of thousands to millions of records daily).
- Automate the transfer and transformation of data to ensure seamless information flow between internal and external systems.
- Collaborate with cross-functional teams to translate business requirements into scalable data processes.
- Version-control code with Git and handle deployments with Docker.
- Monitor data integrity, troubleshoot workflow issues, and ensure accurate synchronization between systems.
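To give candidates a concrete picture of the work, here is a minimal sketch of the kind of daily pipeline this role involves: an Airflow DAG (Airflow 2.4+ style) whose single task streams one day's transaction rows through a server-side cursor so memory stays flat at high volumes. The DAG id, table, columns, and connection details are illustrative placeholders, not the client's actual Genesis or Odoo integrations.

    # Minimal sketch of a daily high-volume ETL task; all names are placeholders.
    from datetime import datetime, timedelta

    import psycopg2
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_and_load(**context):
        """Pull one day's transactions in batches and forward them downstream."""
        # Hardcoded credentials are for illustration only; in practice these
        # would come from an Airflow Connection or a secrets backend.
        conn = psycopg2.connect(host="source-db", dbname="tx", user="etl", password="...")
        try:
            # A named (server-side) cursor fetches rows in itersize batches,
            # keeping memory flat even at millions of records per day.
            with conn.cursor(name="daily_tx") as cur:
                cur.itersize = 10_000
                cur.execute(
                    "SELECT id, sku, qty, amount, created_at "
                    "FROM orders WHERE created_at::date = %s",
                    (context["ds"],),  # Airflow's logical date, e.g. '2024-01-01'
                )
                for row in cur:
                    pass  # transform each row and push it to the target system
        finally:
            conn.close()


    with DAG(
        dag_id="daily_transaction_sync",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)

In a production pipeline, the raw psycopg2 connection would typically be replaced by an Airflow Connection (for example, via a provider hook), but the batched-read pattern shown here is the core of the high-volume work described above.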
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of professional experience in Python development, with a focus on data processing or ETL pipelines.
- Strong experience with SQL (PostgreSQL and/or MySQL).
- At least 1 year of hands-on experience with Apache Airflow is highly preferred.
- Familiarity with Docker, Git, and CI/CD practices.
- Understanding of the Odoo ERP system is a strong plus.
- Strong problem-solving skills and attention to data accuracy.
- Availability to start as soon as possible is highly preferred.