- Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Minimum of 5 years of experience in data warehousing, ETL development, or data engineering.
- Experience with DevOps tools (e.g., Jenkins, GitLab CI/CD, Docker, Kubernetes, Terraform).
- Strong knowledge of SQL and data modeling concepts (star/snowflake schema).
- Familiarity with cloud environments such as AWS, Azure, or GCP.
- Hands-on experience with ETL tools (e.g., Airflow, dbt, Talend, or Apache NiFi).
- Solid understanding of version control systems, automation, and monitoring tools.