Description:
- Design and implement scalable, secure, and high-performance data architectures, pipelines, and integrations across diverse systems.
- Drive end-to-end execution of data engineering initiatives, from requirements gathering to deployment and monitoring.
- Improve data workflows for efficiency, scalability, and cost-effectiveness using modern tools and cloud technologies.
- Partner with data scientists, analysts, and business leaders to deliver solutions that enable analytics, reporting, and AI initiatives.
- Guide data engineers, provide technical leadership, review code, and foster best practices in data engineering.
- Research, evaluate, and introduce new data tools, frameworks, and methodologies to keep the organization's data stack modern and competitive.
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, Information Systems, or related fields.
- Expertise in designing scalable data platforms, data warehouses, and data lakes.
- Advanced proficiency with AWS (Redshift, Glue, EMR, S3), GCP (BigQuery, Dataflow, Pub/Sub), or Azure (Synapse, Data Factory, Databricks).
- Advanced proficiency in Python, Java/Scala, and SQL for building and optimizing data workflows.
- Ability to translate technical concepts into business language for stakeholders, ensuring alignment between data solutions and business needs.
- Experience mentoring junior engineers through code reviews, sharing best practices, and supporting skill development.
- Ability to work effectively with cross-functional teams (analysts, data scientists) and influence technical decisions.