We are seeking a skilled Data Engineer to design, build, and maintain reliable data pipelines and infrastructure that power our analytics and business intelligence systems. You will collaborate closely with data analysts, backend engineers, and product teams to ensure data accuracy, accessibility, and performance across the organization.
Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines for ingesting and transforming structured and unstructured data.
- Build and optimize data models and data warehouses to support analytics and reporting needs.
- Integrate data from multiple sources (databases, APIs, third-party platforms) into centralized storage systems.
- Ensure data quality, reliability, and scalability through validation and monitoring.
- Collaborate with stakeholders to define data requirements and deliver meaningful datasets.
- Implement data governance, documentation, and version control best practices.
- Work closely with DevOps to deploy data solutions in cloud environments (AWS, GCP, Azure).
- Troubleshoot and optimize data pipelines for performance and cost efficiency.
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 3 years of experience in data engineering or backend data-related roles.
- Strong proficiency in SQL and one or more programming languages such as Python or Go.
- Hands-on experience with data pipeline tools (Airflow, Prefect, Luigi, or similar).
- Experience with cloud data warehouses (BigQuery, Redshift, Snowflake, or Azure Synapse).
- Solid understanding of data modeling, ETL design, and data architecture principles.
- Familiarity with Docker, Kubernetes, and CI/CD workflows.
- Hands-on experience with Tableau.
- Experience with streaming technologies (Kafka, Pub/Sub, Kinesis) is a plus.
- Knowledge of data security, access control, and governance standards.
- Experience working with Golang for data processing or backend systems (nice to have).
- Familiarity with dbt (data build tool) or Terraform (nice to have).
- Exposure to machine learning pipelines or collaboration with Data Scientists (nice to have).
- Experience in Agile/Scrum environments.
- Analytical and detail-oriented, with strong problem-solving skills.
- Ability to translate business requirements into technical data solutions.
- Excellent communication and collaboration skills.
- Continuous learner with a growth mindset and proactive attitude.