
DATA ENGINEER (Hybrid, Jakarta)

Office: Equity Tower, Jendral Sudirman Kav SCBD, Kebayoran Baru, Jakarta Selatan 12190

About the Role

Our client, a leading regional cloud and AI transformation partner, helps enterprises modernize their data infrastructure and implement advanced analytics and AI solutions across Asia. With a strong presence in Southeast Asia and over 1,200 enterprise customers, our client specializes in data modernization, cloud migration, and AI-driven business intelligence.

We are seeking a Data Engineer experienced in designing and implementing end-to-end data pipelines on cloud platforms, particularly GCP or Azure. This role will work on a large-scale retail data modernization project involving data integration, transformation, and optimization across multiple sources and environments.

Location: Jakarta (hybrid working arrangement)

Key Responsibilities

  • Design, build, and maintain batch and streaming data pipelines using GCP (BigQuery, Dataflow, Dataproc, Composer, Dataform, Cloud Functions).
  • Perform ETL/ELT operations to load and optimize data in BigQuery for analytics and reporting.
  • Integrate data from APIs, databases, and file-based systems across multiple environments.
  • Support data migration from legacy systems (Oracle, MicroStrategy, etc.).
  • Ensure data governance, data quality, and compliance with organizational standards.
  • Collaborate with business intelligence and analytics teams to support reporting and dashboard needs.
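To illustrate the kind of ETL transformation work this role involves, here is a minimal, hypothetical sketch in Python: a transform step that cleans raw retail order records before loading them into a warehouse table such as BigQuery. All field and function names are invented for illustration; a real pipeline would run inside Dataflow or a Composer-managed DAG.

```python
from datetime import datetime, timezone

def transform_orders(raw_rows):
    """Clean raw retail order records for warehouse loading.

    Drops rows missing an order id, normalizes SKU casing,
    converts ISO-8601 timestamps to UTC, and computes line totals.
    Field names (order_id, sku, ordered_at, qty, unit_price)
    are hypothetical.
    """
    cleaned = []
    for row in raw_rows:
        if not row.get("order_id"):
            continue  # skip unidentifiable records rather than loading bad data
        ordered_at = datetime.fromisoformat(row["ordered_at"])
        cleaned.append({
            "order_id": row["order_id"],
            "sku": row["sku"].strip().upper(),
            "ordered_at": ordered_at.astimezone(timezone.utc).isoformat(),
            "line_total": round(row["qty"] * row["unit_price"], 2),
        })
    return cleaned
```

In practice the same cleaning logic would be expressed as a Dataflow `ParDo` or a SQL transformation in BigQuery/dbt; the sketch only shows the shape of the transform step.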

Requirements

  • 3–5 years of experience in data engineering or ETL development.
  • Proven experience with the GCP data stack (BigQuery, Dataflow, Dataproc, Composer).
  • Strong SQL and Python skills for data transformation and automation.
  • Familiarity with the Azure data stack (Data Factory, Databricks, Synapse Analytics, Data Lake) is a plus.
  • Understanding of data modeling, performance tuning, and orchestration tools (Airflow or dbt).
  • Exposure to data migration or modernization projects preferred.
  • Strong problem-solving mindset, collaborative, and a proactive learner.

Nice to Have

  • GCP Professional Data Engineer or Cloud Architect certification.
  • Experience with CI/CD pipelines, CDC (Change Data Capture), or data governance frameworks.

Must-Have Skills

GCP BigQuery, Dataflow, Dataproc, Composer, Dataform, Python, SQL, ETL/ELT, Airflow/dbt, Data Modeling, Data Migration