Position: Data Engineer – Snowflake & Cloud
Location: Jakarta Selatan, Indonesia (Equity Tower, SCBD)
Contract: 8–12 Months
Notice Period: Immediate / 2 weeks
About Our Client:
Our client is a fast-growing, innovative company in the data and analytics space, focused on building scalable cloud-based data platforms to empower business and analytics teams. They leverage modern Snowflake architecture and cloud infrastructure to drive data-driven decision-making.
Role Overview:
We are looking for an experienced Data Engineer with strong hands-on expertise in Snowflake and cloud platforms (AWS, Azure, or GCP). You will design, implement, and maintain scalable data pipelines, optimize Snowflake performance, and integrate cloud-based datasets to support analytics and business teams.
Key Responsibilities:
- Design, build, and maintain ETL/ELT pipelines into Snowflake from multiple data sources.
- Develop and manage Snowflake warehouses, schemas, roles, tasks, Snowpipe, stages, and data marts (see the illustrative sketch after this list).
- Integrate Snowflake with cloud data storage (AWS S3, Azure Blob Storage, or GCP Cloud Storage).
- Use cloud services (AWS Glue, Azure Data Factory, GCP Cloud Composer, Lambda/Functions) to automate and orchestrate data loading.
- Optimize query performance, warehouse sizing, credit usage, clustering, and partitioning.
- Design and maintain data models (star and snowflake schemas) and data marts.
- Implement monitoring, logging, data validation, and error-handling across pipelines.
- Ensure data governance, security (RBAC/IAM), and documentation best practices.
- Collaborate with analysts, data scientists, and business teams to deliver clean, reliable datasets.
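To give a flavour of the day-to-day work these responsibilities imply, below is a minimal sketch of setting up a stage and Snowpipe ingestion path with the snowflake-connector-python package. The account, user, bucket, stage, table, and pipe names are hypothetical placeholders, not the client's actual environment; a real deployment would use a storage integration and a secrets manager rather than inline credentials.

```python
# Illustrative sketch only: account, user, bucket, stage, table, and pipe
# names are hypothetical placeholders, not the client's actual environment.
import snowflake.connector

# Connect to Snowflake; in practice credentials come from a secrets manager,
# never from hard-coded values.
conn = snowflake.connector.connect(
    account="xy12345",            # hypothetical account identifier
    user="ETL_SERVICE_USER",      # hypothetical service user
    password="***",               # placeholder only
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

ddl_statements = [
    # External stage over a cloud bucket (S3 shown; a storage integration
    # or credentials would be required in a real deployment).
    """
    CREATE STAGE IF NOT EXISTS raw_events_stage
      URL = 's3://example-bucket/events/'
      FILE_FORMAT = (TYPE = JSON)
    """,
    # Landing table that holds the raw semi-structured payloads.
    "CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)",
    # Snowpipe that continuously copies new files from the stage.
    """
    CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events FROM @raw_events_stage
    """,
]

try:
    cur = conn.cursor()
    for stmt in ddl_statements:
        cur.execute(stmt)
finally:
    conn.close()
```

Downstream of such a raw layer, tasks, streams, or a transformation tool would typically build the star-schema data marts mentioned above.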
Must-Have Requirements:
- 3–5 years of experience as a Data Engineer or Data Warehouse Engineer.
- Strong hands-on experience with Snowflake (Snowpipe, virtual warehouses, stages, roles, performance tuning).
- Experience with at least one cloud platform — AWS, Azure, or GCP.
- Proficient in SQL and one scripting language (Python preferred).
- Strong understanding of data warehousing, ETL/ELT processes, and data modeling.
Good to Have:
- Experience with MSSQL, PostgreSQL, or other RDBMS.
- Familiarity with Airflow, dbt, SSIS, Azure Data Factory, AWS Glue, or similar orchestration and ETL tools (see the orchestration sketch after this list).
- Knowledge of Terraform / CloudFormation for automated deployment.
- Experience with BI tools such as Power BI, Tableau, or Looker.
- Basic understanding of data governance, masking, encryption, or GDPR compliance.
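As a hedged illustration of the orchestration experience listed above, here is a minimal Airflow DAG that schedules a Snowflake load. The DAG id, connection id, and COPY statement are hypothetical, and the sketch assumes the apache-airflow-providers-common-sql package is installed and a Snowflake connection is configured in Airflow.

```python
# Illustrative orchestration sketch only: the DAG id, connection id, and SQL
# are hypothetical examples, not the client's actual pipelines.
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger an incremental load into the raw layer once per day.
    load_raw = SQLExecuteQueryOperator(
        task_id="load_raw_events",
        conn_id="snowflake_default",  # placeholder Airflow connection id
        sql="COPY INTO raw_events FROM @raw_events_stage",
    )
```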
Soft Skills:
- Strong analytical and problem-solving skills.
- Detail-oriented and documentation-focused.
- Able to communicate clearly with both technical and non-technical teams.
- Independent, proactive, and collaborative.
Why Join:
- Work on modern Snowflake + cloud data infrastructure.
- Opportunity to lead data platform improvements and migrations.
- Collaborative culture with high ownership and room to innovate.
Must-Have Skills:
Snowflake (Snowpipe, virtual warehouses, stages, roles, performance tuning), AWS/Azure/GCP, SQL, Python, ETL/ELT, Data Modeling, Data Warehousing