Responsibilities
- Build and maintain reliable ETL pipelines to support data-driven operations and projects.
- Manage and optimize data flow across multiple platforms (PostgreSQL, Oracle, MongoDB, etc.).
- Develop scalable solutions for real-time and batch data processing using tools like Kafka, Flink, or Spark.
- Collaborate closely with data analysts, scientists, and business teams to deliver accessible, high-quality data.
- Design and maintain data models, schemas, and documentation for analytics and reporting.
- Integrate new data sources and APIs (REST-based) to enhance system capabilities.
- Implement data quality checks, monitor pipeline performance, and troubleshoot issues proactively.
- Support event-based streaming and CDC processes using tools such as Debezium or equivalent.
- Explore and adopt new technologies to improve data workflows and performance.
Required Skills and Experience
- Minimum education is a bachelor's degree (S1)
- 3+ years of work experience (a data specialization is preferred)
- At least 3 years of hands-on experience with SQL
- Proven skills with ETL tools (SSIS, Talend, Kettle/Pentaho), Debezium, Kafka, Flink, Spark, PostgreSQL, Oracle Database, SQL, NoSQL, GSQL, TG, Python, and MongoDB
- Knowledge of ETL, CDC, event-based streaming, graph databases, REST APIs, Python, and basic programming
- Eager and willing to learn new technologies
Preferred Qualifications
- Experience in the insurance industry is a plus
Work Arrangement
- Placement: South Jakarta.
- Work setup: Work from Office (WFO).
- Contract duration: 6 months with possibility of extension.
Notes
- Start date: available ASAP (a 1-month notice period is acceptable)
- Join date preference: the sooner, the better