Company Description
Ajari Technologies is a forward-thinking technology company dedicated to building impactful and innovative digital solutions. We combine cutting-edge technology with a human-centered approach to create relevant, efficient, and sustainable products and services.
Founded with a vision to drive digital transformation across industries, Ajari Technologies has worked with businesses, educational institutions, and social organizations to develop customized digital platforms, tech-enabled learning systems, and smart applications tailored to their needs.
We believe that technology is not just about sophistication, but about creating real value and solving modern-day challenges. With a multidisciplinary team of developers, designers, and strategic thinkers, Ajari Technologies continues to innovate for a better, smarter future.
Role Description
This is a full-time, on-site role for a Data Engineer located in the Jakarta Metropolitan Area. The Data Engineer will be responsible for designing, implementing, and maintaining scalable data pipelines and ETL processes. Day-to-day tasks include data modeling, building data warehousing solutions, and conducting data analytics to support various business needs.
Responsibilities
- Design, build, and maintain scalable and efficient ETL/ELT pipelines across various data sources.
- Develop and manage data architectures, including data lakes and data warehouses.
- Collaborate with product, analytics, and engineering teams to gather requirements and deliver data solutions.
- Ensure data quality, security, and compliance with internal and external policies.
- Support the integration of third-party data platforms and APIs.
- Continuously improve data workflows and infrastructure for scalability and performance.
- Monitor data systems and promptly resolve data-related issues.
Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 2+ years of experience as a Data Engineer or in a similar role.
- Proficiency in SQL and at least one programming language (preferably Python).
- Hands-on experience with cloud platforms (AWS, GCP, or Azure).
- Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, BigQuery, or Redshift).
- Knowledge of data modeling, ETL design patterns, and pipeline orchestration.
- Strong problem-solving skills, attention to detail, and a proactive mindset.
- Fluency in English is preferred.