Job Description:
- Design, build, and maintain data pipelines from diverse sources (transactional databases, APIs, files, third-party providers).
- Orchestrate scheduled batch and near-real-time jobs, managing dependencies and retries.
- Design warehouse schemas and domain data marts (sales, marketing, ops).
- Partner with BI/Marketing/Finance to define KPIs, metric definitions, and prototype dashboards.
- Optimize dashboard performance (query tuning, extracts/aggregations, materialized views); see the sketch after this list.
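To illustrate the dashboard-optimization duty above, a minimal sketch assuming BigQuery (preferred per the requirements below) and a hypothetical `sales.orders` fact table: a daily aggregate is precomputed as a materialized view so dashboards read a small summary instead of scanning raw orders.

```sql
-- Minimal sketch (hypothetical table and columns): precompute daily revenue
-- per region so dashboard queries hit a small aggregate, not raw orders.
CREATE MATERIALIZED VIEW sales.daily_revenue_by_region AS
SELECT
  DATE(order_date) AS order_day,
  region,
  COUNT(*)         AS order_count,
  SUM(amount)      AS revenue
FROM sales.orders
GROUP BY order_day, region;
```

BigQuery can also rewrite eligible queries against the base table to read from such a view automatically, which ties this duty to the query-tuning and cost-control skills listed below.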
Requirements:
- Bachelor's degree in Informatics Engineering, Computer Science, or an equivalent field.
- Minimum of 2 years of work experience.
Skills and competencies:
- Strong SQL (CTEs, window functions, query optimization); see the example after this list.
- Experience with cloud data warehouses (BigQuery preferred; Redshift/Snowflake a plus).
- Hands-on experience with ETL/ELT and orchestration tooling (Airflow/Cloud Composer, dbt, or Airbyte/Fivetran).
- Understanding of data modeling, partitioning/clustering, and cloud DW cost control; a partitioning sketch follows this list.
- Familiarity with BI tools (Looker/Looker Studio/Tableau/Power BI) and dashboard best practices.
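As a concrete reference point for the "Strong SQL" item, a short example combining a CTE with a window function; the `sales.orders` table and its columns are hypothetical.

```sql
-- Hypothetical example: keep each customer's most recent order by ranking
-- orders per customer with ROW_NUMBER() inside a CTE.
WITH ranked_orders AS (
  SELECT
    customer_id,
    order_id,
    order_date,
    amount,
    ROW_NUMBER() OVER (
      PARTITION BY customer_id
      ORDER BY order_date DESC
    ) AS rn
  FROM sales.orders
)
SELECT customer_id, order_id, order_date, amount
FROM ranked_orders
WHERE rn = 1;
```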
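For the partitioning/clustering and cost-control item, a sketch of how such a fact table might be declared in BigQuery so that date-filtered dashboard queries prune partitions rather than scan the full table; all names are illustrative assumptions.

```sql
-- Hypothetical DDL: partition by order date and cluster by region/customer
-- so filtered queries scan only the partitions and blocks they need.
CREATE TABLE sales.orders (
  order_id    STRING,
  customer_id STRING,
  region      STRING,
  order_date  DATE,
  amount      NUMERIC
)
PARTITION BY order_date
CLUSTER BY region, customer_id;

-- A date-bounded dashboard query now scans only the matching partitions,
-- which is the main lever for keeping BigQuery bytes billed under control.
SELECT region, SUM(amount) AS revenue
FROM sales.orders
WHERE order_date BETWEEN DATE '2024-01-01' AND DATE '2024-01-31'
GROUP BY region;
```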