Senior Data / Jakarta

Responsibilities

  • Manage installation, configuration, upgrades, and patching of Microsoft SQL Server.
  • Design and implement database structures, tables, views, indexes, and stored procedures.
  • Perform performance tuning for queries, indexing, and overall database configuration.
  • Manage backup and recovery strategies, and conduct regular disaster recovery tests (see the backup sketch after this list).
  • Ensure database security through roles, permissions, and data encryption.
  • Monitor database health and performance using SQL Server native tools (Profiler, Activity Monitor, DMVs) or third-party monitoring tools.
  • Handle incidents and troubleshoot database issues such as deadlocks, blocking, and slow query performance (see the DMV sketch after this list).
  • Support developers by providing query optimization and data modeling assistance.
  • Maintain high availability using technologies such as Always On, Database Mirroring, Failover Clustering, and Replication.
  • Create and maintain technical documentation related to configurations, procedures, and database standardization.
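
For the backup and recovery responsibility above, a minimal T-SQL sketch might look like the following; the database name and file path are illustrative placeholders rather than details from this posting.

    -- Full backup with checksum, then a verification-only restore of the same file.
    -- "SalesDb" and the backup path are hypothetical examples.
    BACKUP DATABASE SalesDb
        TO DISK = N'/var/opt/mssql/backup/SalesDb_full.bak'
        WITH CHECKSUM, COMPRESSION, INIT;

    RESTORE VERIFYONLY
        FROM DISK = N'/var/opt/mssql/backup/SalesDb_full.bak'
        WITH CHECKSUM;

RESTORE VERIFYONLY only confirms that the backup media is readable and complete; periodic full restores to a test server are still what make a disaster recovery test meaningful.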
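
For the monitoring and incident-handling items, a DMV query along these lines is a common starting point for diagnosing blocking; it is a sketch of one approach, not a prescribed procedure.

    -- List requests that are currently blocked, the session blocking them,
    -- and the statement text of each blocked request (standard DMVs).
    SELECT
        r.session_id,
        r.blocking_session_id,
        r.wait_type,
        r.wait_time AS wait_time_ms,
        t.text      AS blocked_sql
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.blocking_session_id <> 0;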

Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related field.
  • Minimum of 5 years of experience in data engineering or data platform development, preferably in the banking, fintech, or financial services sector.
  • Proficiency in SQL and one or more programming languages such as Python, Java, or Scala.
  • Strong experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica, or NiFi); a minimal ELT-style sketch follows this list.
  • Deep understanding of data warehouse and data lake architectures (e.g., Snowflake, BigQuery, Redshift, Hive, Delta Lake).
  • Experience with streaming and real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink).
  • Strong knowledge of database technologies (PostgreSQL, Oracle, MySQL, MongoDB) and data modeling techniques such as dimensional modeling and star schemas; see the star schema sketch after this list.
  • Experience with cloud-based data solutions (AWS, GCP, Azure), especially storage, compute, and orchestration services.
  • Familiarity with containerization and DevOps tools (Docker, Kubernetes, CI/CD pipelines).
  • Advanced certifications in Data Engineering, Cloud, or Big Data technologies are a plus (e.g., AWS Certified Data Engineer, GCP Professional Data Engineer).
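
As referenced in the ETL/ELT item above, one ELT-style pattern is to land raw data in a staging table and merge it into the warehouse with plain SQL; the schema, table, and column names below are invented for illustration.

    -- Upsert a staged extract into a target warehouse table.
    -- "staging.customers", "dw.customers", and all columns are hypothetical.
    MERGE INTO dw.customers AS tgt
    USING staging.customers AS src
        ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN
        UPDATE SET customer_name = src.customer_name,
                   updated_at    = src.extracted_at
    WHEN NOT MATCHED THEN
        INSERT (customer_id, customer_name, updated_at)
        VALUES (src.customer_id, src.customer_name, src.extracted_at);

In an orchestrated pipeline, a tool such as Apache Airflow would typically schedule the extract, load the staging table, and then run a statement like this as a downstream task.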
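
For the dimensional modeling point, a star schema in its smallest form is one fact table keyed to its dimension tables; the tables and columns below are an illustrative sketch, not a schema taken from the employer.

    -- Two dimension tables plus one fact table that references them (all names invented).
    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,   -- e.g. 20250131 in YYYYMMDD form
        calendar_date DATE NOT NULL
    );

    CREATE TABLE dim_customer (
        customer_key  INT PRIMARY KEY,
        customer_name VARCHAR(200) NOT NULL
    );

    CREATE TABLE fact_sales (
        date_key     INT NOT NULL REFERENCES dim_date (date_key),
        customer_key INT NOT NULL REFERENCES dim_customer (customer_key),
        quantity     INT NOT NULL,
        net_amount   DECIMAL(18, 2) NOT NULL
    );

Queries then aggregate the measures in fact_sales and slice them by attributes joined in from the dimensions, which is the access pattern the star layout optimizes for.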