Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 3+ years building analytics-focused backend services in Python, Java, or Go
- Expert SQL skills and experience with columnar/analytical databases (e.g., BigQuery, Snowflake, ClickHouse)
- Hands-on experience with ELT/ETL frameworks (e.g., Airflow, Prefect, dbt)
- Proficient in designing and consuming RESTful or gRPC APIs
- Familiarity with containerization (Docker) and orchestration platforms (Kubernetes)
- Strong understanding of data modeling for analytics (star schemas, slowly changing dimensions)
- Experience with CI/CD pipelines and Git-based workflows
- Excellent problem-solving skills and ability to collaborate with data scientists and analysts
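To make the data-modeling and SQL expectations above concrete, here is a minimal sketch of the kind of star-schema aggregation this role involves. All table and column names (`fact_sales`, `dim_date`, `amount`) are hypothetical; SQLite stands in for a columnar warehouse purely for illustration.

```python
import sqlite3

# Minimal star schema: one fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (date_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (20240101, '2024-01'), (20240201, '2024-02');
INSERT INTO fact_sales VALUES (20240101, 10.0), (20240101, 5.0), (20240201, 7.5);
""")

# Aggregate fact rows by a dimension attribute -- the shape of query
# an analytics API endpoint would typically expose.
rows = conn.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)
```

The same join-and-aggregate pattern scales up to BigQuery, Snowflake, or ClickHouse; only the dialect and the pruning mechanisms change.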
Key Responsibilities
- Build and own microservices and API endpoints that expose aggregated metrics and dashboard data
- Integrate comprehensive logging, metrics collection, and alerting for all analytics pipelines and services to proactively detect and resolve failures
- Partner with data analysts, data scientists, and product managers to gather requirements, review designs, and translate insights into reliable backend systems
- Maintain clear documentation, write unit and integration tests, and conduct peer code reviews
- Troubleshoot incidents, profile performance bottlenecks, and continuously refine pipelines and services for cost-efficiency, scalability, and maintainability across on-prem and cloud deployments
- Optimize database performance via partitioning, clustering, and query tuning
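As a rough illustration of the query-tuning workflow in the last bullet: inspect the plan, add a pruning structure, re-check. The `events` table and index names are hypothetical, and SQLite's index here stands in for warehouse partitioning or clustering.

```python
import sqlite3

# Hypothetical events table with a date column we filter on.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_date TEXT, payload TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(f"2024-01-{d:02d}", "x") for d in range(1, 31)])

def plan(sql):
    # Flatten the EXPLAIN QUERY PLAN detail column into one string.
    return " ".join(r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE event_date = '2024-01-15'"
p_before = plan(query)   # full table scan
conn.execute("CREATE INDEX idx_events_date ON events(event_date)")
p_after = plan(query)    # plan now references the index
print(p_before)
print(p_after)
```

In a columnar warehouse the equivalent check is whether the filter prunes partitions or clustered blocks rather than whether an index is used, but the inspect-tune-verify loop is the same.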