Core Responsibilities
- End-to-End Development: Design, develop, and maintain the entire application stack, including the frontend UI, backend services, APIs, and database.
- AI Model Integration: Build and maintain scalable RESTful or gRPC APIs (e.g., using FastAPI, Flask, or Django) to serve predictions and outputs from our production AI/ML models (a minimal sketch follows this list).
- Backend Services: Develop robust backend logic to handle business processes, user authentication, data management, and asynchronous tasks.
- Asynchronous Processing: Implement and manage background job queues (e.g., Celery, RabbitMQ, Kafka) to handle long-running AI inference tasks without blocking the user interface (illustrated in a sketch after this list).
- Frontend Development: Translate UI/UX designs and wireframes into high-quality, responsive code using modern frameworks (e.g., React, Vue, or Angular).
- Data Visualization: Build interfaces that can clearly and intuitively visualize complex AI outputs, predictions, and confidence scores for the end-user.
- Database Management: Design and optimize database schemas (both SQL and NoSQL) to store user data, application state, and model-generated results.
- AI Feedback Loops: Engineer systems to capture user interactions and feedback (e.g., "thumbs up/down," corrections) to create data pipelines for model retraining and improvement (see the feedback-capture sketch after this list).
- DevOps & Deployment: Containerize the application (e.g., Docker) and assist in deploying and monitoring the full stack on cloud platforms (AWS, GCP, Azure) using CI/CD pipelines.
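To make the AI Model Integration responsibility concrete, here is a minimal sketch of a FastAPI endpoint serving a single prediction. The request/response schemas and the `run_model` stub are hypothetical placeholders for illustration, not a description of our actual service.

```python
# Illustrative only: a tiny FastAPI prediction endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PredictionRequest(BaseModel):
    text: str

class PredictionResponse(BaseModel):
    label: str
    confidence: float

def run_model(text: str) -> tuple[str, float]:
    # Stub standing in for real model inference (e.g., a call to
    # TorchServe, TensorFlow Serving, or an in-process model object).
    return ("positive", 0.87)

@app.post("/predict", response_model=PredictionResponse)
def predict(request: PredictionRequest) -> PredictionResponse:
    label, confidence = run_model(request.text)
    return PredictionResponse(label=label, confidence=confidence)
```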
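The Asynchronous Processing responsibility typically looks something like the following Celery sketch, where a long-running inference job is enqueued rather than executed inside the web request. The broker URL and task body are assumptions for illustration.

```python
# Illustrative only: offloading long-running inference to a Celery worker.
from celery import Celery

celery_app = Celery("tasks", broker="amqp://guest@localhost//", backend="rpc://")

@celery_app.task
def run_inference(document_id: str) -> dict:
    # The expensive model call would go here; the result is stored by the
    # result backend so the API layer can poll for completion.
    return {"document_id": document_id, "status": "done"}

# From an API handler, enqueue instead of blocking the request:
#   async_result = run_inference.delay("doc-123")
#   return {"task_id": async_result.id}  # client polls for status later
```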
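Finally, the AI Feedback Loops responsibility can start with something as simple as persisting user ratings alongside the prediction they refer to, so a retraining pipeline can consume them later. The table and column names below are illustrative assumptions.

```python
# Illustrative only: storing thumbs up/down feedback for later retraining.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class PredictionFeedback(Base):
    __tablename__ = "prediction_feedback"
    id = Column(Integer, primary_key=True)
    prediction_id = Column(String, nullable=False)  # which model output was rated
    rating = Column(Integer, nullable=False)        # +1 thumbs up, -1 thumbs down
    correction = Column(String, nullable=True)      # optional user-supplied fix

engine = create_engine("sqlite:///feedback.db")
Base.metadata.create_all(engine)

def record_feedback(prediction_id: str, rating: int, correction: str | None = None) -> None:
    # An ETL job can later export these rows as labeled training data.
    with Session(engine) as session:
        session.add(PredictionFeedback(prediction_id=prediction_id,
                                       rating=rating, correction=correction))
        session.commit()
```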
Required Qualifications & Experience:
- Experience: 3-5+ years of professional experience in full-stack development.
- Backend Proficiency: Strong proficiency in Python (with frameworks like FastAPI, Flask, or Django) or Node.js (Express). Python is strongly preferred due to its dominance in the AI ecosystem.
- Frontend Proficiency: Expertise in a modern JavaScript framework such as React, Vue, or Angular.
- Database Expertise: Proven experience with both relational (e.g., PostgreSQL, MySQL) and non-relational (e.g., MongoDB, Redis) databases.
- API Design: Deep experience in building, consuming, and documenting RESTful APIs.
- DevOps Fundamentals: Strong knowledge of Git, CI/CD pipelines, and containerization (Docker).
- Collaboration: Excellent communication skills and a proven ability to collaborate effectively with technical and non-technical stakeholders (especially data scientists).
- Problem-Solving: Strong analytical and problem-solving skills, with the ability to simplify complex concepts.
Preferred Qualifications (Nice to Have):
- AI/ML Exposure: Prior experience working on a team with data scientists or ML engineers.
- Async Tools: Hands-on experience with message brokers (RabbitMQ, Kafka) and task queues (Celery).
- Cloud Platforms: Experience with AWS, Google Cloud (GCP), or Azure, especially their AI/ML services (e.g., SageMaker, Vertex AI).
- Model Serving: Familiarity with model serving tools (e.g., TensorFlow Serving, TorchServe, KServe).
- Data Engineering: Basic understanding of data pipelines and ETL processes.