Backend Engineer - AI  
Experience: 3-5 years
Location: Bangalore / Gurugram
Availability: Urgent requirement; immediate joiners preferred
Overview  
Design and ship Python-based backends that power AI-driven applications.
You'll work closely with AI engineers (LLM, RAG, and agent teams) to turn prototypes into secure, scalable, and observable production systems.
The role involves building APIs, workers, and event pipelines that integrate seamlessly with enterprise environments.
Key Responsibilities  
- API Development: Build and maintain REST/gRPC APIs using FastAPI (preferred) or Flask + Pydantic, leveraging asyncio for high-performance I/O (see the first sketch after this list).
- Background Jobs & Eventing: Implement schedulers and workers (Celery/RQ/Arq) and event pipelines (Kafka/RabbitMQ/Azure Service Bus); a worker sketch follows this list.
- Data Modeling: Design and tune schemas with SQLAlchemy 2.x and Alembic, optimize PostgreSQL performance, and implement caching with Redis (see the cache-aside sketch after this list).
- AI Integration: Wrap AI components (LLM endpoints, RAG services, tool/function calling) behind stable, observable interfaces with streaming and timeout management.
- Enterprise Integration: Connect to enterprise systems (SAP, Salesforce, ServiceNow, Workday) with OAuth2/OIDC, error handling, and idempotency safeguards.
- Security & Compliance: Implement identity and access management, secrets handling via Key Vault/Secrets Manager, and input validation (JSON Schema); maintain audit logs.
- Operational Excellence: Own deployment, monitoring, feature flags, blue-green/canary releases, incident response, and post-mortems with clear documentation.
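To give candidates a feel for the API work, here is a minimal sketch of the kind of async FastAPI + Pydantic endpoint the API Development bullet describes; the route, model names, and logic are illustrative, not an existing codebase.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="ai-backend")


class SummarizeRequest(BaseModel):
    text: str
    max_tokens: int = 256


class SummarizeResponse(BaseModel):
    summary: str


@app.post("/v1/summarize", response_model=SummarizeResponse)
async def summarize(req: SummarizeRequest) -> SummarizeResponse:
    # An async handler keeps the event loop free while downstream I/O
    # (LLM calls, database queries) is awaited.
    summary = req.text[: req.max_tokens]  # placeholder for a real model call
    return SummarizeResponse(summary=summary)
```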
 
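A minimal worker sketch for the Background Jobs & Eventing bullet, assuming a RabbitMQ broker; the task name, broker URL, and downstream call are placeholders.

```python
from celery import Celery

app = Celery("workers", broker="amqp://guest@localhost//")  # illustrative RabbitMQ broker


def push_to_downstream(record_id: str) -> None:
    """Placeholder for a real enterprise API call (SAP, Salesforce, etc.)."""


@app.task(bind=True, max_retries=3, acks_late=True)
def sync_record(self, record_id: str) -> None:
    """Push one record downstream; retry with exponential backoff on failure."""
    try:
        push_to_downstream(record_id)
    except Exception as exc:
        # Backoff of 10s, 20s, 40s before the task is marked failed.
        raise self.retry(exc=exc, countdown=10 * 2 ** self.request.retries)
```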
 
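A cache-aside sketch for the Data Modeling bullet, assuming PostgreSQL accessed via SQLAlchemy 2.x and a local Redis instance; the table, key format, and DSN are illustrative.

```python
import json

import redis
from sqlalchemy import String, create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Document(Base):
    __tablename__ = "documents"

    id: Mapped[int] = mapped_column(primary_key=True)
    title: Mapped[str] = mapped_column(String(200), index=True)


engine = create_engine("postgresql+psycopg://app:app@localhost/app")  # illustrative DSN
cache = redis.Redis()  # assumes a local Redis instance


def get_document(doc_id: int) -> dict | None:
    """Cache-aside read: try Redis, fall back to PostgreSQL, then populate the cache."""
    key = f"doc:{doc_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    with Session(engine) as session:
        doc = session.get(Document, doc_id)
        if doc is None:
            return None
        payload = {"id": doc.id, "title": doc.title}
        cache.set(key, json.dumps(payload), ex=300)  # 5-minute TTL
        return payload
```
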
Must-Have Qualifications  
- 3-5 years of backend development experience in Python.
- Expertise in FastAPI (preferred) or Flask + Pydantic, with a strong understanding of OpenAPI/Swagger design.
- Solid knowledge of async programming, concurrency control, and connection management.
- Hands-on experience with PostgreSQL (schema design, query optimization) and Redis caching.
- Experience with Kafka, RabbitMQ, or Azure Service Bus for messaging/eventing.
- Proficiency in Docker and at least one cloud provider (Azure preferred; AWS/GCP acceptable).
- Familiarity with CI/CD (GitHub Actions or Azure DevOps) and IaC tools (Terraform/Bicep).
- Strong testing practice: pytest, fixtures, contract/load testing, mocks/stubs (see the sketch after this list).
- Experience with observability tools (OpenTelemetry, App Insights, Prometheus, Grafana).
- Good understanding of security principles (OAuth2/OIDC, JWT, mTLS, rate limiting, and input/output validation).
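A small example of the testing practice asked for above, assuming a FastAPI app importable from app.main (a hypothetical module path); it uses a pytest fixture with FastAPI's TestClient.

```python
import pytest
from fastapi.testclient import TestClient

from app.main import app  # hypothetical module path for the service under test


@pytest.fixture()
def client() -> TestClient:
    return TestClient(app)


def test_summarize_returns_200(client: TestClient) -> None:
    resp = client.post("/v1/summarize", json={"text": "hello world", "max_tokens": 5})
    assert resp.status_code == 200
    assert "summary" in resp.json()
```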
 
 
Good-to-Have Skills  
- Experience building streaming chat endpoints (SSE/WebSockets) and function/tool-calling adapters for AI services (see the sketch after this list).
- Exposure to vector databases (Azure AI Search vector search, Pinecone, Weaviate, Qdrant) and content-safety integrations.
- Knowledge of multi-tenant controls, policy-as-code (OPA), and usage metering.
- Familiarity with gRPC, Dapr, and feature flag systems (LaunchDarkly, Flipt).
- Enterprise system integration experience (SAP OData/BAPI, Salesforce REST/Graph, ServiceNow).
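A sketch of the streaming chat endpoint mentioned above, assuming an upstream LLM gateway reachable over HTTP; the URL and event framing are illustrative, and the explicit timeout reflects the timeout management called out under AI Integration.

```python
import httpx
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

LLM_URL = "http://llm-gateway.internal/v1/stream"  # illustrative upstream service


@app.get("/v1/chat/stream")
async def chat_stream(prompt: str) -> StreamingResponse:
    async def event_source():
        # Fail fast on connect, but allow long reads while tokens stream back.
        timeout = httpx.Timeout(5.0, read=60.0)
        async with httpx.AsyncClient(timeout=timeout) as client:
            async with client.stream("POST", LLM_URL, json={"prompt": prompt}) as resp:
                resp.raise_for_status()
                async for chunk in resp.aiter_text():
                    # Server-Sent Events framing: "data: <payload>\n\n"
                    yield f"data: {chunk}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_source(), media_type="text/event-stream")
```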
 
 
Core Tech Stack  
- Python 3.11+, FastAPI, Pydantic v2, SQLAlchemy 2.x, Alembic, pytest 
- PostgreSQL, Redis, Kafka/RabbitMQ/Azure Service Bus 
- Docker, GitHub Actions/Azure DevOps, Terraform/Bicep 
- OpenTelemetry, App Insights/Prometheus/Grafana 
- Auth & Security: OAuth2/OIDC (Entra ID/other), Key Vault/Secrets Manager, API Gateway/WAF