Experience: 5 to 8 years
  
Job description:   Python AWS Data Engineer  
Mandatory skills: Python, PySpark, strong hands-on coding ability, and experience with any major cloud platform (AWS/GCP/Azure).
  
  
- Python (core language skill): backend development, Pandas, PySpark (DataFrame API), and interacting with AWS services via boto3 (e.g., S3, Glue, Lambda); a minimal sketch follows this list
- Data Processing: Spark (PySpark), Glue, EMR
- AWS Core Services: S3, Glue, Athena, Lambda, Step Functions, EMR
- Containerization: Docker  
- Orchestration: Kubernetes (EKS)  
- Infrastructure as Code: Terraform, CloudFormation
- CI/CD Pipelines
- Data Formats & Processing: Parquet, JSON, CSV, SQL/PostgreSQL, Pandas
- Version Control: Git
- Cloud ML Services: SageMaker (basic/optional)
- Problem-solving skills and experience working in an Agile environment
- Exposure to AWS SageMaker endpoints (development, deployment, scaling, end-to-end ML workflows)
- Exposure to Large Language Models (LLMs)
- Java knowledge would be an added advantage
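For illustration only, the short Python sketch below shows the kind of day-to-day task implied by the stack above: listing objects in S3 with boto3 and loading Parquet data into a PySpark DataFrame. The bucket name, prefix, and app name are hypothetical placeholders, not part of this job description.

    # Minimal sketch, assuming a hypothetical S3 bucket and prefix.
    # Requires boto3 and PySpark (with hadoop-aws configured for s3a access).
    import boto3
    from pyspark.sql import SparkSession

    BUCKET = "example-data-lake"   # hypothetical bucket name
    PREFIX = "raw/events/"         # hypothetical prefix

    def list_parquet_keys(bucket: str, prefix: str) -> list[str]:
        """Return the keys of Parquet objects under the given S3 prefix."""
        s3 = boto3.client("s3")
        paginator = s3.get_paginator("list_objects_v2")
        keys = []
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            for obj in page.get("Contents", []):
                if obj["Key"].endswith(".parquet"):
                    keys.append(obj["Key"])
        return keys

    def load_events(spark: SparkSession, bucket: str, prefix: str):
        """Read all Parquet files under the prefix into one DataFrame."""
        return spark.read.parquet(f"s3a://{bucket}/{prefix}")

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("events-etl").getOrCreate()
        print(list_parquet_keys(BUCKET, PREFIX))
        df = load_events(spark, BUCKET, PREFIX)
        df.printSchema()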
  
Timing: 2 PM to 11 PM IST (4 hours overlap with EST)