Mandatory:
- Minimum of 4 to 6 years of relevant work experience as an AWS Developer
- Good knowledge of Hadoop architecture and its ecosystem, with experience in data storage (HDFS), query writing (HQL or Spark SQL), and data processing and analysis using PySpark
- Strong hands-on experience with AWS big data tools: EMR, Glue, Athena, MSK/Kinesis, IAM, EC2, S3
- Strong hands-on experience with AWS databases: RDS, DynamoDB, Redshift/Spectrum
- Strong hands-on Python programming using Jupyter notebooks
- Experience with version control tools such as Git and TFS/Bitbucket
- Knowledge of AWS Aurora, Neptune, SNS, SQS, Redis, CloudFormation, Lambda, VPC, Glacier, EBS, EFS, CloudWatch
- Knowledge of CI/CD, Docker, Terraform, RabbitMQ/Apache Kafka
- Working knowledge of PrestoDB, Apache Spark, Apache Hive, Apache Hudi, and Delta tables
- AWS Certified Solutions Architect
Good to Have:
- Knowledge of AWS Lake Formation
- Hive, Presto, and Flink connectors
- Knowledge of the medical domain (DICOM, HL7, FHIR)
Skills Required
AWS, Python, Git, SQS, Redis, CloudFormation, Lambda, VPC, Glacier, EBS