Job description
Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics.
We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations and keep pace with the changing nature of customers and technology.
We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities:
· Design and implement ETL/ELT pipelines using AWS services such as Lambda, Glue, EMR, and Step Functions
· Develop scalable data processing workflows using Python and/or PySpark
· Work with AWS Data Lake and Data Warehouse services (e.g., S3, Redshift, Athena, Lake Formation) to ingest, transform, and store structured and semi-structured data
· Optimize data pipelines for performance, reliability, and cost-efficiency
· Collaborate with cross-functional teams in a Scrum/Agile setup to deliver sprint goals
· Ensure data quality, lineage, and governance through automated validation and monitoring
· Maintain and enhance CI/CD pipelines for data workflows
· Document technical designs, data flows, and operational procedures
· Develop and deliver large-scale data ingestion, data processing, and data transformation projects on the Azure cloud
· Mentor and share knowledge with the team through design reviews, discussions, and prototypes
· Work with customers to deploy, manage, and audit standard processes for cloud products
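As an illustrative sketch only of the kind of serverless transformation work described above (the record shape, field names, and helper function here are hypothetical, not taken from this posting), a minimal Lambda-style Python handler that validates and normalises a batch of semi-structured records might look like:

```python
import json


def transform_record(record: dict) -> dict:
    """Normalise one semi-structured record (hypothetical shape).

    Assumes records like {"id": ..., "amount": "12.50"}; coerces the
    amount to a float and rejects records that lack an "id" key.
    """
    if "id" not in record:
        raise ValueError("record missing 'id'")
    out = dict(record)
    out["amount"] = float(out.get("amount", 0))
    return out


def handler(event, context=None):
    """Minimal Lambda-style entry point: transform a batch of records.

    In a real pipeline the batch would typically arrive via an S3 or
    Kinesis event payload, and results would be written back to S3.
    """
    records = event.get("records", [])
    return [transform_record(r) for r in records]
```

For example, `handler({"records": [{"id": 1, "amount": "3.5"}]})` returns `[{"id": 1, "amount": 3.5}]`. In practice the same logic would usually be expressed as a PySpark or Glue job for large volumes, with Lambda reserved for lightweight, event-driven steps.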
Mandatory skill sets:
AWS Lambda, Glue, and S3; hands-on experience with PySpark/Python on AWS
Preferred skill sets:
· Other AWS services such as Redshift and Athena
· Strong understanding of data lake and data warehouse architectures on AWS
· Experience with Airflow, Terraform, or CloudFormation
· AWS certification
Years of experience required:
3 to 12 years
Education qualification:
BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Education
Degrees/Field of Study required: Bachelor of Technology, MBA (Master of Business Administration), Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications
Required Skills
Data Engineering
Optional Skills
Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Required Skill Profession
Computer Occupations