
Data Engineer _ AWS + Python Job Opening In Chennai – Now Hiring Fractal


Job description

It's fun to work in a company where people truly BELIEVE in what they are doing!

Data Analytics Engineer- AWS at Fractal.ai

Fractal is one of the most prominent players in the Artificial Intelligence space.

Fractal's mission is to power every human decision in the enterprise by bringing AI, engineering, and design to the world's most admired Fortune 500® companies.

Fractal has more than 3,000 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia.

Fractal has consistently been rated as one of India's best companies to work for by The Great Place to Work® Institute, featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research, and recognized as an Honorable Vendor in the 2021 Magic Quadrant™ for data & analytics by Gartner.

Experience: 3-5 years

Location: Pan India

Responsibilities:

As a Data Engineer, you will be responsible for implementing complex data pipelines and analytics solutions to support key decision-making business processes in our client’s domain.

You will gain exposure to a project that leverages cutting-edge AWS technology, applying Big Data and Machine Learning to solve new and emerging problems for our clients.

You will also gain the added advantage of working very closely with AWS Professional Services teams, executing directly within AWS services and technologies to solve complex and challenging business problems for enterprises.

Key responsibilities include:

- Work closely with Product Owners and AWS Professional Service Architects to understand requirements, formulate solutions, and implement them.

- Implement scalable data transformation pipelines as per design (see the sketch after this list).

- Implement the data model and data architecture as per the laid-out design.

- Evaluate new capabilities of AWS analytics services, develop prototypes, assist in drafting POVs, and participate in design discussions.
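To give a concrete flavour of the pipeline work described above, here is a minimal PySpark sketch of a batch transformation from raw, semi-structured input to a curated, partitioned output. The bucket paths, column names, and aggregation are hypothetical illustrations only, not details of any actual client environment.

# Minimal sketch of a batch transformation pipeline in PySpark.
# All paths, table names, and columns are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Read raw, semi-structured input (e.g. JSON files landed in S3).
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Basic cleansing: de-duplicate, drop bad records, derive a partition date.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Simple aggregation into a curated, analytics-friendly table.
daily_totals = clean.groupBy("order_date", "region").agg(
    F.sum("amount").alias("total_amount"),
    F.countDistinct("order_id").alias("order_count"),
)

# Write columnar output, partitioned by date for downstream analytics.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_order_totals/"
)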

Requirements:

• Minimum 3 years’ experience implementing transformation and loading of data from a wide variety of traditional and non-traditional sources (structured, unstructured, and semi-structured) using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads

• At least 2 years implementing solutions using AWS services such as Lambda, Athena, Glue, S3, Redshift, Kinesis, and Apache Spark

• Experience working with data warehousing, data lake, or lakehouse concepts on AWS

• Experience implementing batch processing using AWS Glue/Lake Formation and AWS Data Pipeline

• Experience in EMR/MSK

• Experience with or exposure to Amazon DynamoDB will be a plus

• Ability to develop object-oriented code using Python, in addition to PySpark, SQL, and one other language (Java or Scala preferred)

• Experience with streaming technologies, both on-premises and in the cloud, such as consuming from and producing to Kafka and Kinesis

• Experience building pipelines and orchestrating workflows in an enterprise environment using Apache Airflow/Control-M (a minimal Airflow sketch follows this list)

• Experience implementing at least one of Redshift, Databricks, or Snowflake on AWS

• Good understanding of Dimensional Data Modelling will be a plus.

• Ability to multi-task and prioritize deadlines as needed to deliver results

• Ability to work independently or as part of a team

• Excellent verbal and written communication skills with great attention to detail and accuracy

• Experience working in an Agile/Scrum environment
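As a rough illustration of the workflow-orchestration experience called out above, the sketch below shows a minimal Apache Airflow DAG that starts a hypothetical AWS Glue transformation job and then runs a placeholder data-quality check. The DAG id, Glue job name, region, and schedule are assumptions for illustration only.

# Minimal sketch of an Airflow DAG orchestrating a daily batch pipeline.
# The DAG id, Glue job name, region, and schedule are hypothetical examples.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def start_glue_job(**_):
    """Start a (hypothetical) AWS Glue job and return its run id."""
    glue = boto3.client("glue", region_name="ap-south-1")
    run = glue.start_job_run(JobName="orders-daily-transform")
    return run["JobRunId"]


def check_row_counts(**_):
    """Placeholder data-quality step; a real check would query the curated data."""
    print("Validating curated table row counts...")


with DAG(
    dag_id="orders_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(task_id="run_glue_transform", python_callable=start_glue_job)
    validate = PythonOperator(task_id="validate_output", python_callable=check_row_counts)

    # The quality check runs after the Glue job has been started (start_job_run does not wait).
    transform >> validate

In practice the same pattern extends to chaining Glue crawlers, EMR steps, or Redshift loads behind a single scheduled DAG.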

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?

Let us know you're interested in a future opportunity by clicking in the top-right corner of the page, or create an account to set up email alerts as new job postings that match your interests become available!

Required Skill Profession

Computer Occupations



Related Jobs

Fractal hiring Data Engineer _ AWS + Python Job in Mumbai, Maharashtra, India
Fractal
Mumbai, Maharashtra, India
Fractal hiring Data Engineer _ AWS + Python Job in Pune, Maharashtra, India
Fractal
Pune, Maharashtra, India
Fractal hiring Data Engineer _ AWS + Python Job in Noida, Uttar Pradesh, India
Fractal
Noida, Uttar Pradesh, India
Fractal hiring Data Engineer _ AWS + Python Job in Bengaluru, Karnataka, India
Fractal
Bengaluru, Karnataka, India
Fractal hiring Data Engineer _ AWS + Python Job in Gurugram, Haryana, India
Fractal
Gurugram, Haryana, India
Confidential hiring Python+AWS Data Engineer Job in Pune, Maharashtra, India
Confidential
Pune, Maharashtra, India
MLOPS Solutions Private Limited hiring AWS Data Engineer - Python/PySpark Job in India
MLOPS Solutions Private Limited
India
Servhigh Global Services Private Limited hiring Senior Data Engineer - AWS/Python Job in Noida, Uttar Pradesh, India
Servhigh Global Services Private Limited
Noida, Uttar Pradesh, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Delhi Division, Delhi, India
Mobile Programming LLC
Delhi Division, Delhi, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Chandigarh, Chandigarh, India
Mobile Programming LLC
Chandigarh, Chandigarh, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Mumbai, Maharashtra, India
Mobile Programming LLC
Mumbai, Maharashtra, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Gurugram, Haryana, India
Mobile Programming LLC
Gurugram, Haryana, India
intineri infosol pvt Ltd hiring AWS Data Engineer - SQL/Python Job in India
intineri infosol pvt Ltd
India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Chennai, Tamil Nadu, India
Mobile Programming LLC
Chennai, Tamil Nadu, India
Rapsys Technologies hiring AWS Data Engineer - SQL/Python Job in Pune, Maharashtra, India
Rapsys Technologies
Pune, Maharashtra, India
Zorba Consulting India Pvt. Ltd. hiring AWS Data Engineer - Python/PySpark Job in Hyderabad, Telangana, India
Zorba Consulting India Pvt. Ltd.
Hyderabad, Telangana, India
Confidential hiring DATA ENGINEER - AWS, PYTHON, TERRAFORM Job in Gurugram, Haryana, India
Confidential
Gurugram, Haryana, India
The Nielsen Company hiring Data Engineer - Python/SQL/AWS Job in Bengaluru, Karnataka, India
The Nielsen Company
Bengaluru, Karnataka, India
The Nielsen Company hiring Data Engineer - Python/SQL/AWS Job in Gurgaon, Haryana, India
The Nielsen Company
Gurgaon, Haryana, India
The Nielsen Company hiring Data Engineer - Python/SQL/AWS Job in Mumbai, Maharashtra, India
The Nielsen Company
Mumbai, Maharashtra, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Bengaluru, Karnataka, India
Mobile Programming LLC
Bengaluru, Karnataka, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Hyderabad, Telangana, India
Mobile Programming LLC
Hyderabad, Telangana, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Pune, Maharashtra, India
Mobile Programming LLC
Pune, Maharashtra, India
Digihelic Solutions Private Limited hiring AWS Data Engineer - Python/SQL Job in India
Digihelic Solutions Private Limited
India
NonStop hiring Data Engineer - AWS/ETL/Python Job in Pune, Maharashtra, India
NonStop
Pune, Maharashtra, India
Mpowerplus hiring AWS Data Engineer - Python/Spark Job in Bengaluru, Karnataka, India
Mpowerplus
Bengaluru, Karnataka, India
Servhigh Global Services Private Limited hiring Senior Data Engineer - AWS/Python Job in Hyderabad, Telangana, India
Servhigh Global Services Private Limited
Hyderabad, Telangana, India
Accenture hiring - Data Engineer - Python/Aws Lambda Job in Bengaluru, Karnataka, India
Accenture
Bengaluru, Karnataka, India
Mobile Programming LLC hiring Senior Data Engineer - AWS/Python Job in Gurugram, Haryana, India
Mobile Programming LLC
Gurugram, Haryana, India
Mastech Digital hiring AWS Data Engineer - ETL/Python Job in Bengaluru, Karnataka, India
Mastech Digital
Bengaluru, Karnataka, India
Confidential hiring Senior Data Engineer(Python, AWS) Job in Gurugram, Haryana, India
Confidential
Gurugram, Haryana, India
Rapsys Technologies hiring Data Engineer - AWS/Python/Airflow Job in Bengaluru, Karnataka, India
Rapsys Technologies
Bengaluru, Karnataka, India

Unlock Your Data Engineer Potential: Insight & Career Growth Guide


Real-time Data Engineer Jobs Trends (Graphical Representation)

Explore profound insights with Expertini's real-time, in-depth analysis, showcased through the graph here. Uncover the dynamic job market trends for Data Engineer in Chennai, India, highlighting market share and opportunities for professionals in Data Engineer roles.

676,350 jobs in India · 13,096 jobs in Chennai

Download Data Engineer Jobs Trends in Chennai and India

Are You Looking for a Data Engineer _ AWS + Python Job?

Great news! Fractal is currently hiring and seeking a Data Engineer _ AWS + Python to join their team. Feel free to download the job details.

Wait no longer! Interested in exploring similar jobs as well? Search now.

The Work Culture

An organization's rules and standards set how people should be treated in the office and how different situations should be handled. The work culture at Fractal adheres to the cultural norms as outlined by Expertini.

The fundamental ethical values are:

1. Independence

2. Loyalty

3. Impartiality

4. Integrity

5. Accountability

6. Respect for human rights

7. Obeying Indian laws and regulations

What Is the Average Salary Range for Data Engineer _ AWS + Python Positions?

The average salary range for a Data Engineer _ AWS + Python position varies, but the pay scale is rated "Standard" in Chennai. Salary levels may vary depending on your industry, experience, and skills. It's essential to research and negotiate effectively. We advise reading the full job specification before proceeding with the application to understand the salary package.

What Are the Key Qualifications for Data Engineer _ AWS + Python?

Key qualifications for Data Engineer _ AWS + Python typically include Computer Occupations along with the qualifications and expertise mentioned in the job specification. Be sure to check the specific job listing for detailed requirements and qualifications.

How Can I Improve My Chances of Getting Hired for Data Engineer _ AWS + Python?

To improve your chances of getting hired for Data Engineer _ AWS + Python, consider enhancing your skills. Check your CV/Résumé score with our free tool: our built-in Resume Scoring tool gives you a matching score for each job once your CV/Résumé is uploaded. This can help you align your CV/Résumé with the job requirements and identify skills to strengthen.

Interview Tips for Data Engineer _ AWS + Python Job Success

Fractal interview tips for Data Engineer _ AWS + Python

Here are some tips to help you prepare for and ace your Data Engineer _ AWS + Python job interview:

Before the Interview:

Research: Learn about Fractal's mission, values, products, and the specific job requirements, and get further information about the company's other openings.

Practice: Prepare answers to common interview questions and rehearse using the STAR method (Situation, Task, Action, Result) to showcase your skills and experiences.

Dress Professionally: Choose attire appropriate for the company culture.

Prepare Questions: Show your interest by having thoughtful questions for the interviewer.

Plan Your Commute: Allow ample time to arrive on time and avoid feeling rushed.

During the Interview:

Be Punctual: Arrive on time to demonstrate professionalism and respect.

Make a Great First Impression: Greet the interviewer with a handshake, smile, and eye contact.

Confidence and Enthusiasm: Project a positive attitude and show your genuine interest in the opportunity.

Answer Thoughtfully: Listen carefully and take a moment to formulate clear, concise responses. Highlight relevant skills and experiences using the STAR method.

Ask Prepared Questions: Demonstrate curiosity and engagement with the role and company.

Follow Up: Send a thank-you email to the interviewer within 24 hours.

Additional Tips:

Be Yourself: Let your personality shine through while maintaining professionalism.

Be Honest: Don't exaggerate your skills or experience.

Be Positive: Focus on your strengths and accomplishments.

Body Language: Maintain good posture, avoid fidgeting, and make eye contact.

Turn Off Phone: Avoid distractions during the interview.

Final Thought:

To prepare for your Data Engineer _ AWS + Python interview at Fractal, research the company, understand the job requirements, and practice common interview questions.

Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your hands-on experience, including your approach to meeting targets as a team player. Additionally, review Fractal's products or services and be prepared to discuss how you can contribute to their success.

By following these tips, you can increase your chances of making a positive impression and landing the job!

How to Set Up Job Alerts for Data Engineer _ AWS + Python Positions

Setting up job alerts for Data Engineer _ AWS + Python is easy with India Jobs Expertini. Simply visit our job alerts page here, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!