India Jobs Expertini

Mid Data Engineer with Python & Snowflake - Jefferies Job Opening in India – Now Hiring: Capco


Job description

This job is with Capco, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community.

Please do not contact the recruiter directly.

Job Title:

Data Engineer with Python & Snowflake - Pune

About Us
Capco, a Wipro company, is a global technology and management consulting firm.

Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount.

With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors.

We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry: projects that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business.

Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Role Description:
Key Skills: Data Engineering, Python, Snowflake, AWS, Git/Bitbucket
Experience: 9+ years
Location: Hinjewadi, Pune
Shift timings: 12:30 PM - 9:30 PM
3 days work from office (Tue, Wed, Thu)

Technical Requirements

Job Summary: Python & Snowflake Engineer with AI/Cortex Development
4+ years of experience developing data engineering and data science projects using the Snowflake AI Data Cloud platform on AWS.

Snowpark experience preferred.

Experience with different data modeling techniques is required.
4+ years of experience in Python development, using tools such as VS Code or Anaconda, version control with Git or Bitbucket, and Python unit testing frameworks.
Experience building Snowflake applications on the Snowflake Cortex AI platform (specifically Cortex Agents, Cortex Search and Cortex LLM functions, with an understanding of context enrichment using prompts or Retrieval-Augmented Generation methods).
Deep understanding of object-oriented programming in Python, data structures such as pandas DataFrames, and writing clean, maintainable engineering code.
Understanding of multi-threading concepts and concurrency implementation in server-side Python custom modules.
Experience implementing object-relational mapping (ORM) in Python using frameworks such as SQLAlchemy or equivalent.
Proficient in developing and deploying Python applications, such as AWS Lambda functions, on the AWS Cloud platform.
Proficient in deploying web applications on AWS Cloud using Docker containers or Kubernetes, with experience using CI/CD pipelines.
Proficient in developing applications with Snowpipe and Snowpark, moving data from cloud sources such as AWS S3, and handling unstructured data from data lakes.
Good understanding of Snowflake account hierarchy models and account-role-permission strategy.
Good at data sharing, preferably using internal Data Marketplace and Data Exchange listings.
Good grasp of data governance and security concepts within Snowflake, including row- and column-level dynamic data masking using Snowflake tags.
Good understanding of input query enrichment using Snowflake YAML definitions and integration with LLMs within Snowflake.
Good understanding of relevance search and building custom interactive applications with LLMs.
Nice to have: experience building Snowflake native applications using Streamlit and deploying them onto AWS Cloud instances (EC2 or Docker containers).
Continuously improves functionality through experimentation, performance tuning and customer feedback.
Nice to have: application cache implementation experience within Python web applications.

Nice to have: DuckDB with Apache Arrow experience.
Nice to have: experience implementing CI/CD pipelines for Snowflake applications.
Strong analytical and problem-solving skills, and the ability to communicate technical concepts clearly.
Experience with Agile and Scrum methodologies, preferably with JIRA.
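The multi-threading and concurrency requirement above can be sketched briefly. In the snippet below, the worker function, table names and thread count are hypothetical, chosen only to illustrate the standard-library pattern, not any Capco or Jefferies codebase:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_row_count(table_name: str) -> tuple[str, int]:
    # Hypothetical I/O-bound worker; a real module would query Snowflake here.
    return table_name, len(table_name) * 100

tables = ["orders", "customers", "payments"]

# ThreadPoolExecutor runs I/O-bound tasks concurrently; map preserves input order.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(fetch_row_count, tables))

print(results)  # {'orders': 600, 'customers': 900, 'payments': 800}
```

For I/O-bound work such as database calls, a thread pool like this is usually the idiomatic starting point; CPU-bound work would instead favour processes.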
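The application-cache point can likewise be illustrated with the standard library's functools.lru_cache; expensive_lookup is a hypothetical stand-in for a slow call such as a database query, not any specific API from this role:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def expensive_lookup(key: str) -> str:
    # Stand-in for a slow call (database query, HTTP request, etc.).
    return key.upper()

expensive_lookup("snowflake")          # computed on the first call (a miss)
expensive_lookup("snowflake")          # served from the in-process cache (a hit)
info = expensive_lookup.cache_info()   # hits=1, misses=1
```

An in-process cache like this suits single-worker deployments; web applications with multiple workers typically reach for an external cache instead.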

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth.

For more information, visit www.capco.com.

Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Required Skill Profession

Computer Occupations



Related Jobs

Movate hiring Snowflake Data Engineer with Python AND Tableau Job in Bengaluru, Karnataka, India
Movate
Bengaluru, Karnataka, India
GKMIT Pvt. Ltd. hiring Python Engineer - Mid Job in Multiple, RJ, India
GKMIT Pvt. Ltd.
Multiple, RJ, India
VDart Software Services Pvt. Ltd. hiring Data Warehouse Expert with Snowflake, Python & SQL Job in India
VDart Software Services Pvt. Ltd.
India
Strategic HR Solutions hiring Snowflake Data Engineer - Python/PySpark Job in Bengaluru, Karnataka, India
Strategic HR Solutions
Bengaluru, Karnataka, India
Confidential hiring Data Engineer: SQL, Snowflake, Python Job in Bengaluru, Karnataka, India
Confidential
Bengaluru, Karnataka, India
Confidential hiring Data Engineer: SQL, Snowflake, Python Job in Mumbai, Maharashtra, India
Confidential
Mumbai, Maharashtra, India
PRUDENT GLOBALTECH SOLUTIONS PRIVATE LIMITED hiring Data Engineer - Python/Snowflake DB Job in Hyderabad, Telangana, India
PRUDENT GLOBALTECH SOLUTIONS PRIVATE LIMITED
Hyderabad, Telangana, India
Movate hiring Data Engineer (Snowflake, Python, Tableau) Job in Bengaluru, Karnataka, India
Movate
Bengaluru, Karnataka, India
Strategic HR Solutions hiring Snowflake Data Engineer - Python/PySpark Job in Gurugram, Haryana, India
Strategic HR Solutions
Gurugram, Haryana, India
Rapsys Technologies hiring Data Engineer - Python/Snowflake DB Job in Pune, Maharashtra, India
Rapsys Technologies
Pune, Maharashtra, India
NTT DATA hiring ETL / Data Engineer with Snowflake Experience Job in Bengaluru, Karnataka, India
NTT DATA
Bengaluru, Karnataka, India
Gravity Infosolutions, Inc. hiring Data Engineer with Snowflake Experience - Contract Job in Delhi, Delhi, India
Gravity Infosolutions, Inc.
Delhi, Delhi, India
Gravity Infosolutions, Inc. hiring Data Engineer with Snowflake Experience - Contract Job in India
Gravity Infosolutions, Inc.
India
Gravity Infosolutions, Inc. hiring Data Engineer with Snowflake Experience - Contract Job in Bengaluru, Karnataka, India
Gravity Infosolutions, Inc.
Bengaluru, Karnataka, India
Gravity Infosolutions, Inc. hiring Data Engineer with Snowflake Experience - Contract Job in Hyderabad, Telangana, India
Gravity Infosolutions, Inc.
Hyderabad, Telangana, India

Unlock Your Mid Data Potential: Insight & Career Growth Guide


Real-time Mid Data Jobs Trends (Graphical Representation)

Explore insights with Expertini's real-time, in-depth analysis, showcased through the graph here. Uncover the dynamic job market trends for Mid Data roles in India, highlighting market share and opportunities for professionals.


Are You Looking for a Mid Data Engineer with Python & Snowflake (Jefferies) Job?

Great news! Capco is currently hiring and seeking a Mid Data Engineer with Python & Snowflake (Jefferies) to join their team. Feel free to download the job details.

Wait no longer! Are you also interested in exploring similar jobs? Search now on India Jobs Expertini.

The Work Culture

An organization's rules and standards set how people should be treated in the office and how different situations should be handled. The work culture at Capco adheres to the cultural norms as outlined by Expertini.

The fundamental ethical values are:

1. Independence

2. Loyalty

3. Impartiality

4. Integrity

5. Accountability

6. Respect for human rights

7. Obeying Indian laws and regulations

What Is the Average Salary Range for Mid Data Engineer with Python & Snowflake Jefferies Positions?

The average salary range for a Mid Data Engineer with Python & Snowflake (Jefferies) varies, but the pay scale is rated "Standard" in India. Salary levels may vary depending on your industry, experience, and skills. It's essential to research and negotiate effectively. We advise reading the full job specification before proceeding with the application to understand the salary package.

What Are the Key Qualifications for Mid Data Engineer with Python & Snowflake Jefferies?

Key qualifications for Mid Data Engineer with Python & Snowflake Jefferies typically include Computer Occupations, along with the qualifications and expertise mentioned in the job specification. Be sure to check the specific job listing for detailed requirements and qualifications.

How Can I Improve My Chances of Getting Hired for Mid Data Engineer with Python & Snowflake Jefferies?

To improve your chances of getting hired for Mid Data Engineer with Python & Snowflake Jefferies, consider enhancing your skills. Check your CV/Résumé Score with our free Tool. We have an in-built Resume Scoring tool that gives you the matching score for each job based on your CV/Résumé once it is uploaded. This can help you align your CV/Résumé according to the job requirements and enhance your skills if needed.

Interview Tips for Mid Data Engineer with Python & Snowflake Jefferies Job Success


Here are some tips to help you prepare for and ace your Mid Data Engineer with Python & Snowflake Jefferies job interview:

Before the Interview:

Research: Learn about Capco's mission, values, products, and the specific job requirements, and get further information about other openings.

Practice: Prepare answers to common interview questions and rehearse using the STAR method (Situation, Task, Action, Result) to showcase your skills and experiences.

Dress Professionally: Choose attire appropriate for the company culture.

Prepare Questions: Show your interest by having thoughtful questions for the interviewer.

Plan Your Commute: Allow ample time to arrive on time and avoid feeling rushed.

During the Interview:

Be Punctual: Arrive on time to demonstrate professionalism and respect.

Make a Great First Impression: Greet the interviewer with a handshake, smile, and eye contact.

Confidence and Enthusiasm: Project a positive attitude and show your genuine interest in the opportunity.

Answer Thoughtfully: Listen carefully, take a moment to formulate clear and concise responses. Highlight relevant skills and experiences using the STAR method.

Ask Prepared Questions: Demonstrate curiosity and engagement with the role and company.

Follow Up: Send a thank-you email to the interviewer within 24 hours.

Additional Tips:

Be Yourself: Let your personality shine through while maintaining professionalism.

Be Honest: Don't exaggerate your skills or experience.

Be Positive: Focus on your strengths and accomplishments.

Body Language: Maintain good posture, avoid fidgeting, and make eye contact.

Turn Off Phone: Avoid distractions during the interview.

Final Thought:

To prepare for your Mid Data Engineer with Python & Snowflake Jefferies interview at Capco, research the company, understand the job requirements, and practice common interview questions.

Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your relevant experience, including your approach to meeting targets as a team player. Additionally, review Capco's products or services and be prepared to discuss how you can contribute to their success.

By following these tips, you can increase your chances of making a positive impression and landing the job!

How to Set Up Job Alerts for Mid Data Engineer with Python & Snowflake Jefferies Positions

Setting up job alerts for Mid Data Engineer with Python & Snowflake Jefferies is easy with India Jobs Expertini. Simply visit our job alerts page, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!