
Python Databricks Developer at Vertical Relevance



Job description

Salary: Competitive, paid in Indian Rupees (INR) per annum.



Job Title: Python Databricks Developer

Location: Pune

Experience: 6 to 10 Years

Employment Type: Full-Time


Job Summary:

We are looking for an experienced Python Databricks Developer who is proficient in building scalable data pipelines, data transformation logic, and cloud-native analytics solutions using Python, Databricks, and AWS services.

The ideal candidate should have strong data engineering experience and be comfortable working in fast-paced, Agile environments.

Key Responsibilities:

  • Design and develop scalable ETL pipelines and data workflows using Databricks (PySpark) and Python (see the pipeline sketch after this list).
  • Work on large-scale data ingestion, processing, and transformation from various sources.
  • Leverage AWS services (e.g., S3, Glue, Lambda, Redshift, EMR) for data storage, orchestration, and compute.
  • Optimize performance of Spark jobs and Databricks notebooks for large-scale data operations.
  • Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions.
  • Implement best practices for data quality, data governance, and security.
  • Participate in code reviews, testing, and deployment of data solutions in DevOps-driven environments.
  • Create and maintain technical documentation, data dictionaries, and process flows.
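
To make the pipeline work above concrete, here is a minimal PySpark sketch of the kind of ETL job this role involves: ingest raw JSON from S3, filter and derive a column, and write a partitioned Delta table. The bucket names, paths, and column names are illustrative placeholders, not details from this posting.

    # Minimal Databricks-style ETL sketch (illustrative names throughout).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # On Databricks a session already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Ingest: raw order events landed in S3 (hypothetical bucket/path).
    raw = spark.read.json("s3://example-raw-bucket/orders/")

    # Transform: keep completed orders and derive a per-order total.
    orders = (
        raw.filter(F.col("status") == "COMPLETED")
           .withColumn("order_total", F.col("quantity") * F.col("unit_price"))
           .select("order_id", "customer_id", "order_total", "order_date")
    )

    # Load: write a Delta table, partitioned by date for efficient reads.
    (orders.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .save("s3://example-curated-bucket/orders/"))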

Required Skills & Experience:

  • Strong hands-on programming experience in Python.
  • At least 4 years of experience working with Databricks, especially with PySpark and Delta Lake.
  • Experience in building and managing data pipelines and ETL processes in cloud environments, particularly AWS.
  • Solid understanding of distributed computing concepts and Spark performance optimization.
  • Hands-on experience with AWS services such as S3, Glue, Lambda, Redshift, Athena, CloudWatch, etc.
  • Experience with version control (e.g., Git), CI/CD tools, and workflow orchestration tools such as Airflow or Databricks Jobs (see the orchestration sketch after this list).
  • Knowledge of data modeling, data warehousing, and data lake architectures.
  • Strong problem-solving skills and the ability to work independently or in a team.
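
As an illustration of the orchestration experience listed above, the sketch below is a minimal Airflow DAG that triggers an existing Databricks job daily. It assumes Airflow 2.4+ with the apache-airflow-providers-databricks package installed and a configured Databricks connection; the DAG id, connection name, and job id are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import (
        DatabricksRunNowOperator,
    )

    with DAG(
        dag_id="daily_orders_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # one run per day
        catchup=False,      # do not backfill missed runs
    ) as dag:
        # Trigger a pre-defined Databricks job (e.g., the ETL sketch above).
        run_etl = DatabricksRunNowOperator(
            task_id="run_orders_etl",
            databricks_conn_id="databricks_default",  # Airflow connection name
            job_id=12345,  # hypothetical Databricks job id
        )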

Preferred Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Certification in AWS or Databricks is a plus.
  • Experience working in Agile environments with Jira or similar tools.
  • Familiarity with SQL and NoSQL databases is a plus.




