
AWS Databricks Developer – Coforge, Bengaluru
Job description

We are seeking an experienced AWS Databricks Developer to design, develop, and maintain scalable data processing and analytics solutions on the AWS and Databricks platforms.

The ideal candidate will have strong expertise in building data pipelines, optimizing performance, and integrating multiple AWS services to deliver high-quality data solutions.


Key Responsibilities:

  • Overall 5 to 8 years of experience in the IT industry, with a minimum of 6 years in data engineering.

  • Design and develop solutions combining Cloudera, Palantir, and Databricks.

  • Develop Python-based automation scripts to optimize workflows and processes.

  • Integrate Databricks with AWS ecosystem components such as S3, Glue, Athena, Redshift, and Lambda.
  • Define architecture for data-driven solutions.

  • Ensure scalability, security, and performance of deployed solutions.

  • Collaborate with business teams, data engineers, and developers to align solutions with business goals.
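The "Python-based automation scripts" responsibility above can be sketched with a minimal, stdlib-only example. The step names, the retry policy, and the placeholder extract/transform functions are illustrative assumptions, not part of the posting:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def run_with_retries(step, retries=3, backoff_seconds=1.0):
    """Run a pipeline step, retrying on failure with linear backoff."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:  # in practice, catch narrower exception types
            log.warning("step %s failed (attempt %d/%d): %s",
                        step.__name__, attempt, retries, exc)
            if attempt == retries:
                raise
            time.sleep(backoff_seconds * attempt)

def extract():
    # placeholder for, e.g., pulling source files from S3
    return ["record-1", "record-2"]

def transform(records):
    # placeholder transformation step
    return [r.upper() for r in records]

records = run_with_retries(extract)
result = run_with_retries(lambda: transform(records))
log.info("pipeline produced %d records", len(result))
```

In a real Databricks/AWS deployment the placeholder steps would call boto3 or trigger notebook jobs; the retry-and-log pattern is the reusable part.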

Mandatory Skills:

  • Databricks: Expertise in designing jobs and workloads; proficiency in PySpark/Scala for data manipulation, transformation, and analysis; optimization of Databricks clusters and notebooks.

  • Python: Hands-on experience in automation and scripting.

  • Cloudera: Strong knowledge of Spark, Hive, Impala, HDFS, Kafka, HBase, and cluster management.

  • Palantir: Hands-on experience with analytics, data fusion, and developing analytical workflows.
  • DevOps basics: Familiarity with Jenkins CI/CD pipelines.

  • AWS integration: Experience integrating Databricks with AWS ecosystem components such as S3, Glue, Athena, Redshift, and Lambda.
  • Communication: Excellent verbal and written communication skills.

  • Fast learner: Ability to quickly grasp new technologies and adapt to changing requirements.
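The "designing jobs and workloads" skill can be illustrated with a sketch of a multi-task job definition in the Databricks Jobs API 2.1 JSON format. The job name, notebook paths, cluster sizing, cluster ID, and schedule here are all illustrative assumptions:

```json
{
  "name": "daily-sales-etl",
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": { "notebook_path": "/Repos/etl/ingest_from_s3" },
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "m5.xlarge",
        "num_workers": 2
      }
    },
    {
      "task_key": "transform",
      "depends_on": [ { "task_key": "ingest" } ],
      "notebook_task": { "notebook_path": "/Repos/etl/transform_sales" },
      "existing_cluster_id": "1234-567890-abcde123"
    }
  ],
  "schedule": {
    "quartz_cron_expression": "0 0 2 * * ?",
    "timezone_id": "Asia/Kolkata"
  }
}
```

The `depends_on` field is what turns independent notebooks into an ordered workload; per-task cluster choices (ephemeral `new_cluster` vs. a shared `existing_cluster_id`) are a common cost/performance trade-off.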

Good-to-have skills:

  • Familiarity with containerization technologies (Docker, Kubernetes).

  • Experience with real-time streaming technologies (e.g., Kafka Streams, Spark Streaming).

  • Exposure to machine learning operationalization (MLOps) workflows.

  • Background in regulated industries with strict data compliance requirements.

Additional Information:

Certifications: Databricks Certified Associate

  • Provide solution design for web applications and data pipelines.

  • Design solutions for data analytics requirements with Palantir.

  • Stay updated with emerging technologies and quickly adapt to new tools and frameworks.

  • Work with CI/CD pipelines.
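For the CI/CD items above, a minimal declarative Jenkinsfile sketch shows the shape of such a pipeline. The stage names, the `$JOB_ID` environment variable, the `job.json` file, and the use of the legacy Databricks CLI `jobs reset` command are illustrative assumptions:

```groovy
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                // run the project's test suite before anything ships
                sh 'python -m pytest tests/'
            }
        }
        stage('Deploy job config') {
            steps {
                // illustrative: push an updated job definition via the Databricks CLI
                sh 'databricks jobs reset --job-id $JOB_ID --json-file job.json'
            }
        }
    }
}
```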

