
Senior Data Engineer – GTG – BLR | Getinz Techno Services, Bengaluru




Job description

Our client is a leading media and digital broadcasting organization reaching millions across the U.S., known for its trusted local news and innovative storytelling.

It values integrity, inclusion, and creativity while empowering employees to make a meaningful impact.

The company fosters continuous learning and growth in a collaborative, purpose-driven culture.

Role Overview:
We are looking for a Senior Data Engineer (Platform) to design, build, and optimize the data infrastructure that powers our Customer Data Platform.

This role is hands-on and focuses on delivering efficient, scalable, and reliable data solutions within our Snowflake environment.

Key Responsibilities:

  • Design, develop, and maintain large-scale batch and streaming data pipelines.

  • Build and optimize production-grade ETL/ELT workflows using tools such as dbt, Airflow, and Python/Scala/Java.

  • Model, implement, and manage dimensional and relational data models in Snowflake following architectural standards.

  • Write clean, maintainable, and well-tested code for data processing and transformation.

  • Collaborate with DevOps and Platform Engineering teams to ensure pipelines are reliable, performant, and monitored.

  • Participate in code reviews and promote best engineering practices.

  • Diagnose and resolve data pipeline issues related to performance, reliability, and quality.
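The batch-pipeline work described above can be pictured with a minimal extract-transform-load sketch in plain Python. This is an illustration only, not part of the role; the function and field names (`extract`, `transform`, `load`, `spend_usd`) are hypothetical, and a production pipeline would use the tools named in this posting (Airflow, dbt, Snowflake) rather than in-memory dicts:

```python
# Toy batch ETL sketch (illustrative; names are hypothetical, not from the posting).

def extract(rows):
    """Simulate pulling raw records from a source system."""
    return list(rows)

def transform(records):
    """Normalize and filter records, as a dbt model or Spark job might."""
    return [
        {"user_id": r["id"], "spend_usd": round(r["spend"], 2)}
        for r in records
        if r.get("spend", 0) > 0  # drop zero-spend records
    ]

def load(records, warehouse):
    """Idempotent upsert into an in-memory stand-in for a warehouse table."""
    for r in records:
        warehouse[r["user_id"]] = r
    return warehouse

if __name__ == "__main__":
    raw = [{"id": 1, "spend": 12.3456}, {"id": 2, "spend": 0}]
    table = load(transform(extract(raw)), {})
    print(table)  # {1: {'user_id': 1, 'spend_usd': 12.35}}
```

Keeping `load` idempotent (keyed upserts rather than blind appends) is what lets a failed pipeline run be safely retried, which is the reliability concern the responsibilities above call out.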

Required Skills:
Must-Have:

  • 10+ years of hands-on experience in data engineering.

  • Strong programming skills in Python, Scala, or Java.

  • Expert-level SQL proficiency with a strong focus on query optimization.

  • Proven and deep experience with Snowflake or similar modern cloud data warehouses.

  • Solid track record of building and orchestrating data pipelines using Airflow, dbt, or equivalent tools.

  • Hands-on experience with Apache Spark or other big data technologies.

  • Experience with containerization (Docker, Kubernetes) and CI/CD workflows.

Good-to-Have:

  • Familiarity with analytics and customer data platforms such as RudderStack, Amplitude, Kochava, or Braze.

  • Experience with real-time data streaming tools like Kafka or Kinesis.

  • Snowflake SnowPro Certification is a plus.

Eligibility / Qualifications:

  • Education: BTech / BE

  • Experience: 10-14 years

Other Details:

  • Notice Period: Immediate / 15 days

  • Work Type: Full-time / Permanent


Required Skill Profession

Computer Occupations


