
CDP Data Architect – Acqueon, Bengaluru




Job description

About the Job


We are building a Customer Data Platform (CDP) designed to unlock the full potential of customer experience (CX) across our products and services.

This role offers the opportunity to design and scale a platform that unifies customer data from multiple sources, ensures data quality and governance, and provides a single source of truth for analytics, personalization, and engagement.
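To make the "single source of truth" idea concrete, here is a minimal, hedged sketch of the kind of record unification a CDP performs. The source names, field names, and merge rule (first non-null value wins, keyed on normalized email) are purely illustrative assumptions, not Acqueon's actual implementation, which would also cover identity resolution, conflict policies, and governance.

```python
# Illustrative only: merge customer records from two hypothetical sources
# into one profile per normalized email key. A production CDP adds identity
# resolution, conflict rules, data-quality checks, and governance.

def unify_profiles(sources):
    """Merge per-source customer records into one profile per email key."""
    profiles = {}
    for source_name, records in sources.items():
        for record in records:
            key = record["email"].strip().lower()  # simple identity key
            profile = profiles.setdefault(key, {"email": key, "sources": []})
            profile["sources"].append(source_name)
            for field, value in record.items():
                if field != "email" and value is not None:
                    profile.setdefault(field, value)  # first non-null wins
    return profiles

crm = [{"email": "Ana@Example.com", "name": "Ana", "phone": None}]
web = [{"email": "ana@example.com", "phone": "+91-555-0100"}]
unified = unify_profiles({"crm": crm, "web": web})
print(unified["ana@example.com"]["sources"])  # → ['crm', 'web']
```

The `sources` list on each profile preserves lineage, which is what lets downstream analytics and personalization trust the unified record.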

As a key member of the data engineering team, you will architect and implement the pipelines, storage layers, and integrations that power our CDP.

You’ll work with product, data science, and engineering stakeholders to deliver a robust platform that supports real-time decision-making, personalization at scale, and actionable customer insights.


This is a high-impact, hands-on engineering role where you’ll shape the data foundation that directly influences how we understand and serve our customers.


What you will be doing

  • Design and implement Five9’s Customer Data Platform solutions and enable production use cases for customers
  • Design and develop highly scalable and resilient services for ingesting large scale datasets
  • Demonstrate strong ownership by ensuring operational excellence with a sharp focus on monitoring, observability, and system reliability
  • Develop and orchestrate ETL/ELT pipelines using Apache Airflow
  • Collaborate with cross-functional partners and lead technical initiatives independently, end to end
  • Design, build, and optimize workloads on distributed query engines such as Apache Spark or Snowflake to support complex data processing
  • Write technical design proposals, and review and give feedback on proposals from others
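The ETL/ELT orchestration mentioned above can be sketched as a minimal extract → transform → load pipeline of the kind Airflow would schedule as a DAG. This is an assumed illustration using plain Python callables; the task names, sample events, and sink are hypothetical, not the team's actual DAGs.

```python
# Minimal sketch of an extract -> transform -> load pipeline of the kind
# Apache Airflow would orchestrate as a DAG. Data and names are hypothetical.

def extract():
    # In production this would read from a source system (API, Kafka, S3, ...).
    return [{"customer_id": 1, "event": "login"},
            {"customer_id": 2, "event": "purchase"}]

def transform(rows):
    # Normalize/enrich records; real pipelines add validation and governance.
    return [{**row, "event": row["event"].upper()} for row in rows]

def load(rows, sink):
    # Stand-in for a warehouse write (e.g. a Snowflake COPY or MERGE).
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # → 2
```

In Airflow, each callable would become a task and the dependency chain `extract >> transform >> load` would be declared in the DAG definition, giving retries, scheduling, and observability per step.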


Skills


  • 5–7 years of software/data engineering and data platform experience
  • Extensive experience with data transformation and modeling, including advanced features and best practices
  • Deep knowledge of Snowflake for data warehousing, including optimization, security, and cost management
  • Good understanding of data streaming technologies such as Kafka/Kafka Connect
  • Strong knowledge of highly scalable distributed systems, microservices, Rest APIs
  • Strong proficiency in an object-oriented and/or functional programming language such as Java, .NET, or Python, plus SQL, for data processing, transformation, and pipeline development
  • Knowledge of Apache Airflow for workflow orchestration is nice to have
  • Understanding of containerization and Kubernetes concepts
  • Experience with the AWS cloud platform and infrastructure as code practices
  • A track record of maintaining high standards of code quality, with a keen eye for test automation and operational excellence
  • Track record of delivering scalable data engineering solutions that support analytics, machine learning, and operational use cases
  • Excellent written and verbal communication and interpersonal skills
  • Bachelor's degree in Computer Science, Engineering or related field, or equivalent training, fellowship, or work experience


Required Skill Profession

Computer Occupations


