
Associate Consultant / Manager Consulting Expert – Data Streaming Architect – Real-Time Replication



Job description

Position Description:

Company Profile:
Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world.

With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions.

CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results.

CGI's most recently reported fiscal-year revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB).

Learn more at cgi.com.

Position: Associate Consultant / Manager Consulting Expert – Data Streaming Architect – Real-Time Replication
Experience: 10-16 years
Category: Software Development / Engineering
Location: Chennai / Mumbai
Shift Timing: General Shift
Position ID: J-
Employment Type: Full Time

Education Qualification: Bachelor's degree or higher in computer science or a related field, with a minimum of 10 years of relevant experience.

We are seeking a Data Streaming Architect to lead the design and stabilization of large-scale replication pipelines into Snowflake.

The source systems generate heavy transactional volumes that must be captured and replicated with strict SLAs.
You will be responsible for building and optimizing real-time CDC pipelines (Oracle LogMiner / SQL Server CDC / PostgreSQL CDC -> Kafka -> Snowflake) that deliver data in near real time, even under heavy workloads.
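
As a flavor of the capture leg of such a pipeline, the sketch below registers a Debezium PostgreSQL source connector through the Kafka Connect REST API. This is a minimal sketch, not anything specified in this posting: the endpoint, credentials, slot, and table names are hypothetical placeholders, and the config keys follow Debezium 2.x conventions.

```python
# Minimal sketch: registering a hypothetical Debezium PostgreSQL source
# connector via the Kafka Connect REST API. All hostnames, credentials,
# and table names are placeholders, not values from this posting.
import json

import requests

CONNECT_URL = "http://connect:8083/connectors"  # assumed Connect worker endpoint

connector = {
    "name": "orders-pg-cdc",  # hypothetical connector name
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "plugin.name": "pgoutput",            # PostgreSQL logical decoding plugin
        "database.hostname": "pg-primary",
        "database.port": "5432",
        "database.user": "cdc_user",
        "database.password": "********",
        "database.dbname": "orders",
        "slot.name": "orders_slot",           # replication slot, managed with DBAs
        "topic.prefix": "src.orders",         # Debezium 2.x topic naming
        "table.include.list": "public.orders",
        "snapshot.mode": "initial",           # full snapshot, then stream WAL changes
    },
}

resp = requests.post(CONNECT_URL, json=connector, timeout=30)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```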

This is a hands-on architecture role requiring deep technical expertise, operational maturity, and the ability to mentor engineers while engaging with cross-functional stakeholders.

Your future duties and responsibilities:

• Architect, stabilize, and scale real-time CDC pipelines from Oracle, SQL Server, and PostgreSQL into Snowflake.
• Ensure replication consistently meets near-real-time SLAs, even during high-volume transactional loads.
• Optimize Oracle LogMiner, SQL Server CDC, and PostgreSQL logical decoding / WAL-based replication for performance and integrity.
• Design and operate robust Kafka infrastructure (topics, partitions, replication, retention, lag management, schema registry).
• Configure and tune Kafka Connect / Debezium connectors and evaluate GoldenGate or other CDC tools where appropriate.
• Implement strong error handling, checkpointing, replay & backfill strategies to ensure fault-tolerance.
• Define and enforce schema evolution strategies with Avro, Protobuf, or JSON serialization.
• Collaborate with DBAs to manage redo logs, WAL files, and CDC configurations without impacting source systems.
• Design and optimize Snowflake ingestion (Snowpipe, streaming ingestion, staging) for high-throughput and low-latency delivery.
• Set up monitoring and observability: lag tracking, throughput, retries, failure alerts, anomaly detection (a minimal lag-check sketch follows this list).
• Lead capacity planning, scaling, and performance tuning to handle spikes gracefully.
• Establish best practices, governance, and technical standards for streaming data pipelines.
• Mentor engineers, review designs, and guide the team on modern streaming practices.
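
To make the lag-tracking responsibility concrete, here is a minimal sketch using the kafka-python client; the topic and consumer-group names are hypothetical, and a production setup would export these numbers as Prometheus gauges rather than print them.

```python
# Minimal sketch: measuring sink-side consumer lag per partition with
# kafka-python. Topic and group names are hypothetical.
from kafka import KafkaConsumer, TopicPartition

TOPIC = "src.orders.public.orders"   # assumed Debezium change topic
GROUP = "snowflake-sink"             # assumed sink consumer group id

consumer = KafkaConsumer(
    bootstrap_servers="kafka:9092",
    group_id=GROUP,
    enable_auto_commit=False,
)

# Assumes the topic already exists; partitions_for_topic returns its partition ids.
partitions = [TopicPartition(TOPIC, p) for p in consumer.partitions_for_topic(TOPIC)]
end_offsets = consumer.end_offsets(partitions)  # latest offset per partition

for tp in partitions:
    committed = consumer.committed(tp) or 0     # last offset the sink committed
    lag = end_offsets[tp] - committed           # outstanding events for this partition
    print(f"partition={tp.partition} lag={lag}")

consumer.close()
```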

Required qualifications to be successful in this role:

Must-have Skills:

• Experience: 10+ years in data engineering / architecture, with at least 3–5 years in real-time streaming and CDC replication.
• Databases: Deep expertise with Oracle redo logs / LogMiner / GoldenGate, SQL Server CDC, and PostgreSQL logical decoding / WAL.
• Streaming Platforms: Advanced hands-on with Apache Kafka, Kafka Connect, and Debezium.
• Snowflake Integration: Strong knowledge of Snowpipe, streaming ingestion, staging, clustering.
• Performance Engineering: Proven ability to tune pipelines to consistently keep replication lag within 15 minutes.
• Data Consistency: Strong in exactly-once semantics, idempotent writes, deduplication, and ordering guarantees (see the MERGE sketch after this list).
• Monitoring & Ops: Experience with Prometheus, Grafana, Confluent Control Center, JMX metrics, ELK/Splunk.
• Programming: Proficiency in Java/Scala or Python for custom connectors, orchestration, or automation.
• Cloud/Infra: Experience in AWS / Azure / GCP, containerization (Docker, Kubernetes), and infrastructure-as-code.
• Soft Skills: Problem-solving under SLA pressure, stakeholder communication, and team mentorship.
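
As one common shape of the idempotent-write and deduplication requirement, the sketch below merges staged CDC events into a Snowflake target using the snowflake-connector-python package. The connection values, table, and column names (raw_orders, order_id, change_seq) are hypothetical; real pipelines would drive this from staged CDC batches.

```python
# Minimal sketch: deduplicating CDC events into a Snowflake target with a
# MERGE keyed on the primary key plus a change-sequence column. Replaying
# the same batch yields the same end state, i.e. the write is idempotent.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user", password="********", account="my_account",  # placeholders
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)

MERGE_SQL = """
MERGE INTO orders AS tgt
USING (
    -- keep only the latest change event per key from the staged batch
    SELECT * FROM raw_orders
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY order_id ORDER BY change_seq DESC) = 1
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED AND src.change_seq > tgt.change_seq THEN UPDATE SET
    status = src.status, change_seq = src.change_seq
WHEN NOT MATCHED THEN INSERT (order_id, status, change_seq)
    VALUES (src.order_id, src.status, src.change_seq)
"""

with conn.cursor() as cur:
    cur.execute(MERGE_SQL)
conn.close()
```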

Good-to-have Skills:

• Stream processing engines (Flink, Kafka Streams, Spark Structured Streaming, ksqlDB); a ksqlDB sketch follows this list.
• Confluent Enterprise features (Tiered Storage, Schema Registry, RBAC).
• Metadata/lineage tools and data governance/security exposure.
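
Purely as a flavor of the stream-processing tooling above, this sketch submits a windowed aggregation to a ksqlDB server over its REST API; the endpoint, stream, and table names are hypothetical and assume orders_stream already exists.

```python
# Minimal sketch: creating a per-minute aggregate over a CDC-fed stream by
# POSTing a statement to ksqlDB's /ksql REST endpoint. Names are placeholders.
import requests

KSQL_URL = "http://ksqldb:8088/ksql"  # assumed ksqlDB server endpoint

statement = """
CREATE TABLE orders_per_minute AS
  SELECT order_id, COUNT(*) AS events
  FROM orders_stream
  WINDOW TUMBLING (SIZE 1 MINUTE)
  GROUP BY order_id
  EMIT CHANGES;
"""

resp = requests.post(
    KSQL_URL,
    json={"ksql": statement, "streamsProperties": {}},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```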

Skills:

  • Data Engineering
  • Snowflake
  • Oracle
  • PostgreSQL

  • Required Skill Profession: Other General


