
(Urgent Search) Data Engineer

Hyderabad, India – 100x.inc



Job description

About Costco Wholesale

Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries.

They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India

At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier.

Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.

Position Title: Data Engineer L3

Roles & Responsibilities:

Lead the design and implementation of enterprise data platforms: Architect and oversee the deployment of scalable, reliable, and secure data infrastructure for large organizations.

Drive innovation and adoption of new technologies: Research and integrate cutting-edge tools and frameworks for data ingestion, processing, and governance.

Mentor and guide junior and mid-level data engineers: Provide technical leadership, code reviews, and career development support.

Collaborate with stakeholders across teams: Align data engineering initiatives with business objectives and ensure cross-functional alignment.

Establish and enforce data engineering best practices: Define standards for pipeline architecture, data quality, security, and compliance across the organization.

Present findings and recommendations to senior leadership: Communicate technical concepts and business impacts to executives and decision-makers.

Technical Skills (8–12 years of experience):

Expert-level proficiency in programming, automation, and orchestration: Mastery of Python and workflow orchestration tools (see the orchestration sketch below).
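For illustration only, here is a minimal sketch of Python-based workflow orchestration as an Airflow DAG (the engine that Cloud Composer offers as a managed service). The DAG id, schedule, and task bodies are hypothetical placeholders, not details from this posting.

    # Minimal Airflow DAG sketch: a daily pipeline with two dependent tasks.
    # DAG id, schedule, and task logic are illustrative assumptions.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder ingestion step (e.g. landing source files in a lake).
        print("extracting source data")

    def load():
        # Placeholder load step (e.g. writing to a warehouse table).
        print("loading into warehouse")

    with DAG(
        dag_id="example_daily_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # load runs only after extract succeeds

The >> operator records the task dependency in the DAG, which is the core idea behind the orchestration tools named here.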

Deep understanding of data storage, processing, and governance: Advanced knowledge of data warehousing, Lakehouse architectures, and real-time streaming.

Proven ability to build and deploy scalable data systems: Design and implement robust, production-grade data platforms on GCP.

Experience with big data technologies: Use Dataflow, Dataproc, Pub/Sub, or similar for large-scale data processing (see the streaming sketch below).
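As a hedged sketch of the Pub/Sub-plus-Dataflow pattern referred to above, the Apache Beam pipeline below (Beam is the SDK that Dataflow executes) counts events per one-minute window from a streaming topic. The project and topic names are assumptions, and running it on Dataflow would additionally require DataflowRunner pipeline options.

    # Streaming sketch: count Pub/Sub messages in fixed one-minute windows.
    # Project and topic names are hypothetical.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.transforms.window import FixedWindows

    def run():
        options = PipelineOptions()
        options.view_as(StandardOptions).streaming = True  # unbounded source

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/events")  # hypothetical
                | "Window" >> beam.WindowInto(FixedWindows(60))  # 1-minute windows
                | "PairWithOne" >> beam.Map(lambda _msg: ("events", 1))
                | "CountPerWindow" >> beam.CombinePerKey(sum)
                | "Print" >> beam.Map(print)  # swap for a real sink in production
            )

    if __name__ == "__main__":
        run()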

Strong security and compliance expertise: Implement and enforce security controls, encryption, audit logging, and compliance standards for data systems.

Excellent communication and presentation skills: Articulate technical concepts and business value to diverse audiences.

Must Have Skills:

• Python and orchestration tools (e.g. Airflow, Cloud Composer)
• Data architecture: data lakes, warehouses, streaming (e.g. Pub/Sub, Dataflow, Dataproc)
• Experience with GCP and production-grade data platform deployment
• Data security, compliance, and governance standards
• Data modeling: experience with different techniques (dimensional and relational modeling) and the ability to design scalable data models
• Expert-level SQL skills
• Deep understanding of BigQuery, including partitioning, clustering, and performance optimization (see the sketch after this list)
• Experience with Cloud Functions, Composer, Cloud Run, and Dataflow Flex Templates; should be able to write Flex Templates
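To make the BigQuery partitioning and clustering bullet concrete, here is a small sketch using the google-cloud-bigquery client library. The project, dataset, table, and column names are hypothetical assumptions, not details from this posting.

    # Sketch: create a day-partitioned, clustered BigQuery table.
    # Project, dataset, table, and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project

    table = bigquery.Table(
        "example-project.analytics.orders",  # hypothetical table id
        schema=[
            bigquery.SchemaField("order_id", "STRING"),
            bigquery.SchemaField("customer_id", "STRING"),
            bigquery.SchemaField("order_ts", "TIMESTAMP"),
            bigquery.SchemaField("amount", "NUMERIC"),
        ],
    )
    # Partition by day on the timestamp column so queries that filter on
    # order_ts scan only the matching days.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="order_ts"
    )
    # Cluster within each partition so filters on these columns read fewer blocks.
    table.clustering_fields = ["customer_id", "order_id"]

    client.create_table(table)

Partitioning lets BigQuery prune whole days from a scan, and clustering sorts data within each partition; together they are the usual first lever for the performance optimization this bullet asks about.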


Required Skill Profession: Computer Occupations





