Urgent: GCP Data Engineer Job Opening in New Delhi – Brillio Is Now Hiring
About the Company:
Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into competitive advantage through innovative digital adoption.
Brillio is renowned for its world-class professionals, known as Brillians, who combine cutting-edge digital and design-thinking skills with an unwavering dedication to client satisfaction.
As an employer of choice, Brillio consistently attracts exceptional talent through its emphasis on contemporary, groundbreaking technologies and exclusive digital projects.
Brillio's commitment to providing an exceptional experience for its Brillians and nurturing their full potential has earned it the Great Place to Work® certification year after year.
About the Role:
We’re hiring a Senior GCP Data Engineer to design and build scalable data solutions using Google Cloud’s advanced tools.
This role demands hands-on expertise in BigQuery, Dataflow, Airflow, and Python, with a strong foundation in SQL and large-scale data architecture.
You’ll work on high-impact analytics platforms, collaborating across teams to deliver clean, secure, and efficient data pipelines.
Required Skills:
- 4+ years of experience in data engineering preferred
- 3+ years of hands-on experience with the GCP data suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage
- Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms
- Strong hands-on experience in the following technologies (a brief orchestration sketch follows this skills list):
  1. BigQuery (GBQ)
  2. Python
  3. Apache Airflow
  4. SQL (BigQuery preferred)
- Extensive hands-on experience working with data using SQL and Python
- Experience with Cloud Functions
- Comparable skills in AWS or other cloud big data engineering stacks are also considered
- Experience with agile development methodologies
- Excellent verbal and written communications skills with the ability to clearly present ideas, concepts, and solutions
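The skills above center on orchestrating BigQuery work with Airflow, Python, and SQL. As a rough illustration only, the sketch below shows a minimal Airflow DAG that runs a scheduled BigQuery SQL transformation; it assumes Airflow 2.x with the apache-airflow-providers-google package and a configured GCP connection, and every project, dataset, and table name is a hypothetical placeholder rather than anything from this posting.

```python
# A minimal Airflow DAG sketch (assumes Airflow 2.x with the
# apache-airflow-providers-google package and a configured GCP connection).
# Project, dataset, and table names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_rollup",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run an ELT-style SQL transformation entirely inside BigQuery.
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.raw.orders`      -- hypothetical source table
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": "my-project",        # hypothetical project
                    "datasetId": "analytics",
                    "tableId": "daily_order_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In practice a DAG like this would also carry data-quality checks and alerting, and the SQL would usually live in versioned files rather than inline strings.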
Qualifications:
- Bachelor's Degree in Computer Science, Information Technology, or closely related discipline
Responsibilities:
- Design, build, and maintain scalable data pipelines using GCP tools such as BigQuery, Dataflow (Apache Beam), Pub/Sub, and Airflow/Composer (see the streaming pipeline sketch after this list)
- Architect and operationalize large-scale data warehouses, data lakes, and analytics platforms
- Write efficient, production-grade code in Python and SQL for data transformation and analysis
- Implement and manage Cloud Functions and other GCP-native services for automation and integration
- Collaborate with cross-functional teams to understand data requirements and deliver robust solutions
- Ensure data quality, security, and performance across all stages of the pipeline
- Participate in Agile ceremonies and contribute to iterative development cycles
- Communicate technical concepts clearly to stakeholders through documentation and presentations
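To make the first responsibility more concrete, here is a minimal, illustrative sketch (not Brillio's actual codebase) of a streaming Dataflow pipeline in Python that reads JSON messages from Pub/Sub and appends them to BigQuery. The subscription, table, and schema names are hypothetical, and a production pipeline would add windowing, error handling, and schema management.

```python
# A minimal streaming Dataflow (Apache Beam) sketch: read JSON messages from
# Pub/Sub and append them to a BigQuery table. Subscription, table, and schema
# names are hypothetical; a real pipeline would add windowing, dead-lettering,
# and schema management.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    # Pass --runner=DataflowRunner, --project, --region, --temp_location, etc.
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/orders-sub"  # hypothetical
            )
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.orders",  # hypothetical table
                schema="order_id:STRING,amount:FLOAT,order_date:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

A pipeline of this shape would typically be launched with the Dataflow runner options shown in the comment and then scheduled or monitored from Airflow/Composer alongside the batch workloads.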
Preferred Skills:
- Comparable skills in AWS or other cloud big data engineering stacks are also considered
To apply, click here: https://tinyurl.com/bdesksnc
Unlock Your GCP Data Potential: Insight & Career Growth Guide
Real-Time GCP Data Jobs Trends in New Delhi, India
Expertini's real-time analysis shows approximately 142,959 GCP Data jobs across India and 4,825 in New Delhi. These figures highlight the market share and opportunities available to professionals in GCP Data roles and give a clearer picture of the job market landscape in the region.
Great news! Brillio is currently hiring and seeking a GCP Data Engineer to join their team. Feel free to download the job details.
Wait no longer! Are you also interested in exploring similar jobs? Search now: GCP Data Engineer Jobs New Delhi.
An organization's rules and standards set how people should be treated in the office and how different situations should be handled. The work culture at Brillio adheres to the cultural norms as outlined by Expertini.
The average salary range for GCP Data Engineer jobs in India varies, but the pay scale is rated "Standard" in New Delhi. Salary levels may vary depending on your industry, experience, and skills, so it's essential to research and negotiate effectively. We advise reading the full job specification before applying to understand the salary package.
Key qualifications for a GCP Data Engineer typically fall under the Computer Occupations category, along with the qualifications and expertise listed in the job specification. Be sure to check the specific job listing for detailed requirements.
To improve your chances of getting hired as a GCP Data Engineer, consider enhancing your skills. Our free, in-built Resume Scoring Tool gives you a matching score for each job once your CV/Résumé is uploaded, helping you align it with the job requirements and identify skills to strengthen.
Here are some tips to help you prepare for and ace your job interview:
Before the interview: To prepare for your GCP Data Engineer interview at Brillio, research the company, understand the job requirements, and practice common interview questions.
Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your relevant experience, including your approach to meeting targets as a team player. Additionally, review Brillio's products and services and be prepared to discuss how you can contribute to their success.
By following these tips, you can increase your chances of making a positive impression and landing the job!
Setting up job alerts for GCP Data Engineer is easy with India Jobs Expertini. Simply visit our job alerts page, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!