Job Description
- 5+ years of experience in data engineering, with a focus on cloud-based solutions.
- Extensive experience with Google Cloud Platform (GCP) and its data services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer.
- Proven track record of designing and building scalable data pipelines and architectures.
- Experience with ETL tools and processes.
Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines using GCP services such as Dataflow, Pub/Sub, Cloud Functions, and Cloud Composer.
- Implement ETL (Extract, Transform, Load) processes to ingest data from various sources into GCP data warehouses such as BigQuery (an illustrative sketch follows below).
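For illustration only, a minimal sketch of the kind of batch ingestion described above, using the google-cloud-bigquery Python client; the project ID, bucket path, and dataset/table names are placeholders, not details of the role:

# Minimal sketch: load CSV files from Cloud Storage into a BigQuery table.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the CSV header row
    autodetect=True,              # infer the schema from the data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/events_*.csv",   # hypothetical source files
    "example_dataset.events",                 # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows into example_dataset.events")

In practice such a load step would typically be orchestrated by a Cloud Composer (Airflow) DAG or triggered by a Cloud Function, per the services listed above.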
Technical Skills:
- Proficiency in SQL and experience with database design and optimization.
- Strong programming skills in Python, Java, or other relevant languages.
- Experience with data modeling, data warehousing, and big data processing frameworks.
- Familiarity with data visualization tools (e.g., Looker, Data Studio) is a plus.
- Knowledge of machine learning workflows and tools is an advantage.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced environment and manage multiple tasks concurrently.
- Leadership skills and experience mentoring junior engineers.
Skills Required
GCP, BigQuery, Data Engineer, ETL