Key Responsibilities 
- Design and Implement Data Architectures: Architect and build scalable, end-to-end data solutions on GCP, encompassing data ingestion, transformation, storage, and consumption.
 
 
- Develop Data Pipelines: Design and develop ETL/ELT data pipelines for batch and real-time processing, using tools such as Apache Airflow (Cloud Composer) together with Python and SQL (a brief pipeline sketch follows this list).
 
 
- Create Data Models: Build logical and physical data models, including dimensional modelling and schema design, to support data warehousing, data lakes, and analytics.
 
 
- Ensure Data Quality and Governance: Establish and enforce data governance, security, and quality standards, implementing data validation and testing procedures.
 
 
- Collaborate with Stakeholders: Work with data engineers, business analysts, data scientists, and product owners to translate business requirements into technical data solutions.
 
 
- Optimize GCP Services: Tune the performance and cost-effectiveness of GCP services, particularly BigQuery, for analytics and data storage.
 
 
- Provide Technical Guidance: Lead architectural reviews, provide technical guidance on cloud-native data strategies, and mentor engineering teams on GCP best practices.
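
To make the pipeline responsibility above concrete, here is a minimal sketch of a daily ELT DAG, assuming Airflow 2.4+ (Cloud Composer 2) with the Google provider package installed. The DAG id, bucket, dataset, and table names are hypothetical placeholders, not part of this role's actual stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_elt",          # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Ingest: load the day's raw files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",                     # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.stg_orders",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Transform: merge staged rows into the curated fact table inside BigQuery (ELT).
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": (
                    "MERGE `analytics.fct_orders` t "
                    "USING `analytics.stg_orders` s ON t.order_id = s.order_id "
                    "WHEN NOT MATCHED THEN INSERT ROW"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

The same pattern extends to real-time processing by swapping the batch load for a streaming source (for example, Pub/Sub and Dataflow feeding the staging table).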
 
 
Required Skills and Knowledge 
- Google Cloud Platform (GCP): Expertise with GCP services such as BigQuery, Cloud Storage, Cloud SQL, and Cloud Composer (a brief BigQuery tuning sketch follows this list).
 
 
- Data Modelling: Proficiency in designing data models for data warehouses and data lakes.
 
 
- ETL/ELT: Experience with designing and building data pipelines using tools like Apache Airflow.
 
 
- Programming: Strong skills in SQL and Python for data processing and development.
 
 
- Data Governance: Understanding of data governance, metadata management, and security policies, and the ability to implement them.
 
 
- Collaboration: Strong communication skills to work with cross-functional teams and explain complex technical concepts.
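
As a concrete illustration of the BigQuery skills above, here is a minimal sketch, using the google-cloud-bigquery Python client, of two common cost and performance levers: partitioning and clustering a table, and dry-running a query to estimate bytes scanned before it is billed. The project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # hypothetical project id

# Partition the fact table by order date and cluster by customer_id so that
# date-filtered, customer-scoped queries read far fewer bytes.
ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.fct_orders`
(
  order_id    STRING,
  customer_id STRING,
  order_date  DATE,
  amount      NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_id
"""
client.query(ddl).result()

# Dry run: estimate bytes processed without running (or paying for) the query.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
query = """
SELECT customer_id, SUM(amount) AS total_spend
FROM `example-project.analytics.fct_orders`
WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY customer_id
"""
job = client.query(query, job_config=job_config)
print(f"Estimated bytes processed: {job.total_bytes_processed}")
```

Because the WHERE clause filters on the partitioning column, BigQuery prunes untouched partitions, which is typically the single biggest lever for both query cost and latency.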