Job Title: GCP Data Engineer
Job Location: Chennai / Hyderabad / Bangalore / Pune / Gurgaon / Noida / NCR
Experience: 5 to 10 years in the IT industry planning, deploying, and configuring GCP-based solutions.
Requirements:
- Mandatory: knowledge of Big Data architecture patterns and experience delivering Big Data and Hadoop ecosystem solutions.
- Strong hands-on experience with GCP is required.
- Must have delivered multiple large projects using GCP BigQuery and ETL.
- Experience working on GCP-based Big Data deployments (batch/real-time) leveraging components such as BigQuery, Airflow, Google Cloud Storage, Data Fusion, Dataflow, Dataproc, etc. (see the PySpark sketch after this list).
- Should have experience with SQL and data warehousing.
- Expert in programming languages such as Java and Scala, with strong knowledge of the Hadoop ecosystem.
- Expert in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm, or Flink.
- Should have worked with at least one orchestration tool: Oozie, Airflow, Control-M, or similar, plus Kubernetes (an illustrative Airflow DAG sketch follows the preferred-skills list below).
- Hands-on experience with performance tuning, optimization, and data security.
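To illustrate the kind of batch work described above, here is a minimal PySpark sketch of a BigQuery read/transform/write job on Dataproc. It assumes the spark-bigquery connector is available on the cluster; the project, dataset, table, and bucket names are hypothetical placeholders.

```python
# Minimal PySpark sketch: batch read from BigQuery, aggregate, write back.
# Assumes the spark-bigquery connector is installed on the Dataproc cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq-batch-example").getOrCreate()

# Read a source table from BigQuery (hypothetical project/dataset/table).
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales_ds.orders")
    .load()
)

# A simple daily aggregation, standing in for real transformation logic.
daily_totals = (
    orders.groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Write the result back to BigQuery, staging through a GCS bucket
# (bucket name is a hypothetical placeholder).
(
    daily_totals.write.format("bigquery")
    .option("table", "my-project.sales_ds.daily_totals")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```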
Preferred Experience and Knowledge:
- Excellent understanding of the data technologies landscape/ecosystem.
- Good exposure to development with CI/CD pipelines.
- Knowledge of containerization, orchestration, and Google Kubernetes Engine would be an added advantage.
- Well versed in the pros and cons of database technologies such as relational, BigQuery, columnar, and NoSQL stores.
- Exposure to data governance, cataloging, lineage, and associated tools would be an added advantage.
- Well versed in SaaS, PaaS, and IaaS concepts, and able to drive clients to a decision.
- Good skills in Python and PySpark.
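To illustrate the orchestration skills listed above, here is a minimal Airflow DAG sketch that schedules a daily BigQuery job. It assumes the apache-airflow-providers-google package is installed; the DAG id, SQL, and table names are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: schedule a daily BigQuery query job.
# Assumes apache-airflow-providers-google is installed (Airflow 2.4+ syntax).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a BigQuery query job; the SQL itself materializes the result
    # into a destination table (hypothetical dataset/table names).
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_daily_totals",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE sales_ds.daily_totals AS "
                    "SELECT DATE(order_ts) AS order_date, "
                    "SUM(amount) AS total_amount "
                    "FROM sales_ds.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```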
Keywords:
GCP, BigQuery, Python, PySpark