Job Description
<p><b>Position:</b> Data Architect (Data Engineering)</p>
<p><b>Experience:</b> 10+ years (7+ in Data Engineering)</p>
<p><b>Location:</b> Bengaluru, Telangana, Pune, Noida, Chennai, Faridabad</p>
<p><b>Job Description:</b></p>
<p>Looking for an experienced Data Architect to design and optimize cloud-native, enterprise-scale data platforms. The role involves driving data strategy, building scalable pipelines, and enabling analytics and AI/ML initiatives.</p>
<p><b>Key Skills:</b></p>
<p>- 10+ years of IT experience (7+ in Data Engineering)</p>
<p>- Cloud platforms: AWS, Azure, GCP (S3, Glue, Redshift, Synapse, BigQuery, etc.)</p>
<p>- Big Data: Spark, Hadoop, Databricks</p>
<p>- SQL & NoSQL, data modeling, schema optimization</p>
<p>- Python/Java/Scala programming</p>
<p>- Streaming: Kafka, Kinesis, Pub/Sub</p>
<p>- ETL tools: Informatica, Talend, Airflow, DBT</p>
<p>- Governance, lineage, and data security frameworks</p>
<p>- Preferred: Data Mesh/Fabric, Lakehouse, Docker/Kubernetes, MDM, cloud certifications</p>
<p><b>Overview:</b></p>
<p>We are seeking an accomplished Data Architect with deep expertise in designing, building, and optimizing cloud-native, enterprise-scale data platforms.
The ideal candidate will define and drive the organization's data strategy, ensure scalable and secure data architecture, and enable advanced analytics and AI/ML initiatives.
This role requires strong technical depth, architectural leadership, and the ability to collaborate with diverse stakeholders to deliver high-performance data solutions.</p>
<p><b>Key Responsibilities:</b></p>
<p>- Define and own the data architecture vision, strategy, and roadmap for enterprise-level platforms.</p>
<p>- Design cloud-native, scalable, and secure data solutions across AWS, Azure, and/or GCP.</p>
<p>- Establish data modeling standards, schema optimization techniques, and data design patterns.</p>
<p>- Architect data lakes, data warehouses, lakehouses, and data mesh/fabric solutions.</p>
<p>- Build and optimize ETL/ELT pipelines using tools such as Informatica, Talend, Airflow, DBT, and Glue.</p>
<p>- Leverage big data technologies (Spark, Hadoop, Databricks) for large-scale batch and streaming workloads.</p>
<p>- Implement real-time streaming pipelines with Kafka, Kinesis, or Pub/Sub.</p>
<p>- Guide teams in data ingestion, transformation, storage, and consumption frameworks.</p>
<p>- Implement data governance frameworks, metadata management, and lineage tracking.</p>
<p>- Define policies for data privacy, security, and compliance (GDPR, HIPAA, etc.).</p>
<p>- Enforce data quality standards, validation rules, and stewardship practices.</p>
<p>- Partner with business stakeholders, data scientists, and application teams to translate requirements into architectural solutions.</p>
<p>- Provide technical leadership and mentorship to data engineers and solution architects.</p>
<p>- Lead architectural reviews, POCs, and evaluations of new tools and technologies.</p>
<p>- Drive adoption of modern paradigms such as data mesh, data fabric, and lakehouse architectures.</p>
<p>- Stay updated on emerging trends in cloud, AI/ML, containerization, and data ecosystems.</p>
<p>- Continuously optimize the cost, performance, and scalability of cloud-based data platforms.</p>
<p><b>Skills & Qualifications:</b></p>
<p>- 10-15 years in IT, with at least 7 years in data engineering and architecture.</p>
<p>- Proven track record of designing and delivering enterprise-scale, cloud-native data solutions.</p>
<p>- Cloud platforms: AWS (S3, Glue, Redshift), Azure (Synapse, Data Lake, Data Factory), GCP (BigQuery, Pub/Sub).</p>
<p>- Big data & processing: Spark, Hadoop, Databricks.</p>
<p>- Databases: strong SQL (RDBMS) and NoSQL (MongoDB, Cassandra, DynamoDB).</p>
<p>- Programming: proficiency in Python, Java, or Scala.</p>
<p>- Streaming: Kafka, Kinesis, or Google Pub/Sub.</p>
<p>- ETL/ELT tools: Informatica, Talend, Airflow, DBT, or equivalent.</p>
<p>- Data governance & security: data lineage, cataloging, encryption, access control.</p>
<p>- Experience with Data Mesh, Data Fabric, or Lakehouse architectures.</p>
<p>- Familiarity with Docker, Kubernetes, and container-based deployments.</p>
<p>- Exposure to MDM (Master Data Management) practices.</p>
<p>- Cloud certifications (AWS, Azure, or GCP) strongly preferred.</p>
<p>- Strong leadership and stakeholder management.</p>
<p>- Excellent communication and presentation abilities.</p>
<p>- Analytical mindset with problem-solving skills.</p>
<p>- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.</p>
<p>- Competitive salary as per industry standards.</p>
<p>- Opportunity to architect solutions for large-scale, global enterprises.</p>
<p>- Work with cutting-edge cloud, big data, and AI/ML technologies.</p>
<p>- Leadership role with career advancement opportunities.</p>
<p>(ref:hirist.tech)</p>