Expertini Resume Scoring: Our Semantic Matching Algorithm evaluates your CV/Résumé before you apply for this job role: Data Architect - Python/SQL.
Urgent! Data Architect - Python/SQL Job Opening in Bengaluru – Careerfit.ai Is Now Hiring
Position: Data Architect
Experience: 10+ years (7+ in Data Engineering)
Location: Bengaluru, Telangana, Pune, Noida, Chennai, Faridabad

Description:
Looking for an experienced Data Architect to design and optimize cloud-native, enterprise-scale data platforms. The role involves driving data strategy, building scalable pipelines, and enabling analytics & AI/ML initiatives.

Key Skills:
- 10+ yrs IT experience (7+ in Data Engineering)
- Cloud platforms: AWS, Azure, GCP (S3, Glue, Redshift, Synapse, BigQuery, etc.)
- Big Data: Spark, Hadoop, Databricks
- SQL & NoSQL, data modeling, schema optimization
- Python/Java/Scala programming
- Streaming: Kafka, Kinesis, Pub/Sub
- ETL tools: Informatica, Talend, Airflow, DBT
- Governance, lineage, data security frameworks
- Preferred: Data Mesh/Fabric, Lakehouse, Docker/Kubernetes, MDM, cloud certifications

Overview:
We are seeking an accomplished Data Architect with deep expertise in designing, building, and optimizing cloud-native, enterprise-scale data platforms.
The ideal candidate will define and drive the organization's data strategy, ensure scalable and secure data architecture, and enable advanced analytics and AI/ML initiatives.
This role requires strong technical depth, architectural leadership, and the ability to collaborate with diverse stakeholders to deliver high-performance data solutions.

Key Responsibilities:
- Define and own the data architecture vision, strategy, and roadmap for enterprise-level platforms.
- Design cloud-native, scalable, and secure data solutions across AWS, Azure, and/or GCP.
- Establish data modeling standards, schema optimization techniques, and data design patterns.
- Architect data lakes, data warehouses, lakehouses, and data mesh/fabric solutions.
- Build and optimize ETL/ELT pipelines using tools such as Informatica, Talend, Airflow, DBT, and Glue.
- Leverage big data technologies (Spark, Hadoop, Databricks) for large-scale batch and streaming workloads (a minimal batch-pipeline sketch in Python follows this listing).
- Implement real-time streaming pipelines with Kafka, Kinesis, or Pub/Sub.
- Guide teams in data ingestion, transformation, storage, and consumption frameworks.
- Implement data governance frameworks, metadata management, and lineage tracking.
- Define policies for data privacy, security, and compliance (GDPR, HIPAA, etc.).
- Enforce data quality standards, validation rules, and stewardship practices.
- Partner with business stakeholders, data scientists, and application teams to translate requirements into architectural solutions.
- Provide technical leadership and mentorship to data engineers and solution architects.
- Lead architectural reviews, POCs, and evaluations of new tools and technologies.
- Drive adoption of modern paradigms such as data mesh, data fabric, and lakehouse architectures.
- Stay updated on emerging trends in cloud, AI/ML, containerization, and data ecosystems.
- Continuously optimize cost, performance, and scalability of cloud-based data platforms.

Required Skills & Qualifications:
- 10-15 years in IT, with at least 7 years in data engineering and architecture.
- Proven track record of designing and delivering enterprise-scale, cloud-native data solutions.
- Cloud platforms: AWS (S3, Glue, Redshift), Azure (Synapse, Data Lake, Data Factory), GCP (BigQuery, Pub/Sub).
- Big data & processing: Spark, Hadoop, Databricks.
- Databases: strong SQL (RDBMS), NoSQL (MongoDB, Cassandra, DynamoDB).
- Programming: proficiency in Python, Java, or Scala.
- Streaming: Kafka, Kinesis, or Google Pub/Sub.
- ETL/ELT tools: Informatica, Talend, Airflow, DBT, or equivalent.
- Data governance & security: data lineage, cataloging, encryption, access control.
- Experience with Data Mesh, Data Fabric, or Lakehouse architectures.
- Familiarity with Docker, Kubernetes, and container-based deployments.
- Exposure to MDM (Master Data Management) practices.
- Cloud certifications (AWS, Azure, or GCP) strongly preferred.
- Strong leadership and stakeholder management.
- Excellent communication and presentation abilities.
- Analytical mindset with problem-solving skills.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

- Competitive salary as per industry standards.
- Opportunity to architect solutions for large-scale, global enterprises.
- Work with cutting-edge cloud, big data, and AI/ML technologies.
- Leadership role with career advancement opportunities.

(ref:hirist.tech)
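The batch-pipeline and big-data items above are illustrated with a minimal, hedged sketch of the kind of Spark transformation this role would design: ingest raw data, apply basic cleansing and conformance, and write partitioned, analytics-ready output. The dataset, paths, and column names are hypothetical placeholders, not details taken from the posting.

```python
# Minimal PySpark batch-pipeline sketch (illustrative only).
# Assumptions: pyspark is installed; the input path, output path, and
# column names (order_id, order_ts) are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_batch").getOrCreate()

# Ingest raw JSON events from a hypothetical landing zone.
orders = spark.read.json("/data/raw/orders/")

# Basic cleansing and conformance to a warehouse-friendly schema.
clean = (
    orders
    .filter(F.col("order_id").isNotNull())            # drop incomplete records
    .dropDuplicates(["order_id"])                      # enforce one row per order
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))   # partition key
)

# Write partitioned Parquet into a curated zone for downstream analytics.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("/data/curated/orders/"))

spark.stop()
```

Partitioning by a date column and writing columnar Parquet is one common way to meet the schema-optimization and cost/performance expectations the posting describes; the actual design would depend on the target cloud and tooling.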
Unlock Your Data Architect Potential: Insight & Career Growth Guide
Real-Time Data Architect Job Trends in Bengaluru, India (Graphical Representation)
Explore Expertini's real-time, in-depth analysis through the graph below, which shows job market trends for Data Architect roles in Bengaluru, India: a bar chart represents the number of jobs available, and a trend line illustrates the trend over time. The graph shows 255,975 jobs across India and 17,690 jobs in Bengaluru, meaning Bengaluru accounts for roughly 7% of the national total. This analysis highlights market share and opportunities for professionals in Data Architect roles and gives a clearer picture of the job market landscape in these regions.
Great news! Careerfit.ai is currently hiring a Data Architect - Python/SQL to join its team. Feel free to download the job details.
Wait no longer! Are you also interested in exploring similar jobs? Search now: Data Architect Python/SQL Jobs Bengaluru.
An organization's rules and standards set how people should be treated in the office and how different situations should be handled. The work culture at Careerfit.ai adheres to the cultural norms as outlined by Expertini.
The average salary for Data Architect - Python/SQL jobs in India varies, but the pay scale is rated "Standard" in Bengaluru. Salary levels depend on your industry, experience, and skills, so it's essential to research and negotiate effectively. We advise reading the full job specification before applying so you understand the salary package.
Key qualifications for Data Architect - Python/SQL typically fall under Computer Occupations, along with the specific qualifications and expertise mentioned in the job specification. Be sure to check the job listing for detailed requirements.
To improve your chances of getting hired as a Data Architect - Python/SQL, consider enhancing your skills and checking your CV/Résumé score with our free Resume Scoring Tool. The built-in tool returns a matching score for each job once your CV/Résumé is uploaded, helping you align it with the job requirements and identify skills to strengthen.
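For readers curious what a matching score can look like under the hood, here is a generic, illustrative sketch that scores a résumé against a job description using TF-IDF and cosine similarity. This is an assumed simplification for explanation only; the listing describes Expertini's tool as a semantic matching algorithm and does not disclose its actual implementation.

```python
# Illustrative resume-vs-job matching score (NOT Expertini's actual algorithm).
# Assumptions: scikit-learn is installed; inputs are plain-text strings.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def match_score(resume_text: str, job_text: str) -> float:
    """Return a 0-100 similarity score between a resume and a job posting."""
    vectors = TfidfVectorizer(stop_words="english").fit_transform(
        [resume_text, job_text]
    )
    return float(cosine_similarity(vectors[0], vectors[1])[0, 0]) * 100

# Hypothetical usage with shortened placeholder texts.
print(round(match_score(
    "Data architect with Python, SQL, Spark, Airflow and AWS experience",
    "Data Architect - Python/SQL: Spark, Kafka, Airflow, cloud platforms",
), 1))
```

A production matcher would typically use richer semantic embeddings and structured skill extraction rather than raw term frequencies, but the idea of comparing your CV/Résumé text against the job text is the same.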
Here are some tips to help you prepare for and ace your job interview:
Before the Interview: To prepare for your Data Architect - Python/SQL interview at Careerfit.ai, research the company, understand the job requirements, and practice common interview questions.
Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your relevant experience, including your approach to meeting targets as a team player. Additionally, review Careerfit.ai's products or services and be ready to discuss how you can contribute to their success.
By following these tips, you can increase your chances of making a positive impression and landing the job!
Setting up job alerts for Data Architect Python/SQL is easy with India Jobs Expertini. Simply visit our job alerts page here, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!