
Data Solutions Architect – Bristlecone, Vapi, India

Data Solutions Architect



Job Description

About the Company


Bristlecone is seeking an accomplished Data Solutions Architect with deep experience in designing, implementing, and optimizing modern cloud data platforms.

The ideal candidate will expertly leverage Snowflake, Databricks, DBT, Airflow, Python, and PySpark, and will have hands-on exposure to advanced AI/ML and Generative AI (GenAI) solutions in real-world enterprise environments.

You will partner with business and technology leaders to deliver scalable, innovative data products, architectures, and analytics capabilities.



About the Role



We are looking for a Data Solutions Architect who will be responsible for designing and implementing scalable data platforms and solutions.



Responsibilities



  • Architect and implement scalable data platforms, pipelines, and solutions using Snowflake, Databricks, DBT, Airflow, Python, and PySpark (an illustrative orchestration sketch follows this list)
  • Lead cloud data migration projects, designing robust, secure, and reusable analytics architectures
  • Develop and optimize ELT/ETL workflows for batch and real-time processing
  • Design dimensional models, data lakes, data marts, and advanced analytics infrastructures
  • Oversee and mentor data engineering teams in coding standards, CI/CD, and best practices
  • Integrate AI/ML models and GenAI capabilities into data products and workflows (e.g., predictive analytics, intelligent automation)
  • Collaborate with business stakeholders to define requirements, translating needs into technical architectures
  • Manage platform cost optimization, data quality/governance, and performance tuning
  • Evaluate and recommend emerging tech, tools, and architectures to drive innovation
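
For illustration only, the sketch below shows the kind of Airflow-orchestrated ELT workflow these responsibilities describe. The DAG name, schedule, commands, and project paths are hypothetical placeholders and assume an Airflow 2.x environment with spark-submit and dbt available to the workers; it is not part of the role definition.

```python
# Illustrative sketch only: a minimal Airflow 2.x DAG chaining a PySpark
# ingest job with dbt transformations and tests. All names, paths, and
# commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt_pipeline",              # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["elt", "snowflake", "dbt"],
) as dag:
    # Batch-ingest raw source data into the lakehouse with PySpark.
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="spark-submit jobs/ingest_raw_data.py",   # hypothetical script
    )

    # Model and transform the warehouse layer with dbt.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir ./analytics",      # hypothetical project dir
    )

    # Run dbt tests so data-quality checks gate downstream consumers.
    quality_check = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir ./analytics",
    )

    ingest >> transform >> quality_check
```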


Qualifications



  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field
  • 7+ years of professional experience in data architecture, engineering, or analytics roles
  • Advanced hands-on experience with Snowflake (data warehousing, modeling, optimization)
  • Deep expertise in Databricks (Spark, ML workflows, lakehouse architectures) and PySpark (see the illustrative PySpark sketch after this list)
  • Proficient with DBT for data transformation and modeling
  • Expert in building robust workflows with Apache Airflow
  • Strong programming skills in Python (including data and ML libraries)
  • Proven track record implementing AI/ML solutions with at least basic exposure to GenAI (LLMs, prompt engineering, AI API integration)
  • Cloud platform experience (Azure, AWS, or GCP)
  • Excellent problem-solving, communication, and stakeholder management skills
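
As a purely illustrative sketch of the PySpark skills listed above (the paths, column names, and aggregation below are hypothetical and not taken from the posting):

```python
# Illustrative sketch only: a small PySpark batch transform of the kind
# implied by the Databricks/PySpark qualifications. Paths, columns, and
# table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_summary").getOrCreate()

# Read a raw orders table from the data lake (hypothetical location).
orders = spark.read.parquet("/lake/raw/orders")

# Aggregate into a simple daily summary suitable for a data mart.
daily_summary = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Persist the result partitioned by date for downstream analytics.
daily_summary.write.mode("overwrite").partitionBy("order_date").parquet(
    "/lake/marts/orders_daily_summary"
)
```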


Required Skills



  • Experience with Terraform, CI/CD for data engineering
  • Familiarity with data governance, privacy, and security best practices
  • Prior experience in client-facing solution architect or enterprise delivery roles
  • Certifications in Snowflake, Databricks, Python, or cloud platforms
  • Published work or demonstrated portfolio of data/AI projects
  • Understanding of modern API integration and real-time analytics (a brief AI API integration sketch follows this list)
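
For illustration of what AI API integration can look like inside a data workflow, the snippet below calls a hosted LLM through the OpenAI Python SDK (v1.x). The model name, prompts, and use case are hypothetical, and any comparable provider or SDK could be substituted.

```python
# Illustrative sketch only: calling a hosted LLM from a data workflow via the
# OpenAI Python SDK (v1.x). Model name, prompts, and use case are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system", "content": "You summarise data-quality incidents for engineers."},
        {"role": "user", "content": "Summarise: 3% of orders failed schema validation on 2024-05-01."},
    ],
)

print(response.choices[0].message.content)
```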



Equal Opportunity Statement



We are committed to diversity and inclusivity.




Required Skill Profession

Computer Occupations


