
OneMagnify - Databricks Engineer - Artificial Intelligence (Chennai)




Job description

Administers, maintains, develops, and implements policies and procedures for ensuring the security and integrity of the company's databases. Implements data models and database designs, data access, and table maintenance; resolves database performance issues, database capacity issues, replication, and other distributed data issues.

ESSENTIAL DUTIES AND RESPONSIBILITIES:

- Programming & Query Languages: Proficient in Python and SQL for data analysis, modeling, and database management.
- Machine Learning & AI Models: Experience with classical ML (e.g., regression, decision trees, SVM), deep learning (CNNs, RNNs, Transformers), and advanced AI (GNNs, RL, GANs, and LLMs such as GPT-4, Claude, and LLaMA).
- Natural Language & Vision AI: Skilled in NLP (NER, sentiment analysis, summarization) and computer vision (YOLO, ResNet, segmentation, OCR).
- Frameworks & Libraries: Hands-on with TensorFlow, PyTorch, Scikit-learn, Keras, Hugging Face Transformers, and Spark MLlib.
- Databricks Ecosystem: Proficient in Databricks Notebooks, MLflow, AutoML, Delta Lake, Unity Catalog, Feature Store, Model Serving, and Lakehouse AI (see the training-and-tracking sketch after this description).
- Data Engineering & Big Data: Experience with Apache Spark, Hadoop, Kafka, Airflow, dbt, and building scalable ETL pipelines.
- Cloud Platforms: Deploy and manage solutions on AWS (S3, SageMaker), Azure (ML, Synapse), and GCP (BigQuery, Vertex AI).
- Statistical Analysis & Experimentation: Apply A/B testing, hypothesis testing, Bayesian methods, and time series forecasting (ARIMA, Prophet, LSTM); see the A/B-testing sketch after this description.
- Model Deployment & DevOps: Deploy models using Flask, FastAPI, Docker, Kubernetes, and CI/CD pipelines, and integrate with APIs and Git for version control (see the serving sketch after this description).

- To perform this job successfully, an individual must be able to perform each essential duty satisfactorily.
- The requirements listed below are representative of the knowledge, skill, and/or ability required.
- Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

EDUCATION AND/OR EXPERIENCE:

- Bachelor's Degree (B.) from a four-year college or university.
- In-depth understanding of all aspects of database technology, with a minimum of five years of experience in database administration or equivalent job experience.
- Experience with UNIX, Linux, and Microsoft Windows operating systems.

(ref:hirist.tech)
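
For context on the Databricks ecosystem item above, here is a minimal sketch of training a model against a Delta table and tracking the run with MLflow. It assumes a Databricks-style environment where Spark, Delta, and MLflow are available; the table name (feature_store.churn_features), the label column, and the model choice are hypothetical placeholders, not details taken from this posting.

    # Minimal sketch: read a Delta table with PySpark, train a scikit-learn
    # model, and log metrics and the model with MLflow.
    import mlflow
    import mlflow.sklearn
    from pyspark.sql import SparkSession
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    spark = SparkSession.builder.appName("churn-training").getOrCreate()

    # Hypothetical curated feature table registered in the metastore.
    df = spark.read.table("feature_store.churn_features").toPandas()

    X = df.drop(columns=["label"])
    y = df["label"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    with mlflow.start_run():
        model = LogisticRegression(max_iter=1000)
        model.fit(X_train, y_train)
        accuracy = model.score(X_test, y_test)
        mlflow.log_metric("accuracy", accuracy)   # track evaluation metric
        mlflow.sklearn.log_model(model, "model")  # store the fitted model artifact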
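
For the statistical analysis and experimentation item, this is a minimal sketch of a two-proportion z-test for an A/B experiment using statsmodels; the conversion counts and sample sizes are made-up illustrative numbers, not data from the role or company.

    # Minimal sketch: frequentist A/B test on conversion rates.
    from statsmodels.stats.proportion import proportions_ztest

    # Conversions and sample sizes for variants A and B (hypothetical values).
    conversions = [420, 480]
    samples = [10000, 10000]

    z_stat, p_value = proportions_ztest(count=conversions, nobs=samples)
    print(f"z = {z_stat:.3f}, p = {p_value:.4f}")

    # At a conventional 5% significance level, reject the null hypothesis of
    # equal conversion rates when the p-value is below 0.05.
    if p_value < 0.05:
        print("Statistically significant difference between variants.")
    else:
        print("No statistically significant difference detected.")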
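
For the model deployment item, this is a minimal sketch of serving a trained model behind a REST endpoint with FastAPI. The pickle path, feature names, and module name are hypothetical; in practice the model could equally come from an MLflow registry or Databricks Model Serving.

    # Minimal sketch: expose a trained model via a /predict endpoint.
    import pickle

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    # Load a previously trained model from disk (path is illustrative).
    with open("models/churn_model.pkl", "rb") as f:
        model = pickle.load(f)

    class Features(BaseModel):
        tenure_months: float
        monthly_charges: float
        total_charges: float

    @app.post("/predict")
    def predict(features: Features):
        row = [[features.tenure_months, features.monthly_charges, features.total_charges]]
        prediction = model.predict(row)[0]
        return {"prediction": int(prediction)}

    # Run locally with: uvicorn serve:app --reload
    # (assuming this file is saved as serve.py)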


Required Skill Profession: Computer Occupations





