Job Title: DevOps Engineer
Company: Omnipresent Robot Technologies Pvt. Ltd.
Location: Noida
Type: Full-Time
About: Omnipresent Robot Tech Pvt. Ltd. is a dynamic and innovative startup founded by SNIoE faculty and staff, working in the field of robotics, drones, and space technology.
Our team is driven by a passion for pushing boundaries and creating groundbreaking solutions for real-world challenges.
Our most recent product was a perception and navigation module for the Pragyaan rover on ISRO’s Chandrayaan-3 mission.
Position Overview: We are currently looking for a DevOps Engineer to contribute to our satellite-based defense project.
The selected candidate will play a key role in deployment and infrastructure management, working closely with our experienced team to design and develop the memory architecture and test the ML models that are critical to the success of our satellite module.
This job provides a unique opportunity to work in a start-up culture, where innovation, creativity, and problem-solving skills are highly valued.
Responsibilities:
✔ Define and manage the data architecture framework, standards, and principles, including modeling, metadata, security, reference, and master data.
✔ Create solution frameworks integrating large, optical, or complex data sets.
✔ Lead all data modeling efforts within DataBricks, including the design of data structures and the identification of business transformation logic.
✔ Create logical data models based on existing applications and databases to accommodate Data Fusion.
✔ Make build and deployment automation enhancements and fix bugs.
✔ Transform the logical representation of the model into a physical representation and work with the data engineering team to instantiate and manage the data.
✔ Contribute to innovation and company strategies by exploring, investigating, recommending, benchmarking, and implementing new data-centric technologies for the platform and suggesting data modeling in other projects when needed.
✔ Build the infrastructure and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and/or other data technologies.
✔ Work with data engineering and analytics experts to strive for greater functionality in our data systems and incorporation of industry best practices.
✔ Build CI/CD pipelines using Jenkins/Azure DevOps and troubleshoot DevOps-related issues and bugs.
✔ Design and build the most suitable branching strategies for pipelines.
✔ Manage, support, and troubleshoot existing pipelines.
✔ Provide best practices for tools and technologies as per project requirements.
✔ Bridge the gap between developers and admins to adopt the best-fit tooling.
Qualifications: The suitable candidate will have a B.Tech degree in CSE/IT (other disciplines will also be considered).
The candidate must have relevant projects and working experience in the fields of Data Science and DevOps.
3-5 years of professional experience is required.
Technical Skills:
✔ Detailed knowledge of data warehouse/lakehouse technical architecture concepts, infrastructure components, and ELT processes.
✔ Proficiency in Python, C++, or another scripting language.
✔ Familiarity with APIs (REST/RESTful).
✔ Experience with CI/CD pipeline maintenance and deployment.
✔ Extensive hands-on experience with Jenkins/Azure DevOps/GCP for data engineering and MLOps will strengthen your profile.
✔ Knowledge of and ability to use big data technologies (Hadoop, Spark, Kafka, Presto, etc.).
✔ Experience with AWS.
✔ In-depth understanding of relevant data engineering best practices.
✔ Advanced working knowledge of SQL, including query authoring and experience with relational databases, as well as working familiarity with a variety of databases and technologies.
Interpersonal Skills:
✔ Ability to thrive in a fast-paced environment, responding quickly to oversight and following instructions closely, while anticipating needs and developing efficiencies on repeat tasks.
✔ Excellent problem-solving, critical thinking, and analytics skills with the ability to work in a cross-functional team.
✔ A quick, self-motivated learner who can initiate and drive new projects in anticipation of management and client goals and objectives with little or no supervision.
✔ Self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
✔ Must be able to adapt to changes in direction.