Job Description
<p><p><b>About the job:</b><br/><br/><b>About Straive:</b><br/><br/>Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions across multiple domains.<br/><br/>Data Analytics & AI Solutions, Data AI-Powered Operations, and Education & Learning form the core pillars of the company's long-term vision.<br/><br/>The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics.<br/><br/>Straive continues to be the leading content services provider to research and education publishers.<br/><br/>Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success.<br/><br/>We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows.<br/><br/>As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building data analytics and AI enterprise capabilities.<br/><br/>With a client base spanning 30 countries worldwide, Straive's multi-geographical resource pool is strategically located in seven countries: India, the Philippines, the USA, Nicaragua, Vietnam, the United Kingdom, and the company headquarters in Singapore.<br/><br/><b>Role Overview:</b><br/><br/>We are seeking a Data Platform Operations Engineer to join us in building, automating, and operating our Enterprise Data Platform.<br/><br/>This role is ideal for someone with a unique combination of DataOps/DevOps, Data Engineering, and Database Administration expertise.<br/><br/>As a key member of our Data & Analytics team, you will ensure our data infrastructure is reliable, scalable, secure, and high-performing, enabling data-driven decision-making across the business.<br/><br/><b>Key Responsibilities:</b><br/><br/>- Snowflake Administration: Own the administration, monitoring, configuration, and optimization of our Snowflake data warehouse. Implement and automate user/role management, resource monitoring, scaling strategies, and security policies.</p><p><br/>- Fivetran Management: Configure, monitor, and troubleshoot Fivetran pipelines for seamless ingestion from SaaS applications, ERPs, and operational databases.
Resolve connector failures and optimize sync performance and cost.<br/><br/>- DataOps/Automation: Build and improve CI/CD workflows using Git and other automation tools for data pipeline deployment, testing, and monitoring.<br/><br/>- Infrastructure as Code (IaC): Implement and maintain infrastructure using tools like Terraform and Titan to ensure consistent, repeatable, and auditable environments.<br/><br/>- Platform Monitoring & Reliability: Implement automated checks and alerting across Snowflake, Fivetran, and dbt processes to ensure platform uptime, data freshness, and SLA compliance.
Proactively identify and resolve platform issues and performance bottlenecks.<br/><br/>- Database Performance and Cost Optimization: Monitor and optimize database usage (queries, compute, storage) for speed and cost-effectiveness.
Partner with data engineers and analysts to optimize SQL and refine warehouse utilization.<br/><br/>- Security & Compliance: Enforce security best practices across the data platform (access controls, encryption, data masking).
Support audits and compliance requirements (e.g., SOC 2).<br/><br/>- Data Quality Operations: Build and automate data health and quality checks (using dbt tests and/or custom monitors).
Rapidly triage and resolve data pipeline incidents with root cause analyses.<br/><br/>- Documentation & Process: Ensure all operational procedures (runbooks, escalation paths, knowledge base) and infrastructure documentation are accurate, up to date, and easily accessible.<br/><br/>- Collaboration: Partner with Data Architects, Data Engineers, and DevOps Engineers to understand data flow requirements, troubleshoot issues, and continuously enhance platform capabilities.<br/><br/><b>Required Experience & Skills:</b><br/><br/>- 5+ years in a DataOps, DevOps, Data Engineering, or Database Administration role in cloud data environments.<br/><br/>- Hands-on experience administering Snowflake, including security, performance tuning, cost management, and automation.<br/><br/>- Strong expertise with Fivetran setup, management, and incident troubleshooting.<br/><br/>- Proficiency in dbt for ELT development, testing, and orchestration.<br/><br/>- Advanced SQL skills for troubleshooting, diagnostics, and optimization.<br/><br/>- Proficiency with version control (Git) and experience designing and deploying data pipelines in a collaborative environment.<br/><br/>- Scripting skills (Python, Bash, etc.) for workflow automation, data operations tasks, and deployment pipelines.<br/><br/>- Experience with cloud platforms (AWS/Azure); knowledge of core services such as IAM, data storage, and data transfer.<br/><br/>- Strong understanding of platform reliability, monitoring, and observability (alerting, dashboarding, log analysis).<br/><br/>- Comfortable with Infrastructure as Code concepts and tools (Terraform).<br/><br/>- Experience working with business and analytics teams to translate operational support needs into scalable technical solutions.<br/><br/><b>Technical Stack:</b><br/><br/>- Required: Snowflake, Terraform, GitHub Actions, AWS, dbt, Fivetran<br/>- Preferred: Titan, Datacoves</p><br/></p> (ref:hirist.tech)
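As one illustration of the platform monitoring and data freshness responsibilities above, a simple freshness check can be sketched in plain Python. This is a minimal sketch, not the team's actual tooling: the table names and SLA thresholds are hypothetical, and in practice the load timestamps would be queried from Snowflake or Fivetran metadata rather than passed in as a dict.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-table freshness SLAs (maximum allowed staleness).
FRESHNESS_SLAS = {
    "orders": timedelta(hours=1),
    "customers": timedelta(hours=24),
}

def check_freshness(last_loaded, now=None):
    """Return names of tables whose latest load breaches its SLA.

    last_loaded maps table name -> timezone-aware datetime of the last
    successful load; a table with no recorded load is treated as stale.
    """
    now = now or datetime.now(timezone.utc)
    stale = []
    for table, sla in FRESHNESS_SLAS.items():
        loaded_at = last_loaded.get(table)
        if loaded_at is None or now - loaded_at > sla:
            stale.append(table)
    return stale
```

In a real deployment, a check like this would run on a schedule (e.g., from a CI/CD or orchestration job) and feed an alerting channel when the returned list is non-empty.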