Job description
Key Responsibilities:
Reliability & Monitoring
- Ensure uptime and continuity of the data lake.
- Proactively monitor database health, backups, and data integrations.
- Troubleshoot and resolve issues before they disrupt downstream consumers.
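The backup-monitoring duty above can be sketched as a small freshness check. This is a minimal, hypothetical example (the `max_age_hours` policy and function name are assumptions, not an existing tool); in practice the timestamp would come from the backup system or a monitoring agent.

```python
from datetime import datetime, timedelta, timezone

def backup_is_fresh(last_backup: datetime, max_age_hours: int = 24) -> bool:
    """Return True if the most recent backup falls within the allowed window."""
    age = datetime.now(timezone.utc) - last_backup
    return age <= timedelta(hours=max_age_hours)

# Example: a backup taken 2 hours ago passes a 24-hour freshness policy.
recent = datetime.now(timezone.utc) - timedelta(hours=2)
print(backup_is_fresh(recent))  # True under the 24-hour default
```

A check like this would typically run on a schedule and raise an alert when it returns False.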
Database & Query Optimization
- Optimize PostgreSQL queries, indexing, and table structures for BI dashboards and AI/ML workflows.
- Design and maintain schemas, partitions, and materialized views to improve performance.
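As one concrete flavor of the partition work above, monthly range partitions for a large fact table can be scripted. This is a sketch under assumptions: the parent table (`sales_facts`) and the date-range partitioning scheme are hypothetical, not taken from the actual schema.

```python
def monthly_partition_ddl(parent: str, year: int, month: int) -> str:
    """Build CREATE TABLE DDL for one monthly range partition of `parent`."""
    start = f"{year:04d}-{month:02d}-01"
    # A month's partition ends at the first day of the following month.
    ny, nm = (year + 1, 1) if month == 12 else (year, month + 1)
    end = f"{ny:04d}-{nm:02d}-01"
    name = f"{parent}_{year:04d}_{month:02d}"
    return (
        f"CREATE TABLE IF NOT EXISTS {name} "
        f"PARTITION OF {parent} "
        f"FOR VALUES FROM ('{start}') TO ('{end}');"
    )

ddl = monthly_partition_ddl("sales_facts", 2024, 12)
print(ddl)
```

Generating the DDL as text keeps partition creation reviewable and easy to drive from a scheduler.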
ETL & Integrations
- Maintain and monitor the ETL pipelines that ingest data from Salesforce, Coda, Wind Insights, and Sake into PostgreSQL.
- Ensure accuracy, consistency, and timeliness of ingested data.
- Support integration with ActiveMQ or other messaging systems for real-time workflows.
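The accuracy and timeliness checks in this section could start as a per-record validator like the sketch below. The field names (`id`, `source`, `updated_at`) and the 6-hour staleness threshold are illustrative assumptions, not a description of the actual pipelines.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical minimum schema for an ingested record.
REQUIRED = {"id", "source", "updated_at"}

def validate_row(row: dict, max_lag_hours: int = 6) -> list[str]:
    """Return a list of problems found in one ingested record (empty = OK)."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED - row.keys())]
    ts = row.get("updated_at")
    if isinstance(ts, datetime):
        if datetime.now(timezone.utc) - ts > timedelta(hours=max_lag_hours):
            problems.append("stale record")
    return problems

fresh = {"id": 1, "source": "salesforce",
         "updated_at": datetime.now(timezone.utc)}
print(validate_row(fresh))  # []
```

Running such checks at ingest time turns silent data-quality drift into actionable pipeline alerts.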
Governance & Documentation
- Implement data governance practices: version control, documentation, and access control.
- Build and maintain a data dictionary and ERD for discoverability.
- Create and manage a central repository with cross-language (SQL, Python, R, etc.) query examples and best practices for developer enablement.
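A data dictionary of the kind described above could begin as a script that renders column metadata into a readable listing. The sample rows here are invented for illustration; in a real setup the tuples would be pulled from PostgreSQL's `information_schema.columns` plus a maintained description source.

```python
def render_data_dictionary(columns: list[tuple[str, str, str, str]]) -> str:
    """Render (table, column, type, description) rows as a fixed-width listing."""
    header = f"{'table':<12} {'column':<12} {'type':<10} description"
    lines = [header, "-" * len(header)]
    for table, column, ctype, desc in columns:
        lines.append(f"{table:<12} {column:<12} {ctype:<10} {desc}")
    return "\n".join(lines)

# Illustrative metadata rows; not the actual schema.
sample = [
    ("accounts", "id", "bigint", "Primary key"),
    ("accounts", "sf_id", "text", "Salesforce record ID"),
]
print(render_data_dictionary(sample))
```

Keeping the renderer separate from the metadata source makes it easy to publish the same dictionary to a wiki, a README, or the central query repository.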
Collaboration & Support
- Work cross-functionally with business, analytics, and AI/ML teams to support their data needs.
- Partner with DevOps and Security teams to ensure compliance with SOC2, GDPR, and internal standards.
Must-Have Qualifications
● Bachelor’s degree in Computer Science, Information Systems, or a related field.
● 3+ years of experience in database administration or data engineering.
● Hands-on experience with data lake technologies (AWS S3/Glue/Redshift, Azure Data Lake/Synapse, GCP BigQuery).
● Expertise in PostgreSQL optimization, schema design, and query performance tuning.
● Experience with ActiveMQ or other messaging systems.
● Solid understanding of ETL processes, data pipelines, and SaaS integrations (Salesforce, etc.).
● Familiarity with data governance practices (version control, documentation, access control).
● Strong problem-solving skills and ability to work with business, analytics, and AI/ML teams.
● Experience building documentation, data dictionaries, and developer repositories.
Nice to Have
● Familiarity with BI tools (Tableau, Looker, Power BI).
● Experience with AI/ML feature engineering workflows.
● Background in monitoring frameworks (Prometheus, Grafana, Datadog).