Job Title: Data Integration Engineer
Location: (Add Location or mention “Remote/Hybrid”)
Company: LumenData
Role Overview
We are seeking a Data Integration Engineer with expertise in building and orchestrating data pipelines using Apache Airflow to integrate data from diverse sources into Snowflake.
The ideal candidate will have strong experience with JDBC and API-based integrations (REST/JSON), hands-on proficiency with Postman, and solid skills in SQL encryption/decryption, Python development, and pipeline monitoring.
Key Responsibilities
- Design, develop, and maintain Airflow DAGs to orchestrate end-to-end data workflows (a minimal DAG sketch follows this list).
- Integrate structured and unstructured data from multiple systems into Snowflake using JDBC connectors, APIs, and flat-file ingestion (see the REST extraction sketch below).
- Use Postman and other tools to test, validate, and automate API integrations.
- Implement SQL encryption/decryption techniques to protect sensitive datasets.
- Perform data quality checks, including row-level validation, hash-based reconciliation, and exception handling (see the reconciliation sketch below).
- Develop transformation logic using Python and SQL, ensuring performance, scalability, and maintainability.
- Implement detailed logging, monitoring, and alerting to ensure pipeline reliability and compliance.
- Collaborate with stakeholders to understand requirements and deliver scalable, production-ready solutions.
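For illustration, here is a minimal sketch of the kind of DAG this role owns, assuming Airflow 2.4+. The `dag_id`, task names, and stubbed callables are hypothetical placeholders, not LumenData systems.

```python
# Hypothetical DAG illustrating the extract -> load -> validate flow described above.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_api(**context):
    """Pull records from a source REST endpoint (stubbed here)."""
    ...


def load_to_snowflake(**context):
    """Write extracted records into a Snowflake staging table (stubbed here)."""
    ...


def run_quality_checks(**context):
    """Row counts, hash reconciliation, exception handling (stubbed here)."""
    ...


with DAG(
    dag_id="example_source_to_snowflake",  # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_from_api", python_callable=extract_from_api)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    validate = PythonOperator(task_id="run_quality_checks", python_callable=run_quality_checks)

    # Linear dependency chain: extract, then load, then validate.
    extract >> load >> validate
```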
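On the API integration side, a typical pattern is a paginated REST pull that is first validated in Postman and then automated in Python. The endpoint URL, token, and pagination scheme below are assumptions for illustration only.

```python
# Hypothetical REST extraction with page-number pagination and basic error handling.
import requests


def fetch_all_pages(base_url: str, token: str) -> list[dict]:
    """Follow a page-number pagination scheme until the API returns no rows."""
    rows, page = [], 1
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"  # placeholder auth scheme
    while True:
        resp = session.get(base_url, params={"page": page}, timeout=30)
        resp.raise_for_status()  # surface HTTP errors to the caller / DAG retry logic
        batch = resp.json()
        if not batch:
            return rows
        rows.extend(batch)
        page += 1
```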
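And for hash-based reconciliation, one common approach is comparing stable per-row digests between a source extract and the rows landed in Snowflake. This is a minimal sketch; the field names and digest scheme are illustrative.

```python
# Hypothetical hash-based reconciliation between source and target row sets.
import hashlib
from typing import Iterable


def row_digest(row: dict, key_order: list[str]) -> str:
    """Build a stable SHA-256 digest from a row's values in a fixed column order."""
    canonical = "|".join(str(row.get(col, "")) for col in key_order)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def reconcile(source_rows: Iterable[dict], target_rows: Iterable[dict], key_order: list[str]):
    """Return digests present on one side but not the other (exceptions to investigate)."""
    source_hashes = {row_digest(r, key_order) for r in source_rows}
    target_hashes = {row_digest(r, key_order) for r in target_rows}
    return source_hashes - target_hashes, target_hashes - source_hashes


# Example: a row dropped in flight surfaces as an exception rather than silent drift.
missing_in_target, unexpected_in_target = reconcile(
    [{"id": 1, "amt": "9.99"}, {"id": 2, "amt": "5.00"}],
    [{"id": 1, "amt": "9.99"}],
    key_order=["id", "amt"],
)
assert len(missing_in_target) == 1 and not unexpected_in_target
```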
Required Skills & Experience
- Strong proficiency in Apache Airflow for workflow orchestration.
- Hands-on experience with Snowflake as a data warehouse.
- Proven ability to integrate data via JDBC drivers and REST APIs, with endpoints tested and validated in Postman.
- Advanced knowledge of SQL, including encryption/decryption techniques (an illustrative sketch follows this list).
- Strong programming skills in Python for ETL/ELT development.
- Experience with logging, monitoring, and data observability practices.
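As one illustration of SQL-level encryption, the sketch below drives Snowflake's built-in passphrase-based ENCRYPT/DECRYPT functions from Python. The account, credentials, table, and column names are placeholders, and a real pipeline would source the passphrase from a secrets manager, never from code.

```python
# Hedged sketch: column-level protection via Snowflake ENCRYPT/DECRYPT, run from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",    # placeholder
    user="YOUR_USER",          # placeholder
    password="YOUR_PASSWORD",  # placeholder
    warehouse="YOUR_WH",       # placeholder
)

try:
    cur = conn.cursor()
    # Encrypt a sensitive column on write; ENCRYPT returns BINARY.
    cur.execute(
        "UPDATE customers SET ssn_enc = ENCRYPT(ssn_plain, %s)",  # placeholder table/columns
        ("my-passphrase",),  # placeholder; fetch from a secrets manager in practice
    )
    # Decrypt on read, casting the BINARY result back to text.
    cur.execute(
        "SELECT TO_VARCHAR(DECRYPT(ssn_enc, %s), 'utf-8') FROM customers",
        ("my-passphrase",),
    )
    for (ssn,) in cur:
        ...  # handle decrypted values under the appropriate access controls
finally:
    conn.close()
```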