Role Responsibilities:
- Design and develop scalable data pipelines using MS Fabric to support business intelligence and analytics needs.
- Build and optimize data models that facilitate effective data storage and retrieval.
- Manage ETL (Extract, Transform, Load) processes to ensure efficient, reliable data movement.
- Collaborate with cross-functional teams to gather and define comprehensive data requirements.
- Ensure data quality, integrity, and consistency across all data processes.
- Implement and enforce best practices for data management, storage, and processing.
- Conduct performance tuning for data storage systems and query execution to enhance efficiency.
- Create and maintain detailed documentation for data architecture, workflows, and processes.
- Troubleshoot data-related issues and implement timely and effective solutions.
- Monitor and optimize cloud-based data solutions for scalability and resource efficiency.
- Research and evaluate emerging data engineering tools and technologies for potential adoption in projects.
- Assist in designing and enforcing data governance frameworks and policies.
- Provide technical guidance and mentorship to junior data engineers.
- Participate in code reviews to ensure adherence to coding standards and quality.
- Stay updated on industry trends and best practices in data engineering and analytics.
Qualifications:
- Minimum of 8 years of experience in data engineering or related roles.
- Strong expertise and hands-on experience with MS Fabric and its ecosystem.
- Proficiency in SQL and experience working with relational database management systems.
- Solid experience in data warehousing solutions and data modeling techniques.
- Hands-on experience with ETL tools and data integration processes.
- Familiarity with major cloud computing platforms such as Azure, AWS, and GCP.
- Working knowledge of Python or other programming languages commonly used in data engineering.
- Proven ability to communicate complex technical concepts clearly to non-technical stakeholders.
- Experience implementing data quality measures and data governance practices.
- Excellent problem-solving skills and a keen attention to detail.
- Ability to work independently in remote and distributed team environments.
- Experience with data visualization tools is advantageous.
- Strong analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or a related discipline.
- Familiarity with Agile methodologies and project management practices.
Skills Required:
Data Modeling, Azure Data Factory, SQL Server