
Analyst - Data Engineering and Automation

Location: Bengaluru (employer confidential)



Job Description

Key Responsibilities:

  • Translate technical documentation into modules with logic built to handle all scenarios.
  • Develop, configure, or modify complex, integrated business and enterprise infrastructure or application solutions across various computing environments.
  • Work with large, complex, and disparate data sources to implement and maintain financial data pipelines that produce FP&A-ready tables for complex business analysis on the enterprise big data platform (on-premises Hadoop and GCP Cloud), and see those solutions through to successful deployment (a minimal pipeline sketch follows this list).
  • Finance business needs change quickly, so ensure that solutions can be modified swiftly to meet those needs and that applications remain efficient, scalable, and maintainable.
  • Partner with the Enterprise Architecture, Technology Platform, and Enterprise Data Governance teams to use in-house tools when designing solutions, and build data pipelines that adhere to data quality and governance standards.
  • Collaborate with retail finance teams to provide data solutions and to standardize and automate processes, acting as a true partner to stakeholders and business counterparts rather than a task taker.
  • Demonstrate learning agility: actively learn through experimentation when tackling new problems, using both successes and failures as opportunities to learn.
  • Display financial acumen: interpret and apply an understanding of key financial indicators to make better business decisions.
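
To illustrate the kind of pipeline work described above, here is a minimal PySpark sketch that joins disparate sources into an FP&A-ready summary table. All database, table, and column names (general_ledger, store_dimension, fpa_monthly_summary, and so on) are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical example: build an FP&A-ready monthly summary table
    # from two source tables registered in the Hive metastore.
    spark = (
        SparkSession.builder
        .appName("fpa_monthly_summary")
        .enableHiveSupport()
        .getOrCreate()
    )

    ledger = spark.table("finance_raw.general_ledger")        # transactional source
    store_dim = spark.table("finance_raw.store_dimension")    # disparate reference data

    fpa_ready = (
        ledger
        .join(store_dim, on="store_id", how="left")
        .withColumn("fiscal_month", F.date_format("posting_date", "yyyy-MM"))
        .groupBy("fiscal_month", "region", "account_code")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.countDistinct("store_id").alias("store_count"),
        )
    )

    # Persist as a partitioned table that FP&A analysts can query directly.
    (
        fpa_ready.write
        .mode("overwrite")
        .partitionBy("fiscal_month")
        .saveAsTable("finance_curated.fpa_monthly_summary")
    )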

Required Qualifications:

  • Bachelor's or master's degree in Computer Science, CIS, or a related field.
  • Minimum of 2 years of experience developing and implementing solutions using the software development life cycle (SDLC).
  • Ability to understand technical documentation in a platform development environment and to design data pipeline modules and scripts.
  • Experience with Continuous Integration/Continuous Deployment (CI/CD) tools and source code control systems such as Bitbucket.
  • Minimum of 2 years of experience working on data warehouse platforms such as Teradata, Hadoop, or cloud big data components.
  • Strong experience with Python, SQL, scripting, Teradata, MicroStrategy, Oracle, MySQL, DB2, Hadoop (Sqoop, Hive), PySpark, Airflow, etc.
  • Experience with the GCP cloud data platform, specifically data engineering services such as Dataproc, Cloud Composer, and BigQuery (a minimal orchestration sketch follows this list).
  • Willingness to grow professionally.
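
As a rough illustration of the Cloud Composer, Dataproc, and BigQuery experience called for above, the following Airflow DAG sketch chains a Dataproc PySpark job with a BigQuery query. The project, region, cluster, bucket, and table names are hypothetical, and the operators assume the apache-airflow-providers-google package is installed.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    # Hypothetical identifiers; adjust to the actual environment.
    PROJECT_ID = "my-finance-project"
    REGION = "us-central1"
    CLUSTER_NAME = "finance-dataproc"

    with DAG(
        dag_id="fpa_daily_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Run a PySpark transformation on Dataproc (e.g. the pipeline sketched earlier).
        transform = DataprocSubmitJobOperator(
            task_id="run_pyspark_transform",
            project_id=PROJECT_ID,
            region=REGION,
            job={
                "placement": {"cluster_name": CLUSTER_NAME},
                "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/fpa_monthly_summary.py"},
            },
        )

        # Publish a curated summary into BigQuery for FP&A reporting.
        publish = BigQueryInsertJobOperator(
            task_id="publish_fpa_summary",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE finance_curated.fpa_monthly_summary AS "
                        "SELECT fiscal_month, region, SUM(total_amount) AS total_amount "
                        "FROM finance_staging.fpa_monthly_summary_raw "
                        "GROUP BY fiscal_month, region"
                    ),
                    "useLegacySql": False,
                }
            },
        )

        transform >> publish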


Skills Required
Python, SQL, Scripting, MicroStrategy, Data Warehouse


Required Skill Profession

Computer Occupations


