Big Data/Hadoop Administrator



Job description

At ClearTrail, work is more than ‘just a job’.

Our calling is to develop solutions that empower those dedicated to keeping their people, places and communities safe.

For over 23 years, law enforcement and federal agencies across the globe have trusted ClearTrail as their committed partner in safeguarding nations and enriching lives.

We are envisioning the future of intelligence gathering by developing lawful interception and communication analytics solutions, based on artificial intelligence and machine learning, that solve the world’s most challenging problems.
Role – Big Data/Hadoop Administrator
Location – Indore, MP
Experience Required – 3 to 5 years
What is your role?
You will work in a multi-functional role combining expertise in system and Hadoop administration.

You will work in a team that frequently interacts with customers on technical support for the deployed system.

You will be deputed to customer premises to assist customers with issues related to system and Hadoop administration.

You will interact with the QA and Engineering teams to coordinate issue resolution within the SLA promised to the customer.
What will you do?
- Deploying and administering the Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems.
- Installing the Linux operating system and configuring networking.
- Writing Unix shell and Ansible scripts for automation.
- Maintaining core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, and HBase.
- Managing the day-to-day running of Hadoop clusters using Ambari, Cloudera Manager, or other monitoring tools, ensuring that the cluster is up and running at all times.
- Maintaining HBase clusters and handling capacity planning.
- Maintaining the Solr cluster and handling capacity planning.
- Working closely with the database, network, and application teams to ensure that all big data applications are highly available and performing as expected.
- Managing the KVM virtualization environment.
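As an illustration of the shell-scripting and cluster-monitoring duties listed above, here is a minimal sketch that parses `hdfs dfsadmin -report`-style output to flag dead DataNodes. The canned sample report text and the alerting threshold are assumptions for demonstration; in practice you would pipe in the live output of `hdfs dfsadmin -report`.

```shell
#!/bin/sh
# Illustrative sketch only: count live vs dead DataNodes from the
# summary lines of an `hdfs dfsadmin -report`-style report.
# Sample text stands in for the real command's output (an assumption).
report='Configured Capacity: 120 TB
Live datanodes (3):
Dead datanodes (1):'

# Extract the counts from the "Live datanodes (N):" / "Dead datanodes (N):" lines.
live=$(printf '%s\n' "$report" | sed -n 's/^Live datanodes (\([0-9]*\)).*/\1/p')
dead=$(printf '%s\n' "$report" | sed -n 's/^Dead datanodes (\([0-9]*\)).*/\1/p')

echo "live=$live dead=$dead"
if [ "${dead:-0}" -gt 0 ]; then
  echo "ALERT: $dead dead datanode(s)"
fi
```

A script like this can run from cron and feed an alerting channel; the same parsing pattern extends to other summary fields of the report.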
Must-Have Skills:
- Technical domain: Linux administration, Hadoop infrastructure and administration, Solr, configuration management (e.g., Ansible)
- Experience in Python and shell scripting
- Deploying and administering the Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems
- Knowledge of Hadoop core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, and Spark
- Knowledge of HBase clusters
- Working knowledge of Solr and Elasticsearch
Good-to-Have Skills:
- Experience with networking concepts
- Experience with virtualization technologies such as KVM and OLVM


Required Skill Profession

Computer Occupations


