Our client is seeking a Hadoop Specialist to join their organization. This is an exciting opportunity to work with, and learn, multiple technologies in the Big Data space at one of the largest leading financial institutions.

Mandatory Skill(s)

  • Degree in Information Technology, Computer Engineering, Computer Science or Information Systems;
  • Minimum 5 years of database or system experience with at least 2 years in Hadoop;
  • Knowledge of analytics, analytics tools and data architecture (HDFS, Hive, Impala, Oozie, YARN, Sqoop, ZooKeeper, Flume, Spark, NiFi, Kafka);
  • Proficient in Linux, shell and other scripting (ksh, bash, Python);
  • Experience with the design, management and implementation of Backup, Disaster Recovery and/or High Availability solutions;
  • Experience in systems and network troubleshooting (in local or virtual infrastructure framework);
  • Experience in designing and deploying Hadoop clusters (Apache Foundation, Cloudera, MapR, Hortonworks).

Desirable Skill(s)

  • Experience working in the financial services domain;
  • Experience with virtualization or cloud platforms (VMware, AWS, Azure).


Responsibilities

  • Participate in the planning, monitoring and management of the Hadoop clusters;
  • Conduct performance assessments, troubleshooting and remediation planning for the Hadoop infrastructure and applications;
  • Ensure that the HDFS is working optimally at all times;
  • Perform scripting for automating the deployment and management of the Hadoop clusters;
  • Regulate administration rights based on users' job profiles;
  • Provide architectural solution design and recommendations to support Data and Analytics programs;
  • Review the security aspect of the Hadoop solutions ensuring adherence to security principles and standards;
  • Partner with vendors for evaluation on various tools and solutions.