Our client is seeking a Hadoop Specialist to join their exciting organization. This is an interesting opportunity to work with and learn multiple technologies in the Big Data space at one of the large, leading financial institutions.
- Degree in Information Technology, Computer Engineering, Computer Science or Information Systems;
- Minimum 5 years of database or systems experience, with at least 2 years in Hadoop;
- Knowledge of analytics, analytics tools and data architecture (HDFS, Hive, Impala, Oozie, YARN, Sqoop, ZooKeeper, Flume, Spark, NiFi, Kafka);
- Proficient in Linux, shell and other scripting (ksh, bash, Python);
- Experience with the design, management and implementation of backup, disaster recovery and/or high-availability solutions;
- Experience in systems and network troubleshooting (on local or virtual infrastructure);
- Experience in designing and deploying Hadoop clusters (Apache Foundation, Cloudera, MapR, Hortonworks);
- Experience working in the financial services domain;
- Experience with virtualization or cloud platforms (VMware, AWS, Azure).
- Participate in the planning, monitoring and management of the Hadoop clusters;
- Conduct performance assessment, troubleshooting and remediation planning for the Hadoop infrastructure and applications;
- Ensure that HDFS is working optimally at all times;
- Write scripts to automate the deployment and management of the Hadoop clusters;
- Regulate administration rights according to users' job profiles;
- Provide architectural solution design and recommendations to support Data and Analytics programs;
- Review the security aspects of Hadoop solutions, ensuring adherence to security principles and standards;
- Partner with vendors to evaluate various tools and solutions.