We are looking for a Hadoop expert to help us with Hadoop/big-data work. The consultant can be located anywhere; we are open to working remotely. We expect 10-20 hours a week for three to six months.
Responsibilities
Performance-tune a Cloudera Hadoop 5.8 cluster to run MapReduce jobs efficiently
Add in-memory sharing capability to the Hadoop cluster so that jobs can share common computation libraries
Set up web-service job submission for Hadoop
Improve the interaction between Cloudera and MongoDB
Education
Bachelor's degree in computer science or a related field
Master's degree in computer science or a related field a plus
Required Skills and Qualifications
Hands-on experience with the Spring Framework
Hands-on experience with MongoDB
Hands-on experience developing and performance-tuning MapReduce jobs on a Cloudera Hadoop cluster running on Linux nodes
Hands-on experience installing, configuring, and maintaining Hadoop ecosystem components such as HDFS, YARN/MRv2, Hive, Pig, ZooKeeper, HBase, Oozie, Sqoop, Hue, Flume, and Spark
Hands-on experience installing and configuring multiple Cloudera Hadoop clusters
Hands-on experience administering Hadoop clusters with Cloudera Manager, including deploying clusters, adding/removing services, and adding/removing nodes
Desired Skills and Qualifications
Experience deploying Hadoop on Azure preferred