Senior-level Hadoop and big data engineer needed with the following skills:
6-10 years of relevant work experience.
Excellent knowledge of Hadoop architecture, administration, and support; experience with Cloudera Hadoop preferred.
Proficient in MapReduce, Pig, Hive, Python, and shell scripting; experience with Chef is a plus.
Expert understanding of ETL principles and how to apply them within Hadoop.
Ability to learn, apply, and explain new tools and technologies quickly.
Experience managing clusters of 100+ nodes.
Proficient in Linux operating systems.
Define and document the deployment architecture for a Hadoop-based production environment that can scale to petabytes.
Deploy, administer, and manage Hadoop software on large cluster implementations.
Install and configure Hadoop-based monitoring tools.
At least 2-3 years of dedicated development experience with Java MapReduce, Hive, Pig, Sqoop, Flume, HBase, Cassandra, MongoDB, and CouchDB.
At least 2 years of experience working in a VLDB (Very Large Database) environment.
Able to define the code components required to implement a use case, quickly evaluate tools for proofs of concept (POCs), and provide demos to customer teams.
Expert-level knowledge of MapReduce.
Expert in Java/J2EE.