The Hadoop Engineer will work with a team of technology and business data specialists to execute the technology functions required to establish data environments, develop data maps, extract and transform data, and analyze and reconcile data errors and anomalies. As part of PTi's Delivering Services organization, you will work with project managers to manage PTi client Hadoop clusters and environments.
KEY JOB FUNCTIONS
- Participate with a team of technical staff and business managers or practitioners in the business unit to determine systems requirements and functionalities needed in large/complex development projects.
- Assess and develop high-level design requirements for the project and communicate them in writing or in meetings with the development team. Assess detailed specifications against design requirements.
- Review coding done to advance application upgrades, extensions, or other development. Analyze applications for data integrity issues.
- Develop test protocols or plans for testing revised applications, and review test results.
- Serve as project lead or lead technical staff in the course of an application development project.
- 6+ years of related experience
SPECIALIZED KNOWLEDGE & SKILLS
- 6+ years of hands-on experience with, and deep knowledge of, Java application development
- 4+ years of hands-on experience with Linux, Java/J2EE, SOA, and Oracle platforms
- Experience processing large amounts of structured and unstructured data. MapReduce experience is a plus.
- 1 to 2 years of experience building and coding applications using Hadoop components: HDFS, HBase, Hive, Sqoop, Kafka, Storm, etc.
- 1 to 2 years of experience coding Java MapReduce, Python, Pig, Hadoop Streaming, and HiveQL
- 1 to 2 years of experience implementing relational data models
- Minimum of 1 year of experience developing REST web services
- Experience with Agile/Scrum development methodologies
- Experience leading and managing large-scale, complex applications with high performance needs
- Excellent written and verbal communication skills