MapReduce Jobs

6 jobs were found based on your criteria

Fixed-Price - Intermediate ($$) - Est. Budget: $2,400 - Posted
Requirements include:
  • Data transformation involving encryption and decryption through RDBMS – Hadoop – RDBMS
  • MapReduce programming experience for:
    a. sorting and consolidating data from multiple sources and formats into a normalized structure
    b. generating reports using complex calculations and aggregations at frequent time intervals
Skills: MapReduce Big Data Hadoop
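The requirements above describe the classic map/reduce consolidation pattern: map each source's format to a common (key, value) shape, then reduce by key to aggregate. A minimal sketch in plain Python as a stand-in (not actual Hadoop code; the source names and record formats here are hypothetical):

```python
# Toy map/reduce consolidation: normalize records from multiple
# sources/formats, then aggregate per key. Pure-Python illustration only.
from collections import defaultdict

def map_record(source, record):
    # Map phase: normalize each source's format to (key, value).
    if source == "csv":
        user_id, amount = record.split(",")
    elif source == "json":
        user_id, amount = record["user_id"], record["amount"]
    else:
        raise ValueError(f"unknown source: {source}")
    return str(user_id), float(amount)

def reduce_records(pairs):
    # Reduce phase: group by key and aggregate the values.
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

pairs = [
    map_record("csv", "alice,10.5"),
    map_record("json", {"user_id": "alice", "amount": 4.5}),
    map_record("csv", "bob,3.0"),
]
print(reduce_records(pairs))  # → {'alice': 15.0, 'bob': 3.0}
```

In real Hadoop the grouping between map and reduce is done by the framework's shuffle/sort; this sketch folds that into `reduce_records` for brevity.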
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
I am looking for someone who is Cloudera Admin certified and who can guide me along the exam path: how to start preparing and what to focus on. It will be an hourly job. We will use Skype as the primary mode of communication.
Skills: MapReduce Cloudera Hadoop
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Additionally, I would like to run the code both with MapReduce and PCA and without them, to see the difference. Please read the attached file, and if you have any questions, please get back to me.
Skills: MapReduce PCAP Python
Fixed-Price - Intermediate ($$) - Est. Budget: $15 - Posted
I have a Spark job and a MapReduce job. I also have different types of data, e.g. Avro, JSON, etc., and they are compressed with a few different codecs. ... I need your advice on how to carry out this test on both jobs. I know the MapReduce framework provides a lot of counters, but I'm not sure how to interpret them, so I need your help on that as well.
Skills: MapReduce Apache Spark Hadoop Performance testing
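For context, the "counters" this posting mentions are named tallies that Hadoop increments during task execution and aggregates across the whole job (records read, bytes written, spilled records, and so on). A toy illustration of the idea in plain Python; the counter names and the `process` function are hypothetical, not the actual Hadoop API:

```python
# Toy illustration of job counters: named tallies incremented while
# processing records, collected for inspection at the end.
from collections import Counter

def process(records):
    counters = Counter()
    output = []
    for rec in records:
        counters["RECORDS_READ"] += 1
        if rec is None:
            counters["BAD_RECORDS"] += 1  # skip malformed input
            continue
        output.append(rec.upper())
        counters["RECORDS_WRITTEN"] += 1
    return output, counters

out, c = process(["avro", None, "json"])
print(out)               # → ['AVRO', 'JSON']
print(c["BAD_RECORDS"])  # → 1
```

Comparing such tallies between two runs (e.g. per-format or per-codec) is one way to turn raw counters into a performance comparison.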
Fixed-Price - Expert ($$$) - Est. Budget: $100 to $120 - Posted
In the role of Technology Lead, you will interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation, application architecture definition, and design. You will play an important role in creating the high-level design artifacts. You will also deliver high-quality code for a module, lead validation for all types of testing, and support activities related to implementation, transition, and warranty. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.

Qualifications

Basic:
  • Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
  • At least 4 years of experience with Information Technology.

Preferred:
  • At least 4 years of solid experience in the software industry, with strong experience in big data technologies.
  • Strong expertise in / understanding of big data technologies, with a strong focus on Hortonworks.
  • Strong knowledge of Kafka, Flume, Sqoop, Hive, Pig, MapReduce, Spark, and Storm, with hands-on experience in all or most of these.
  • Solid technology expertise in J2EE and related technologies.
  • Strong Unix scripting skills.
  • Very comfortable with Agile methodologies.
  • Good knowledge of data warehouse and BI technologies; exposure to ETL and reporting tools and appliances like Teradata.
  • Excellent communication skills.
  • Expertise in SQL databases (e.g. MySQL or Oracle) and a strong ability to write SQL queries.
  • Proven ability to lead a team of engineers.
  • Ability to work in an onsite/offshore model.
Skills: MapReduce Apache Hive Apache Kafka Apache Spark
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Hi, I have a client base in various regions looking for big data enthusiasts experienced in components such as Hadoop, Spark, Elasticsearch, Logstash, Kibana, Pig, Hive, etc. Kindly send your proposals and a convenient time to discuss further, along with your Skype ID. Also write "Big Data enthu" at the top and bottom of your proposal so that I can tell which proposals are genuine.
Skills: MapReduce Amazon EC2 Apache Flume Apache Hive