MapReduce Jobs

13 jobs were found based on your criteria

Fixed-Price - Intermediate ($$) - Est. Budget: $2,400 - Posted
Data transformation involving encryption and decryption through RDBMS – Hadoop – RDBMS
5. MapReduce programming experience for:
  a. sorting and consolidating data from multiple sources & formats to a normalized structure
  b. generating reports using complex calculations and aggregations at frequent time intervals
Skills: MapReduce Big Data Hadoop
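The consolidation requirement in the listing above can be pictured as a small MapReduce-style pipeline. This is an illustrative sketch only, not the poster's actual job: the record shapes, the two hypothetical source formats (CSV-like and key=value), and all function names are assumptions. The map phase normalizes both formats to `(key, value)` pairs, a sort stands in for the shuffle, and the reduce phase emits one normalized record per key.

```python
from itertools import groupby

def map_csv(line):
    # Hypothetical source format 1: "id,amount" rows
    rec_id, amount = line.split(",")
    yield rec_id, float(amount)

def map_kv(line):
    # Hypothetical source format 2: "id=amount" rows
    rec_id, amount = line.split("=")
    yield rec_id, float(amount)

def reduce_totals(rec_id, amounts):
    # Consolidate all values for one key into a single normalized record
    return {"id": rec_id, "total": sum(amounts), "count": len(amounts)}

def run_job(csv_lines, kv_lines):
    # Map phase over both heterogeneous inputs
    pairs = []
    for line in csv_lines:
        pairs.extend(map_csv(line))
    for line in kv_lines:
        pairs.extend(map_kv(line))
    # Shuffle/sort phase: bring equal keys together
    pairs.sort(key=lambda kv: kv[0])
    # Reduce phase: one output record per key
    return [
        reduce_totals(k, [v for _, v in group])
        for k, group in groupby(pairs, key=lambda kv: kv[0])
    ]

results = run_job(["a,1.0", "b,2.0"], ["a=3.0"])
```

In a real Hadoop job the sort/group step is performed by the framework between the Mapper and Reducer; here it is made explicit so the three phases are visible.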
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
I am looking for someone who is Cloudera Admin certified and who can guide me along the exam path: how to start preparing and what to focus on. It will be an hourly job. We will use Skype as the primary mode of communication.
Skills: MapReduce Cloudera Hadoop
Hourly - Intermediate ($$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
- Strong ROS/Python and C/C++ programming skills
- Good understanding of private and public cloud design considerations and limitations in the areas of virtualization and global infrastructure, distributed systems, load balancing and networking, massive data storage, Hadoop, MapReduce, and security
- Experience with open source cloud and application platforms including Google App Engine, Amazon EC2, and Microsoft Azure
Skills: Computer vision Machine learning Robotics
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Additionally, I would like to run the code both with MapReduce and PCA and without them, to see the difference. Please read the attached file, and if you have any questions please come back to me.
Skills: MapReduce PCAP Python
Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
(e.g. HDFS, S3, Hive, Pig, Hadoop, and MapReduce) • Regular commits to Git and GitHub required – at minimum, daily commits are a must
Fixed-Price - Intermediate ($$) - Est. Budget: $15 - Posted
I have a Spark job and a MapReduce job. I also have different types of data, e.g. Avro, JSON, etc., and they are also compressed with a few different compression codecs. ... I need your advice on how to carry out this test on both jobs. I know the MapReduce framework provides a lot of counters, but I'm not sure how to interpret them, so I need your help on that as well.
Skills: MapReduce Apache Spark Hadoop Performance testing
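The counters the listing above asks about are per-job statistics (records read, records written, bytes, and so on) that Hadoop accumulates as a job runs; comparing them between two runs is one way to benchmark jobs across data formats. This is a hedged, framework-free sketch of the idea: `run_with_counters`, `word_mapper`, and the counter names are illustrative stand-ins, not Hadoop's actual API, though the names mimic Hadoop's built-in `MAP_INPUT_RECORDS` / `MAP_OUTPUT_RECORDS` counters.

```python
from collections import Counter

def run_with_counters(records, mapper):
    # Drive a mapper over the input while accumulating job counters,
    # analogous to the per-task counters Hadoop reports in the job UI
    counters = Counter()
    outputs = []
    for rec in records:
        counters["MAP_INPUT_RECORDS"] += 1
        for out in mapper(rec):
            counters["MAP_OUTPUT_RECORDS"] += 1
            outputs.append(out)
    return outputs, counters

def word_mapper(line):
    # Illustrative mapper: emit (word, 1) for each word; empty lines emit nothing
    for word in line.split():
        yield (word, 1)

outs, ctrs = run_with_counters(["a b", "", "c"], word_mapper)
```

Comparing input-record vs. output-record counters (and byte counters, in a real job) between the Spark and MapReduce runs gives a like-for-like throughput view even when the input formats and compression codecs differ.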
Hourly - Intermediate ($$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Expertise with the Hadoop environment using technologies such as HDFS, MapReduce, Pig, Hive, HBase, ZooKeeper, SAP HANA, NoSQL stores (Cassandra, MongoDB, Riak), Storm, and other Big Data technologies.
Skills: Hadoop REST Web Services