Hadoop Jobs

54 jobs were found based on your criteria.

Hourly - Expert ($$$) - Est. Time: Less than 1 week, 30+ hrs/week - Posted
Hi, I'm looking for someone with experience writing complex Hive queries/scripts for large (multi-hundred-terabyte) datasets. You should also be experienced in Java for writing Hive UDFs and with Amazon Web Services (EC2); prior experience with Qubole is a plus (but not required). To be considered, please share the most challenging large-scale issue you have ever faced with Hive and how you overcame it (be sure to include the node count and the approximate disk space required). Thank you.
Skills: Hadoop Amazon Web Services Apache Hive
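Production Hive UDFs like the posting asks for are written in Java against Hive's UDF API, but row-level logic can also be prototyped through Hive's TRANSFORM streaming interface, which pipes tab-separated rows through any script's stdin/stdout. A minimal sketch in Python (the column layout and normalization rule are hypothetical examples, not from the posting):

```python
import sys

def normalize(line):
    """Split one tab-separated Hive row and lower-case a hypothetical
    domain column, returning the row in Hive's expected TSV form."""
    user_id, domain, bytes_sent = line.rstrip("\n").split("\t")
    return "\t".join([user_id, domain.lower(), bytes_sent])

def main(stream=sys.stdin):
    # Hive's TRANSFORM clause streams rows to the script on stdin,
    # one row per line, and reads results back from stdout.
    for row in stream:
        print(normalize(row))

if __name__ == "__main__":
    main()
```

Such a script would be invoked from HiveQL along the lines of `SELECT TRANSFORM(user_id, domain, bytes_sent) USING 'python normalize.py' AS (user_id, domain, bytes_sent) FROM logs;` (table and column names are illustrative). A compiled Java UDF avoids the per-row serialization cost of streaming, which matters at the multi-hundred-terabyte scale this posting mentions.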
Fixed-Price - Intermediate ($$) - Est. Budget: $2,400 - Posted
1. Open source Big Data platforms (Hadoop, Cloudera, Spark, Hortonworks, Cassandra, etc.)
2. Hadoop ecosystem – HDFS, Hive, Pig, NoSQL, Flume, HBase, Solr, Impala, Yarn, Hue, Sqoop, Oozie, MapR, etc.
3. ...
4. Data transformation involving encryption and decryption through RDBMS – Hadoop – RDBMS
5. MapReduce programming experience for:
   a. sorting and consolidating data from multiple sources and formats into a normalized structure
   b. generating reports using complex calculations and aggregations at frequent time intervals
6. ...
Skills: Hadoop Big Data MapReduce
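The MapReduce work this posting describes (sorting and consolidating data from multiple sources and formats into a normalized structure) follows the classic map/shuffle/reduce pattern. A pure-Python sketch of that pattern, with the shuffle phase simulated in-process (the record fields and source formats are hypothetical, not from the posting):

```python
from collections import defaultdict

# Two hypothetical input sources with different formats.
SOURCE_A = ["alice,120", "bob,80"]   # CSV: name,amount
SOURCE_B = ["bob|30", "carol|55"]    # pipe-delimited: name|amount

def map_record(line, delimiter):
    """Map phase: parse one raw line into a (key, value) pair."""
    name, amount = line.split(delimiter)
    return name, int(amount)

def shuffle(pairs):
    """Shuffle phase: group values by key (the Hadoop framework
    does this between the map and reduce phases)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_group(key, values):
    """Reduce phase: consolidate all values for one key into a
    single normalized record."""
    return {"name": key, "total": sum(values)}

def run_job():
    # Map both sources, shuffle, then reduce; sort for stable output.
    pairs = [map_record(line, ",") for line in SOURCE_A] + \
            [map_record(line, "|") for line in SOURCE_B]
    return sorted(
        (reduce_group(k, v) for k, v in shuffle(pairs).items()),
        key=lambda r: r["name"],
    )
```

In a real Hadoop job, `map_record` and `reduce_group` would become the `map()` and `reduce()` methods of `Mapper`/`Reducer` subclasses, and the shuffle and sort would be handled by the framework across nodes.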
Hourly - Intermediate ($$) - Est. Time: 3 to 6 months, 10-30 hrs/week - Posted
Our company has several Big Data & Hadoop projects in the delivery pipeline. We are looking for solid resources with excellent skills in Big Data & Hadoop (Spark, Hive, Pig, Kafka, Storm, Elasticsearch, Solr, Python, Scala) for ongoing work with our clients in the US and worldwide. ... If you are interested, please send us links to your Big Data & Hadoop portfolio and tell us why we should consider you for this role. We are looking for a long-term relationship.
Skills: Hadoop Apache Hive Apache Kafka Apache Solr
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
I am looking for someone who is Cloudera Admin certified and who can guide me along the exam path: how to start preparing and what to focus on. It will be an hourly job. We will use Skype as the primary mode of communication.
Skills: Hadoop Cloudera MapReduce
Fixed-Price - Expert ($$$) - Est. Budget: $160 - Posted
A set of around 100 new questions on Hadoop is to be created for assessments. The questions have to be MCQs and must be of high quality.
Skills: Hadoop
Fixed-Price - Expert ($$$) - Est. Budget: $100 - Posted
Hi, we have a group of developers and architects, and we want to offer them a course on some technology. The course can be of any duration: an hour, a day, 15 days, or a month. We need a suggestion from your side for a course to offer them, so let us know what kind of course you can run that would be helpful for our group members. For more detail we can discuss; just apply and let's see if we can run the course. A few topics we can list here: Java 8 topics, Microservices, Agile Methodology, TDD (Test-Driven Development), BDD (Behaviour-Driven Development), Cucumber, etc. Thanks.
Skills: Hadoop Agile software development Architecture Behavior Driven Development (BDD)
Fixed-Price - Expert ($$$) - Est. Budget: $150 - Posted
To create a multiple-choice test on Data Science, an interdisciplinary field about processes and systems for extracting knowledge or insights from data in various forms, structured or unstructured, which continues data-analysis fields such as statistics, data mining, and predictive analytics.

Question count – 60. The quality of the questions should be such that only candidates well aware of the practical aspects of the subject are able to pass.

Type – questions divided into three levels: easy, medium, hard. The multiple-choice questions must be developed with copyright-free, original content; they may test practical knowledge as well as the topic itself. Each question must have at least four answer options. No true/false statements. No trivia questions.

Review: the subject matter expert will be required to review the test on the live server and point out any corrections once the test is uploaded.

Feedback: the subject matter expert will help us resolve any feedback received from test takers within 3 months of uploading the test.

Deliverable: required in a Word file in the required format (format will be provided).

Steps:
1. Sign the NDA.
2. Create the outline/syllabus/sub-topics.
3. Develop questions per the agreed-upon test outline; the subject matter expert will group the questions under the sub-topics described in step 2.
4. Submit the deliverable in a Word file.
5. Run the test and review the questionnaire once live.
6. Point out any improvements/errors during the final review.
Skills: Hadoop Analytics Big Data Data mining