Apache Hive Jobs

15 jobs were found based on your criteria

Fixed-Price - Intermediate ($$) - Est. Budget: $200 - Posted
Hi, I have a code base that works on Cloudera, and I am planning to implement it on AWS EMR. I have done the data integration part with AWS, Hive, and HBase. I need help connecting the service with the Java API.
Skills: Apache Hive Hadoop HBase Java
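The integration described in this posting would typically use the HBase Java client, but a quick way to sanity-check connectivity from any language is HBase's REST gateway, when it is enabled on the cluster. This is only a sketch under assumptions: the host, port, table, and row key below are hypothetical placeholders, not the poster's actual setup.

```python
# Sketch: building a request against HBase's REST gateway.
# All names (host, table, row key) are hypothetical placeholders.
from urllib.request import Request

def hbase_row_request(host, table, row_key, port=8080):
    """Build a GET request for one HBase row, asking for JSON output."""
    url = f"http://{host}:{port}/{table}/{row_key}"
    return Request(url, headers={"Accept": "application/json"})

# Against a live cluster you would then fetch the row with, e.g.:
#   urllib.request.urlopen(hbase_row_request("emr-master", "events", "user42"))
req = hbase_row_request("emr-master", "events", "user42")
print(req.full_url)
```

The helper only constructs the request, so it can be exercised without a running cluster; the actual fetch is left as a comment.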
Fixed-Price - Expert ($$$) - Est. Budget: $150 - Posted
To create a multiple-choice test on Hortonworks (Hadoop Ecosystems).
Question count: 60. The questions should be of a quality such that only candidates well aware of the practical aspects of the subject can pass.
Type: questions divided into three levels (medium, hard, too hard). Questions must be developed with copyright-free, original content, and may be based on practical knowledge (images, code snippets) as well as on the topic itself. Each question must have at least four answer options. No true/false statements. No trivia questions.
Review: the subject matter expert will review the test on the live server and point out any corrections once the test is uploaded.
Feedback: the subject matter expert will revise the content based on the feedback shared before final approval, and will help resolve any feedback received from test takers within 3 months of the test being uploaded.
Deliverable: a Word file in the required format (format will be provided).
Steps:
1. Sign the NDA.
2. Develop questions as per the agreed-upon test outline.
3. Submit the deliverable in a Word file.
4. Run the test and review the questionnaire once live.
5. Point out any improvements/errors during the final review.
Skills: Apache Hive Apache Flume Hadoop HBase
Fixed-Price - Expert ($$$) - Est. Budget: $150 - Posted
To create a multiple-choice test on Cloudera (Hadoop Ecosystems).
Question count: 60. The questions should be of a quality such that only candidates well aware of the practical aspects of the subject can pass.
Type: questions divided into three levels (medium, hard, too hard). Questions must be developed with copyright-free, original content, and may be based on practical knowledge (images, code snippets) as well as on the topic itself. Each question must have at least four answer options. No true/false statements. No trivia questions.
Review: the subject matter expert will review the test on the live server and point out any corrections once the test is uploaded.
Feedback: the subject matter expert will revise the content based on the feedback shared before final approval, and will help resolve any feedback received from test takers within 3 months of the test being uploaded.
Deliverable: a Word file in the required format (format will be provided).
Steps:
1. Sign the NDA.
2. Develop questions as per the agreed-upon test outline.
3. Submit the deliverable in a Word file.
4. Run the test and review the questionnaire once live.
5. Point out any improvements/errors during the final review.
Skills: Apache Hive Apache Flume Hadoop HBase
Fixed-Price - Expert ($$$) - Est. Budget: $150 - Posted
To create a multiple-choice test on Hadoop Ecosystems (generic).
Question count: 60. The questions should be of a quality such that only candidates well aware of the practical aspects of the subject can pass.
Type: questions divided into three levels (medium, hard, too hard). Questions must be developed with copyright-free, original content, and may be based on practical knowledge (images, code snippets) as well as on the topic itself. Each question must have at least four answer options. No true/false statements. No trivia questions.
Review: the subject matter expert will review the test on the live server and point out any corrections once the test is uploaded.
Feedback: the subject matter expert will revise the content based on the feedback shared before final approval, and will help resolve any feedback received from test takers within 3 months of the test being uploaded.
Deliverable: a Word file in the required format (format will be provided).
Steps:
1. Sign the NDA.
2. Develop questions as per the agreed-upon test outline.
3. Submit the deliverable in a Word file.
4. Run the test and review the questionnaire once live.
5. Point out any improvements/errors during the final review.
Skills: Apache Hive Apache Flume Hadoop HBase
Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
Thank you for your interest in our available position. We want to take this opportunity to congratulate you on making it past our initial screening process! As the next step, we would like to get a better idea of your technical and problem-solving skills by asking you to show some of your previous work. If we are impressed, the final step will be to schedule two interviews (one technical and one with senior management) to discuss the position in more detail and confirm that we're a good fit for each other. Please return the completed test by Thursday, August 4th. * Any bid without work samples/links will be disregarded. * You will have to sign an NDA as well as a contract for 1 year.
Skills: Apache Hive Apache Spark Hadoop Scala
Hourly - Expert ($$$) - Est. Time: Less than 1 week, 30+ hrs/week - Posted
Hi, I'm looking for someone with experience writing complex Hive queries/scripts for large (multi-hundred-terabyte) datasets. You should also be experienced in Java for writing Hive UDFs and in Amazon Web Services (EC2); prior experience with Qubole is a plus (but not required). To be considered, please share the most challenging large-scale issue you have ever faced with Hive and how you overcame it (be sure to include the node count and approximate disk space required). Thank you.
Skills: Apache Hive Amazon Web Services Hadoop
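For row-level custom logic of the kind this posting describes, Hive's UDFs are written in Java against the Hive jars; as a self-contained illustration of the same idea, Hive also supports streaming transforms via the TRANSFORM clause, which pipes tab-separated rows through an external script on stdin/stdout. This is a hedged sketch: the column layout and script name are assumptions, not the poster's schema.

```python
# Sketch of a Hive streaming transform: the TRANSFORM clause feeds
# tab-separated rows to this script on stdin and reads transformed
# rows back from stdout. The column layout is a hypothetical example.
import sys

def clean_row(line):
    """Trim whitespace and lowercase the first column of a TSV row."""
    cols = line.rstrip("\n").split("\t")
    cols[0] = cols[0].strip().lower()
    return "\t".join(cols)

def main():
    # Hive pipes rows in on stdin; emit transformed rows on stdout.
    for raw in sys.stdin:
        print(clean_row(raw))

if __name__ == "__main__":
    main()
```

In HiveQL this would be wired up roughly as `SELECT TRANSFORM(url, hits) USING 'python clean_rows.py' AS (url, hits) FROM logs;` (table, column, and script names here are illustrative).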
Hourly - Intermediate ($$) - Est. Time: 3 to 6 months, 10-30 hrs/week - Posted
Our company has several Big Data & Hadoop projects in its delivery pipeline. We are looking for solid resources with excellent Big Data & Hadoop skills (Spark, Hive, Pig, Kafka, Storm, Elasticsearch, Solr, Python, Scala) for ongoing work with our clients in the US and worldwide. If you are interested, please send us links to your Big Data & Hadoop portfolio and tell us why we should consider you for this role. We are looking for a long-term relationship. (See the attached job description for the dev role.)
Skills: Apache Hive Apache Kafka Apache Solr Elasticsearch
Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
Senior-level SAS programmer: customer marketing, loyalty, data management, Big Data/Hadoop experience.
Job responsibilities:
• Transform product transaction data into clearly defined and executable offer marketing strategies for the merchant, including targeting of offers based on previous product/category purchases (targeting matrix)
• Create vendor-specific and cross-sponsor marketing segmentations for offer targeting
• Work closely with econometricians to develop custom relevancy scoring models for all merchant offers
• Design and implement test-and-learn strategies
• Produce product offer reports, including redemption rates and incremental sales vs. control groups
• Identify and articulate customer patterns in a meaningful way to drive value to AMEX/Wellness + members
• Perform as a marketing analytics subject matter expert within AMEX and with the merchant
• Create ad hoc analysis to support merchant marketing needs, including market analysis, share-of-wallet analysis, campaign analysis, test analysis, etc.
• Maintain open lines of communication with leadership, partners, and stakeholders in order to continuously evolve and improve consumer insights and offer marketing strategies
Required skills/qualifications:
• Analysts will be experienced analytic professionals with backgrounds/degrees in statistics or quantitative studies
• Proficiency in Linux scripting required
• Minimum 3 years of recent experience in an analytic role, preferably marketing analytics
• Experience with Oracle and Big Data environments and knowledge of Teradata, Pig, and Hive are strongly desired
• Proficiency in SAS/SQL programming required
• Big Data exposure is good, but hands-on experience would be a big plus
• Candidate must be willing to relocate to Phoenix, AZ
• Preferably no H1s; if on an H1, the candidate will cover the cost of the H1 transfer unless AMEX/Plenti is willing to sponsor
• Must be technical/analytical, a good problem solver, a good communicator, and a good listener
Skills: Apache Hive Pig SAS