HBase Jobs

8 jobs were found based on your criteria

Hourly - Expert ($$$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
We are looking for someone to work 80 to 100 hours assisting our Sustaining Engineering team in fixing and improving a set of Pig scripts. Renaissance is a leading provider of cloud solutions for K-12 schools and teachers. We are owned by Google. This work involves three activities: (a) Pig script debugging, (b) Pig script development, and (c) training other engineers on Pig and answering their questions.
Skills: HBase Big Data Hadoop Pig
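A typical first step for the debugging portion of work like this is to run each script in Pig's local mode against a small data sample before touching the cluster. A minimal sketch, where report.pig and the parameter names are hypothetical placeholders, not from the posting:

    # Show the script with parameters substituted, without executing it.
    pig -x local -param INPUT=sample/events.tsv -param OUTPUT=/tmp/report -dryrun report.pig

    # Run the script in local mode against the sample (no Hadoop cluster needed).
    pig -x local -param INPUT=sample/events.tsv -param OUTPUT=/tmp/report report.pig

Local mode runs everything in a single JVM, which makes iterating on a broken script much faster than submitting each attempt to the cluster.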
Hourly - Intermediate ($$) - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
Trying to push data from Oracle to HBase via Sqoop.
Skills: HBase Oracle PL/SQL Sqoop
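For reference, an Oracle-to-HBase load with Sqoop 1 typically looks like the sketch below; the connect string, credentials, and table/column names are hypothetical placeholders:

    # Import an Oracle table directly into an HBase table (Sqoop 1 syntax).
    # Host, user, and table/column names are assumptions, not from the posting.
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username SCOTT -P \
      --table CUSTOMERS \
      --hbase-table customers \
      --column-family cf \
      --hbase-row-key CUSTOMER_ID \
      --hbase-create-table \
      -m 4

Each Oracle row becomes one HBase row keyed by CUSTOMER_ID, with the remaining columns stored under the cf column family.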
Fixed-Price - Expert ($$$) - Est. Budget: $200 - Posted
The Hadoop tools I need help with include MapReduce, Pig, Sqoop, HBase, and Hive. The person needs to have both Hadoop development and Hadoop administration experience.
Skills: HBase Apache Hive Hadoop Pig
Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
1) Review cruise.com
2) Develop a method to conduct a daily scrape of cruise.com's complete search results
3) Must be able to capture all cruises for sale (all locations, all ports, all cruise lines, all ships, all months, and all cruise lengths)
4) Must capture Cruise.com cruise name, cruise line, cruise ship, ports of call (individually), departure dates, date in each port, price levels, prices, review stars, and bonus offers
5) Must stamp each collection with a time/date stamp
6) Must be able to store the information in a database that can be accessed on Google or AWS or similar
7) Must be able to determine whether a scrape succeeded or failed
8) If a scrape failed, must be able to rotate to a new proxy / user agent and retry at a random time interval, but within the same day (see the sketch after the skills list)
9) Must consider that in the future, additional/similar data will be scraped from other similar sites and will need to be merged/consolidated with an identifier of the source
This information will eventually be used to power a cruise website that uses the daily data to determine normal price fluctuations and the best day to purchase.
Skills: HBase Apache Cassandra
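A minimal sketch of the retry logic in item 8, assuming hypothetical proxy hosts, user agents, and a placeholder results URL; a real scraper would also need to walk the full search space from items 3 and 4:

    # Hypothetical proxy and user-agent pools; replace with real values.
    PROXIES=("http://proxy1.example.com:8080" "http://proxy2.example.com:8080")
    AGENTS=("Mozilla/5.0 (Windows NT 10.0)" "Mozilla/5.0 (Macintosh)")
    URL="https://www.cruise.com/search-results"   # placeholder results page

    for attempt in 0 1 2; do
      proxy=${PROXIES[$((attempt % ${#PROXIES[@]}))]}
      agent=${AGENTS[$((attempt % ${#AGENTS[@]}))]}
      # Time-stamped output file satisfies item 5; curl's exit code covers item 7.
      if curl --fail --silent --proxy "$proxy" --user-agent "$agent" \
              --output "scrape-$(date +%Y%m%dT%H%M%S).html" "$URL"; then
        echo "scrape succeeded via $proxy"
        break
      fi
      # Retry after a random interval (up to 1 hour) so all attempts stay within the same day.
      sleep $((RANDOM % 3600))
    done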
Hourly - Expert ($$$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Skills summary — Hadoop expertise:
  • HDP 2.0+
  • HDFS
  • Hive
  • Oozie
  • Sqoop
  • HBase
  • Knowledge of Ranger (good to have)
  • MapReduce programming
Other technical capabilities:
  • Puppet
  • Python
  • Shell scripting
Skills: HBase Hadoop Linux System Administration MapReduce