HBase Jobs

6 jobs were found based on your criteria

Hourly - Expert ($$$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
We are looking for someone to work 80 to 100 hours assisting our Sustaining Engineering team in fixing and improving a set of Pig scripts. Renaissance is a leading provider of cloud solutions for K-12 schools and teachers. We are owned by Google. This work involves three activities: a) Pig script debugging, b) Pig script development, and c) training other engineers on Pig and answering their questions.
Skills: HBase, Big Data, Hadoop, Pig
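As a rough illustration of the Pig script development work this posting describes, here is a minimal Python (Jython) UDF of the kind a Pig script can register. The file name, function, and field names are invented for the example, not taken from the posting.

```python
# udfs.py -- hypothetical Jython UDF that a Pig script could REGISTER.
# pig_util ships with Apache Pig and provides the outputSchema decorator.
from pig_util import outputSchema

@outputSchema("grade_band:chararray")
def grade_band(grade):
    # Bucket a numeric K-12 grade into a coarse band; None stays None.
    if grade is None:
        return None
    if grade <= 5:
        return "elementary"
    if grade <= 8:
        return "middle"
    return "high"

# Inside a Pig script it would be used roughly like this:
#   REGISTER 'udfs.py' USING jython AS udfs;
#   banded = FOREACH students GENERATE student_id, udfs.grade_band(grade);
```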
Hourly - Intermediate ($$) - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
Trying to push data from Oracle to HBase via Sqoop.
Skills: HBase, Oracle PL/SQL, Sqoop
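Sqoop itself is driven from the command line (a `sqoop import` with the `--hbase-table` and `--column-family` options covers this kind of transfer). As a sketch of the same Oracle-to-HBase movement in Python, the snippet below uses the cx_Oracle and happybase client libraries; the credentials, table, columns, and column family are placeholders, not details from the posting, and happybase assumes an HBase Thrift server is running.

```python
import cx_Oracle   # Oracle client library
import happybase   # HBase client that talks to the HBase Thrift server

# Placeholder connection details -- not taken from the posting.
oracle_conn = cx_Oracle.connect("scott", "tiger", "db-host:1521/ORCLPDB1")
hbase_conn = happybase.Connection("hbase-thrift-host")
table = hbase_conn.table("customers")          # target HBase table
cursor = oracle_conn.cursor()

# Pull rows from Oracle and write each one as an HBase put,
# keyed on the primary key and stored under a single column family.
cursor.execute("SELECT id, name, email FROM customers")
for row_id, name, email in cursor:
    table.put(str(row_id).encode("utf-8"), {
        b"info:name": (name or "").encode("utf-8"),
        b"info:email": (email or "").encode("utf-8"),
    })

cursor.close()
oracle_conn.close()
hbase_conn.close()
```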
Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
The candidate will have to know the following environment:
- Hortonworks Data Platform, HDP™ 2.4 (ready for the enterprise, automated with Ambari 2.2), including Spark, HBase, YARN, and HDFS
- Hortonworks Data Platform add-ons: Hortonworks Connector for Teradata v1.4.1, Hortonworks ODBC Driver for Apache Hive (v2.1.2), Hortonworks Data Platform Search (Apache Solr 5.2.1)
- Apache Tika
- Apache Kafka
- Django Framework
Project phases: Stage 1 - data integration with Kafka and Spark Streaming; Stage 2 - feature engineering and ML pipeline with Spark and Hadoop HDFS; Stage 3 - system DB with Apache HBase. Detailed technical documentation of the project requirements will be provided.
Skills: HBase, Apache Kafka, Apache Spark, Hadoop
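As a minimal sketch of the Stage 1 and Stage 3 path this posting outlines (Kafka into Spark Streaming, with HBase as the system DB), the Python snippet below uses the era-appropriate spark-streaming-kafka DStream API together with happybase. The topic, broker address, HBase host, table, and column family are placeholders, and the actual pipeline would follow the project's own documentation.

```python
import uuid

import happybase
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="kafka-to-hbase-sketch")
ssc = StreamingContext(sc, batchDuration=10)   # 10-second micro-batches

# Direct stream from Kafka; the "events" topic and broker address are placeholders.
stream = KafkaUtils.createDirectStream(
    ssc, ["events"], {"metadata.broker.list": "broker:9092"})

def save_partition(records):
    # One HBase (Thrift) connection per partition; write each message as a put.
    conn = happybase.Connection("hbase-thrift-host")
    table = conn.table("events")
    for key, value in records:
        row_key = (key or str(uuid.uuid4())).encode("utf-8")
        table.put(row_key, {b"raw:value": value.encode("utf-8")})
    conn.close()

stream.foreachRDD(lambda rdd: rdd.foreachPartition(save_partition))

ssc.start()
ssc.awaitTermination()
```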
Hourly - Expert ($$$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Skills Summary:
Hadoop expertise: HDP 2.0+ • HDFS • Hive • Oozie • Sqoop • HBase • knowledge of Ranger (good to have) • MapReduce programming
Other technical capabilities: Puppet • Python • shell scripting
Skills: HBase, Hadoop, Linux System Administration, MapReduce