Hadoop Jobs

59 were found based on your criteria

Fixed-Price - Intermediate ($$) - Est. Budget: $30 - Posted
A Hadoop DB is required. Please see the attached file and bid on it. I need this by 30-04-2016, 10 AM IST.
Skills: Hadoop, R-Hadoop
Hourly - Intermediate ($$) - Est. Time: 1 to 3 months, 10-30 hrs/week - Posted
We are looking for part-time help from a Big Data architect with experience in Cloudera, Hive, Tableau, and AWS. We currently have Tableau Desktop running on AWS, connecting to Hive on a small Cloudera cluster. We are trying to run some aggregations on a 5 GB CSV file stored on HDFS within Cloudera, but the reports are taking a long time. We want this person to help us performance-tune this issue as soon as possible.
Qualifications:
  • 2+ years of experience with Hive, Cloudera, and Big Data architecture
  • Good experience connecting Tableau to data sources hosted on AWS
  • Past experience performance-tuning issues such as the one described above
Skills: Hadoop, Amazon EC2, Amazon Web Services, Big Data
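A common first step for the slowness described in the posting above (Hive aggregations over a raw CSV on HDFS) is rewriting the data in a columnar format such as Parquet. A minimal Spark sketch of that conversion follows; the object name, paths, and options are illustrative assumptions, not details from the posting:

```scala
// Hedged sketch, not the poster's actual setup: rewrite a CSV file on
// HDFS as Parquet so that Hive/Tableau aggregations scan a compressed
// columnar format instead of raw text. Paths are illustrative.
import org.apache.spark.sql.SparkSession

object CsvToParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("csv-to-parquet").getOrCreate()

    // Read the raw CSV, inferring column names and types from the file.
    val df = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/reports/raw.csv")

    // Write it back as Parquet; Hive can then query it via an external table.
    df.write.mode("overwrite").parquet("hdfs:///data/reports/parquet")

    spark.stop()
  }
}
```

Pointing the Hive table (and therefore Tableau) at the Parquet output usually reduces scan time, since aggregations then read only the columns they need.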
Fixed Price Budget - Expert ($$$) - $400 to $500 - Posted
We require a well-experienced female Hadoop admin who can provide training and job support and take proxy calls during US hours.
Skills: Hadoop
Hourly - Intermediate ($$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I am looking for someone who has processed huge volumes of data using Apache Spark. The job requires both streaming and batch expertise: data ingestion, ETL, and some analytics.
Skills: Hadoop, Apache Spark
Hourly - Intermediate ($$) - Est. Time: 1 to 3 months, 10-30 hrs/week - Posted
Looking for a developer with very good expertise in Scala, Hadoop, and the Semantic Web. The individual must have strong expertise in the above and be able to code and review work done in Scala. ... Provide links to what you've done with the Semantic Web, Hadoop, and Scala. Provide a GitHub link. Please provide your Skype ID.
Skills: Hadoop, Scala
Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
The candidate will have to know the following environment:
  • Hortonworks Data Platform, HDP™ 2.4, automated with Ambari 2.2, including:
      • Spark
      • HBase
      • YARN
      • HDFS
  • Hortonworks Data Platform add-ons:
      • Hortonworks Connector for Teradata v1.4.1
      • Hortonworks ODBC Driver for Apache Hive (v2.1.2)
      • Hortonworks Data Platform Search (Apache Solr 5.2.1)
  • Apache Tika
  • Apache Kafka
  • Django Framework
Project phases:
  • Stage 1 - Data integration with Kafka and Spark Streaming
  • Stage 2 - Feature engineering and ML pipeline with Spark and Hadoop HDFS
  • Stage 3 - System DB with Apache HBase
Detailed technical documentation of the project requirements will be provided.
  • Number of freelancers needed: 3
Skills: Hadoop, Apache Kafka, Apache Spark, HBase