Cloudera Jobs

9 jobs were found based on your criteria

Hourly - Intermediate ($$) - Est. Time: 3 to 6 months, Less than 10 hrs/week - Posted
I am looking for a team or solo developer with the knowledge to develop a front end and integrate it with back-end cloud infrastructure (we will be using open-source cloud frameworks such as XCP). This project is only a prototype, so perfection is not needed, and I am open to any framework or language. The goal is for our clients to log in to the portal and access cloud resources (select a server, memory, hard disk, and software, i.e. a freeware image, then deploy with one click). We have the infrastructure in place, and our systems engineer can help out with specs.
Skills: Cloudera, Apache CloudStack, OpenStack, Xen Cloud Platform
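A minimal sketch of the "one-click deploy" flow described in this posting, assuming a Python/Flask back end; the provision() helper and its response are placeholders for whatever XCP or CloudStack call the systems engineer actually specifies.

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def provision(spec):
        # Placeholder: in the real prototype this would call the chosen cloud
        # framework's API (e.g. XCP or CloudStack) to create a VM matching `spec`.
        return {"status": "queued", "spec": spec}

    @app.route("/deploy", methods=["POST"])
    def deploy():
        # The portal lets a logged-in client pick memory, disk and a software image,
        # then a single POST request triggers provisioning.
        body = request.get_json(force=True) or {}
        spec = {
            "memory_gb": body.get("memory_gb", 4),
            "disk_gb": body.get("disk_gb", 40),
            "software": body.get("software", "freeware-image"),
        }
        return jsonify(provision(spec))

    if __name__ == "__main__":
        app.run()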
Fixed-Price - Intermediate ($$) - Est. Budget: $300 - Posted
Responsible for installing Hadoop infrastructure: set up Hadoop ecosystem projects such as Hive, Spark, HBase, Hue, Oozie, Flume, and Pig; set up new Hadoop users with Kerberos/Sentry security; define a process for creation and removal of nodes; and performance-tune Hadoop clusters and Hadoop MapReduce routines.
Skills: Cloudera, Apache Hive, Apache Spark, Hadoop
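A rough sketch of the new-user-setup task on a Kerberized cluster, assuming shell access to a cluster node; the keytab path and principal are assumptions and would come from the cluster's own Kerberos/Sentry configuration.

    import subprocess

    def run(cmd):
        # Echo and execute a shell command, failing loudly on errors.
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def add_hadoop_user(user):
        # Authenticate as the HDFS superuser first (keytab path is an assumption).
        run(["kinit", "-kt", "/etc/security/keytabs/hdfs.keytab", "hdfs"])
        # Create the user's HDFS home directory and hand over ownership.
        run(["hdfs", "dfs", "-mkdir", "-p", f"/user/{user}"])
        run(["hdfs", "dfs", "-chown", f"{user}:{user}", f"/user/{user}"])

    if __name__ == "__main__":
        add_hadoop_user("alice")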
Hourly - Entry Level ($) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
We need to set up a Hortonworks and Cloudera environment in our virtual machine and develop some demo scenarios for the following: DistCp, distributed shell programs, HDFS command jobs, generating files in HDFS for processing, Hive, Java MapReduce, Oozie, Pig, Spark, Sqoop, streaming, and Tajo. We also need documentation and training for our staff members.
Skills: Cloudera, Hadoop
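A small sketch of one of the demo scenarios listed in this posting ("generating files in HDFS for processing"), assuming the hdfs CLI is available on the VM; the paths and file contents are illustrative only.

    import subprocess

    local_file = "/tmp/demo_events.csv"
    hdfs_dir = "/user/demo/input"

    # Generate a small sample file locally.
    with open(local_file, "w") as f:
        for i in range(1000):
            f.write(f"{i},event_{i % 10}\n")

    # Create the target directory and upload the file for Hive/Pig/Spark jobs to read.
    subprocess.run(["hdfs", "dfs", "-mkdir", "-p", hdfs_dir], check=True)
    subprocess.run(["hdfs", "dfs", "-put", "-f", local_file, hdfs_dir], check=True)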
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
We have a few hundred gigabytes of data sitting on a few data nodes in Cloudera HDFS on AWS. We need a senior developer who can run some basic statistics for us on these data files using any data discovery tool such as Tableau, Platfora, Qlik, or Datameer. ... Requirements: * Strong past experience using a data discovery tool on large data sets * Experience creating visual reports in a Hadoop setting * Ability to use such a tool on AWS and to connect to Cloudera * Basic understanding of statistical reports such as aggregations
Skills: Cloudera, Data Analytics, Hadoop, Tableau Software
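A minimal sketch of the kind of basic statistics this posting asks for, shown with PySpark rather than a visual discovery tool such as Tableau or Qlik; the HDFS path and column names are assumptions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("basic-stats").getOrCreate()

    # Read the files already sitting in Cloudera HDFS (path is illustrative).
    df = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("hdfs:///data/events/*.csv"))

    # Simple aggregations: row counts and an average per category.
    df.groupBy("category").agg(
        F.count("*").alias("rows"),
        F.avg("amount").alias("avg_amount"),
    ).show()

    spark.stop()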
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Hi, I have a client base in various regions, and they are looking for big data enthusiasts experienced in components such as Hadoop, Spark, Elasticsearch, Logstash, Kibana, Pig, Hive, etc. Kindly send your proposal and a convenient time to discuss further, along with your Skype ID. Also write "Big Data enthu" at the top and bottom of your proposal so that I know it is genuine.
Skills: Cloudera, Amazon EC2, Apache Flume, Apache Hive