Hadoop Jobs

35 jobs were found based on your criteria

Hourly - Entry Level ($) - Est. Time: 3 to 6 months, 30+ hrs/week - Posted
I am looking for a person who is an expert in Microsoft Power BI; I have a lot of work for them. I need urgent help creating a Power BI dashboard. Apply only if you have worked with Microsoft Power BI before.
Skills: Hadoop Big Data
Hourly - Expert ($$$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
Need help with log gathering and a Kibana dashboard. Can you please share your contact details and the best time to talk? Thanks.
Skills: Hadoop Amazon EC2 NoSQL PHP
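Before raw logs can drive a Kibana dashboard, they are usually parsed into structured records and aggregated. A minimal sketch of that first step, in plain Python; the log format and field names here are hypothetical, chosen only for illustration:

```python
import re
from collections import Counter

# Assumed (hypothetical) log format: "2024-01-15 10:23:01 ERROR payment timed out"
LOG_LINE = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<message>.*)"
)

def parse_line(line):
    """Turn one raw log line into a structured record, or None if it doesn't match."""
    m = LOG_LINE.match(line)
    return m.groupdict() if m else None

def level_counts(lines):
    """Aggregate records by log level -- the kind of rollup a Kibana bar chart shows."""
    records = (parse_line(line) for line in lines)
    return Counter(r["level"] for r in records if r)

sample = [
    "2024-01-15 10:23:01 ERROR payment timed out",
    "2024-01-15 10:23:02 INFO retrying payment",
    "2024-01-15 10:23:03 ERROR payment failed",
]
print(level_counts(sample))  # Counter({'ERROR': 2, 'INFO': 1})
```

In a real pipeline a shipper such as Logstash or Filebeat does this parsing before the records reach Elasticsearch; the sketch only shows the shape of the transformation.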
Hourly - Entry Level ($) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Build automation scripts and frameworks to improve operational processes and procedures. Deploy and document newer technologies for the potential deployment of services, following a development and release life cycle.
- Strong knowledge of deployment automation, orchestration, and configuration management (Puppet, Ansible, etc.)
- Experience with OpenStack, Docker, Kafka, Hadoop, Mesos, etc.
- Ability to effectively communicate with various teams and levels of management
- DevOps with strong Linux systems administration experience
- Experience with networking, load balancing, and firewalls a plus
- Experience with programming/scripting languages (Ruby, Python, Perl, etc.)
Skills: Hadoop Docker KVM OpenStack
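The configuration-management tools named above (Puppet, Ansible) are built around idempotency: applying the same desired state twice changes nothing the second time. A minimal sketch of that idea in plain Python, not any real tool's API; the file path and setting are hypothetical:

```python
import os
import tempfile

def ensure_line(path, line):
    """Idempotently ensure `line` is present in the file at `path`.

    Returns True if the file was changed, False if it was already compliant --
    the same "changed" contract Ansible's lineinfile module reports.
    """
    try:
        with open(path) as f:
            lines = f.read().splitlines()
    except FileNotFoundError:
        lines = []
    if line in lines:
        return False            # desired state already holds: do nothing
    with open(path, "a") as f:
        f.write(line + "\n")    # converge toward the desired state
    return True

# Demo on a throwaway file: the first run changes it, the second is a no-op.
path = os.path.join(tempfile.mkdtemp(), "sshd_config")
print(ensure_line(path, "PermitRootLogin no"))  # True  (applied)
print(ensure_line(path, "PermitRootLogin no"))  # False (already compliant)
```

Reporting "changed" versus "ok" is what makes such scripts safe to re-run from cron or a CI pipeline, which is the operational property the posting's tooling list is really about.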
Hourly - Entry Level ($) - Est. Time: 3 to 6 months, 30+ hrs/week - Posted
Hi - Looking for a hands-on Hadoop trainer. 1. Must be an expert in the Hadoop ecosystem: Hive, Pig, HBase, Sqoop, Spark, Storm, Flume, Oozie, Cassandra, Mahout, R, Solr. 2. Should have corporate-world experience dealing with real issues and solutions. 3. Should be strong on Linux.
Skills: Hadoop Apache Hive Apache Spark NoSQL
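The common thread in the ecosystem list above is the MapReduce model that Hadoop popularized and that Hive and Pig queries compile down to: map, shuffle/group by key, reduce. A tiny in-memory word-count sketch of those three phases (no cluster, just the control flow):

```python
from collections import defaultdict

def map_phase(document):
    """Mapper: emit a (word, 1) pair per word, as a Hadoop Mapper would."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle/sort: group all emitted values by key across mapper outputs."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reducer: combine each key's values -- here, summing the counts."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["hadoop stores data", "spark and hadoop process data"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["hadoop"])  # 2
```

On a real cluster the shuffle moves data between machines and is usually the expensive step; the sketch only mirrors the programming model a trainer would teach first.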
Hourly - Intermediate ($$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
RefactorU is looking for one or more talented data scientists to join our growing team and help us develop and lead our Data Science Bootcamp. The Data Science Bootcamp is a selective, 12-week, hands-on, immersive bootcamp in Boulder, CO. Successful candidates will have a blend of hands-on development experience AND a passion for helping people learn and grow. We are looking for a developer with extremely strong communication and interpersonal skills who loves teaching. We are looking for candidates for Phase 1 and/or Phase 2 (see below).

Day-to-day responsibilities:

Phase 1: Leveraging an already developed outline, create lecture materials, exercises, solutions, project guidelines, and student resources for each day of the 12-week full-time immersive data science bootcamp. Candidates MUST have extensive knowledge and industry experience working with Python, databases, SQL & NoSQL, pre-processing, scikit-learn, linear regression, k-means clustering, gradient descent, logistic regression, model evaluation, decision trees, SVM, PCA, nearest neighbors, time series, Spark, perceptrons, Naive Bayes, data visualization, business communication, A/B testing, Bayesian bandits, neural networks, natural language processing, ensemble methods, and recommender systems. Computer vision is a plus. Strong interest and ability to teach and mentor.

Phase 2: Deliver lectures; facilitate classroom activities, breakout groups, and in-class challenges. Serve as a coach and mentor for students in their exercises, project work, and job search. Work closely with other instructors, teaching assistants, and staff to create world-class learning experiences.
Skills: Hadoop Apache Spark Artificial Neural Networks Data Science
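Two of the curriculum topics above, logistic regression and gradient descent, combine into one compact example. A from-scratch sketch on a hypothetical one-feature toy dataset (in practice one would use scikit-learn, also named in the posting):

```python
import math

def sigmoid(z):
    """Map a logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def log_loss(w, b, xs, ys):
    """Mean negative log-likelihood of the labels under the current model."""
    eps = 1e-12  # guard against log(0)
    total = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        total += -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))
    return total / len(xs)

def fit(xs, ys, lr=0.5, steps=500):
    """Batch gradient descent on the log loss for a one-feature logistic model."""
    w = b = 0.0
    for _ in range(steps):
        dw = db = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # gradient of the loss w.r.t. the logit
            dw += err * x
            db += err
        w -= lr * dw / len(xs)
        b -= lr * db / len(xs)
    return w, b

# Hypothetical toy data: the label is 1 when x > 2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit(xs, ys)
print(log_loss(w, b, xs, ys) < log_loss(0.0, 0.0, xs, ys))  # True: training reduced the loss
```

The `err * x` term is exactly the chain rule applied to the log loss through the sigmoid, which is why logistic regression is a standard first gradient-descent exercise.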
Hourly - Expert ($$$) - Est. Time: Less than 1 month, 30+ hrs/week - Posted
We are looking for a consultant who is an expert in building ETL pipelines that extract data from multiple source systems and write it into a relational database. We are looking for a consultant with expertise in the following areas: 1. Building ETL flows using Apache Spark and Python; a minimum of 3 to 5 years of experience with Spark, Spark SQL, and Python. 2. Setting up an Apache Spark cluster with multiple nodes. 3. Performance tuning of ETL jobs on Spark clusters. 4. Writing Spark SQL and Python scripts that extract data from relational databases, transform the source data, and load it into Redshift. 5. Writing SQL queries.
Skills: Hadoop Apache Mahout Apache Solr Machine learning
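The extract-transform-load shape this posting describes can be sketched without a Spark cluster: here the standard-library sqlite3 stands in for the target warehouse (Redshift in the posting), and the source rows, table, and column names are hypothetical:

```python
import sqlite3

def extract():
    """Extract: raw rows as they might arrive from a source system (hypothetical data)."""
    return [
        {"id": 1, "amount": "19.99", "country": "us"},
        {"id": 2, "amount": "5.00",  "country": "DE"},
        {"id": 3, "amount": None,    "country": "us"},   # dirty row, to be dropped
    ]

def transform(rows):
    """Transform: drop incomplete rows, cast types, normalize values."""
    out = []
    for r in rows:
        if r["amount"] is None:
            continue                                     # reject rows failing validation
        out.append((r["id"], float(r["amount"]), r["country"].upper()))
    return out

def load(rows, conn):
    """Load: write the cleaned rows into the target relational table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 2
```

In the Spark version each function becomes a DataFrame stage distributed across the cluster, and the performance-tuning expertise the posting asks for (partitioning, shuffles, caching) lives in the transform step.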
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Looking for a Hadoop Hortonworks administrator for training, support, and enhancements.
Skills: Hadoop
Hourly - Intermediate ($$) - Est. Time: More than 6 months, 10-30 hrs/week - Posted
Hello, here is an opportunity to earn money part-time as a Hadoop trainer. We are looking for independent Hadoop trainers to deliver online and corporate training for onsite and domestic MNC clients.
Skills: Hadoop Apache Hive Big Data Data Visualization