Hadoop Jobs

100 jobs were found based on your criteria

Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
Qualifications for this Opportunity
A successful Chief Software Architect will demonstrate the following qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, or Electrical Engineering
- Experience with cloud architecture and DevOps
- At least 8 years of hands-on .NET programming experience and related technologies (MS Visual Studio, NHibernate, etc.)
- Proficient in TDD, NoSQL, relational DBs (MSSQL, MySQL, Oracle), SOA, EDA, and design patterns
- Hands-on with Hadoop, OpenStack, MVVC, and AWS
- Excellent understanding of current enterprise software technologies and development practices/tools, including virtual environments, source control, remote development, issue tracking, build and test automation, and network management
- Ability to engage sophisticated global customers in deep technical discussions, making them confident that you know their problem better than they do
- Computer Science or Engineering graduate/post-graduate/doctorate with top marks from university
- A true "roll up the sleeves and get it done" working approach; demonstrated success as a problem solver and a results-oriented self-starter
- Comfort working virtually with teammates and customers around the world
The Type of Chief Software Architect We're Looking For
We need a technical genius who thrives in an entrepreneurial environment and in solving complex problems for customers with high expectations.
Hourly - Expert ($$$) - Est. Time: More than 6 months, 30+ hrs/week - Posted
- Creating solutions architecture, algorithms, and designs for solutions that scale to the customer's enterprise/global requirements
- Leading a small team of Software Engineers to apply software engineering practices and implement automation across all elements of solution delivery
- Managing accountability of team members, with a pinpoint focus on quality; personally accountable for resolution of the most technically challenging issues
- Ensuring our customers are supremely confident in the adv
Required Education and Experience:
- Bachelor's degree in Computer Science, Computer Engineering, or Electrical Engineering
- Experience with cloud architecture and DevOps
- At least 8 years of hands-on Java programming experience and related technologies (Eclipse, NetBeans, Hibernate, JPA, Spring, etc.)
- Proficient in TDD, NoSQL (MongoDB or Cassandra), relational DBs (MSSQL, MySQL, Oracle), SOA, EDA, and design patterns
- Hands-on with Hadoop, OpenStack, MVVC, and AWS
- Excellent understanding of current enterprise software technologies and development practices/tools, including virtual environments, source control, remote development, issue tracking, build and test automation, and network management
- Ability to engage sophisticated global customers in deep technical discussions, making them confident that you know their problem better than they do
- Computer Science or Engineering graduate/post-graduate/doctorate with top marks from a top technical university
- A true "roll up the sleeves and get it done" working approach; demonstrated success as a problem solver and a results-oriented self-starter
- Comfort working virtually with teammates and customers around the world.
Hourly - Intermediate ($$) - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
We are a new WordPress website (www.axonconsulting.com.au) in Australia. Our areas of specialisation include Hadoop, Spark, Big Data, Business Intelligence, etc. We are seeking the services of an SEO expert to optimise our website for first-page presence on all search engines within Australia only.
Hourly - Expert ($$$) - Est. Time: 1 to 3 months, 30+ hrs/week - Posted
Need a team of developers with experience in Hadoop, Tableau, and data mining.
Fixed-Price - Entry Level ($) - Est. Budget: $30 - Posted
I am creating an intro course on Hadoop and Big Data. Subjects will include things like Hadoop, Pig, HDFS, etc. We will talk over Skype and you will help me develop an outline and slides for a 3-5 hour course on Hadoop and Big Data. You will also need to be able to create examples, meaning put together example data, find example data sets, and write very small bits of example code to go along with some of the major lessons in the course.
Skills: Link Building Search Engine Optimization (SEO) SEO Backlinking
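The course posting above asks for very small bits of example code to accompany the major lessons. As one minimal, illustrative sketch (not tied to any specific lesson plan the client has in mind), here is the classic word-count example written as a Hadoop Streaming mapper and reducer in Python:

```python
import sys
from itertools import groupby

def mapper(lines):
    """Emit one tab-separated (word, 1) pair per token."""
    for line in lines:
        for word in line.strip().lower().split():
            yield f"{word}\t1"

def reducer(lines):
    """Sum the counts for each word. Input must be sorted by key,
    which Hadoop's shuffle phase guarantees between map and reduce."""
    pairs = (line.rstrip("\n").split("\t") for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

if __name__ == "__main__" and len(sys.argv) > 1:
    # Invoked as "wordcount.py map" or "wordcount.py reduce":
    # read records from stdin, write results to stdout.
    stage = sys.argv[1]
    out = mapper(sys.stdin) if stage == "map" else reducer(sys.stdin)
    for line in out:
        print(line)
```

With Hadoop Streaming this would run roughly as `hadoop jar hadoop-streaming.jar -input in -output out -mapper 'python wordcount.py map' -reducer 'python wordcount.py reduce'`; locally, the same logic can be exercised by piping the mapper's sorted output into the reducer.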
Fixed-Price - Expert ($$$) - Est. Budget: $300 - Posted
We are looking for someone to ghostwrite for my client. All relevant material will be provided by his SME; the selected freelancer only has to write a book of approximately 400 pages. Persons with an IT background and book-writing experience will be given preference. Please share your previous work, and along with your proposal, please share your latest CV (mandatory for this job). We are also looking for experienced writers for a book on "R for Data Sciences"; the number of pages will be approximately 400. Payment will be $300 per book. Regards,
Skills: Hadoop Data Science Ghostwriting
Hourly - Expert ($$$) - Est. Time: Less than 1 week, 30+ hrs/week - Posted
Hi, I'm looking for someone with experience writing complex Hive queries/scripts for large (multi-hundred-terabyte) datasets. You should also be experienced in Java for writing Hive UDFs and in Amazon Web Services (EC2); previous work with Qubole is a plus (but not required). To be considered, please share the most challenging large-scale issue you ever faced with Hive, and how you overcame it (be sure to include the node count and approximate disk space required). Thank you.
Skills: Hadoop Amazon Web Services Apache Hive
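The posting above asks for Java UDFs specifically; a quicker-to-sketch equivalent is Hive's TRANSFORM streaming interface, which pipes rows through any external script. The column names (`user_id`, `url`) and the domain-extraction logic below are illustrative assumptions, not the client's actual schema:

```python
import sys
from urllib.parse import urlparse

def transform(lines):
    """Read tab-separated (user_id, url) rows, as Hive's TRANSFORM
    clause delivers them on stdin, and emit (user_id, domain) rows."""
    for line in lines:
        user_id, url = line.rstrip("\n").split("\t")
        yield f"{user_id}\t{urlparse(url).netloc}"

if __name__ == "__main__":
    for row in transform(sys.stdin):
        print(row)
```

Hive would invoke it along the lines of `SELECT TRANSFORM (user_id, url) USING 'python extract_domain.py' AS (user_id, domain) FROM clicks;` (table and column names hypothetical).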
Fixed-Price - Intermediate ($$) - Est. Budget: $2,400 - Posted
1. Open source Big Data platforms (Hadoop, Cloudera, Spark, Hortonworks, Cassandra, etc.)
2. Hadoop ecosystem: HDFS, Hive, Pig, NoSQL, Flume, HBase, Solr, Impala, YARN, Hue, Sqoop, Oozie, MapR, etc.
3. ...
4. Data transformation involving encryption and decryption through RDBMS – Hadoop – RDBMS
5. MapReduce programming experience for:
   a. sorting and consolidating data from multiple sources and formats into a normalized structure
   b. generating reports using complex calculations and aggregations at frequent time intervals
6.
Skills: Hadoop Big Data MapReduce
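Item 5a in the posting above (consolidating data from multiple sources and formats into a normalized structure) usually starts with a per-record normalization step in the mapper. A minimal sketch, assuming two hypothetical input formats (JSON and CSV order records) — the field names are assumptions, not a known customer schema:

```python
import csv
import io
import json

def normalize(record):
    """Map one raw record (a JSON object or a CSV line) onto a single
    normalized (order_id, amount) tuple, regardless of source format."""
    if record.lstrip().startswith("{"):
        obj = json.loads(record)
        return (str(obj["orderId"]), float(obj["amount"]))
    row = next(csv.reader(io.StringIO(record)))
    return (row[0], float(row[1]))

# In a MapReduce job, the mapper would emit these tuples keyed by
# order_id so the reducer can aggregate amounts per order.
```

The point of the sketch is that once every source format maps onto the same key/value shape, the shuffle and reduce phases need no knowledge of where a record came from.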
Hourly - Entry Level ($) - Est. Time: 1 to 3 months, 10-30 hrs/week - Posted
Hi, I am looking for someone to help me with Unix, Oracle SQL, and indexes on my ongoing SuccessFactors project. Please leave your phone number; I will be in touch with you ASAP.