
Hadoop Job Cost Overview

Typical total cost of Upwork Hadoop projects based on completed and fixed-price jobs.

Upwork Hadoop Jobs Completed Quarterly

On average, 13 Hadoop projects are completed every quarter on Upwork.


Time to Complete Upwork Hadoop Jobs

Time needed to complete a Hadoop project on Upwork.

Average Hadoop Freelancer Feedback Score

Hadoop freelancers on Upwork typically receive a client rating of 4.78.

Last updated: October 1, 2015


Muhammad Usman A. Agency Contractor

BI & DWH (Teradata, QlikView, BO, Cognos, MicroStrategy, Tableau, SSRS/SSIS)

Australia - Tests: 4

To provide best-in-class consulting services to clients in the Business Intelligence (BI) and data-warehousing domain. My expertise covers the following tools: Teradata V2R5/12, MicroStrategy (8, 9), QlikView (8, 9), Cognos (7, 8.4), Tableau, Business Objects (XI R2, XI R3), Microsoft SSAS/SSRS/SSIS, SAS, Teradata CRM and Teradata Miner, Microsoft Dynamics CRM, and Pentaho.

Over the past 11 years I have worked for three industry-leading companies in the BI and data-warehousing field, delivering large-scale BI and data-warehousing projects in Europe for clients in the telecom, banking, finance, insurance, airline and manufacturing industries.

I can act in any of the following roles: end-to-end BI and data-warehouse solution implementation expert, BI project concepting, BI semantic-layer design, design and development of BI reports, dashboards and scorecards, project management, business analysis, BI KPI modeling, ETL development, and Agile Scrum Master.

Certifications: Certified Business Intelligence Professional (CBIP), Certified Cognos Modeler 8/8.4, Business Objects Certified Professional (Level 2), Teradata Certified Master V2R5, MicroStrategy Certified Engineer, ITIL Foundation, SAS Base Certified, and Certified Scrum Master (Scrum Alliance).

Associated with: Bilytica private limited

$44.44 /hr
2,399 hours

Ravi Padmaraj

Hadoop developer with experience in Hadoop and web technologies

India - Tests: 2

* I am a Hadoop developer with experience in core Java, Hadoop, web frameworks, machine learning, etc. I have rich programming experience as well as financial-analysis expertise.
* Skilled in all Hadoop ecosystem components: MapReduce, HBase, Hive, Pig, Spark, HCatalog.
* Skilled in web technologies: Spring, Struts, Servlets, JSP, Ext JS, JavaScript, CSS.
* Expertise in the retail, telecom and consumer-electronics domains.
* Expertise in building data-analysis modules in all the above-mentioned domains; experience in the R language.
* Expertise in machine-learning algorithms and frameworks such as Apache Mahout.
* Expertise in NoSQL databases (MongoDB, Cassandra, Neo4j, Titan).
* Expertise in financial-statement analysis and company research.
* Certifications: Cloudera Certified Developer, Cloudera Certified in HBase, Cloudera Certified Admin, IBM BigInsights certified, Big Data University certifications.

100% Job Success
$7.78 /hr
214 hours
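The profile above, like most Hadoop developer listings, puts MapReduce first among the ecosystem components. For readers unfamiliar with what that work looks like, here is a minimal word-count job in Java, essentially the standard introductory Hadoop MapReduce example rather than anything from this freelancer's portfolio; the HDFS input and output paths are placeholders supplied on the command line.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count job: the canonical "hello world" of Hadoop MapReduce.
public class WordCount {

  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      // Emit (word, 1) for every token in the input line.
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      // Sum the counts emitted for each word.
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // combiner reuses the reducer
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Setting the combiner to the reducer class is a common optimization: counts are pre-aggregated on each mapper node before data is shuffled across the network.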

Nag Y

Data Science, Enterprise Application & Predictive Modeling Professional

India - Tests: 2

Nag currently leads the Analytics team. He has executed and delivered several flagship business-intelligence and data-mining projects for large organizations including Alcatel-Lucent, Aricent Software Systems and Cognizant, has handled data-intensive assignments involving customer bases in the millions, and has led several first-in-the-industry projects requiring R&D and algorithms tailored to the domain.

Specialties:
- 8+ years of experience with enterprise applications; hands-on development of multi-tier architectures using Java, SOAP, Hibernate, Spring, design patterns, OOAD and API design.
- Three years on the big data platform using Apache Hadoop and its ecosystem, with expertise in the ingestion, storage, querying, processing and analysis of big data; experienced with Pig, Hive, Sqoop, Oozie, Flume and HBase, including installing and configuring these components alongside Hadoop MapReduce, HDFS and ZooKeeper.
- Good experience with Hive query optimization and performance tuning; hands-on experience writing Pig Latin scripts and custom UDF implementations, and extending Hive and Pig core functionality with custom UDFs.
- Good experience with Sqoop for importing data from different RDBMSs into HDFS and exporting it back to RDBMSs for ad-hoc reporting.
- Experienced with batch-job workflow scheduling and monitoring tools such as Oozie.
- Experienced in analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java, and in writing Apache Spark jobs in Scala, including RDDs and the aggregate functions applied to them to obtain required results.
- Extensive experience in SQL and NoSQL development; excellent knowledge of implementing Hadoop on AWS EC2 instances to gather and analyze log files; worked on Hive/HBase versus RDBMS trade-offs, importing data into Hive, and building partitions, indexes, views, queries and reports for BI data analysis.
- Developed data pipelines using Flume, Sqoop, Pig and Java MapReduce to ingest customer behavioral data and benefits histories into HDFS for analysis; currently working as an Implementation Analyst configuring benefits and events from BRD documents and processing them.
- Big data analytics: storing and mining large, complex data; customer scoring for propensities such as likelihood of churn or sale; product analytics (price elasticity of demand, market baskets); web analytics (e-commerce and digital-marketing optimization); text mining (converting text into structured data, extracting semantics); transforming academic research into industry applications; building an analytics layer over existing ERP and CRM applications.
- Worked as technical team lead and offshore coordinator, leading multiple projects to timely delivery; plans and coordinates teams across multiple IT projects, managing timelines and budgets by analyzing, designing and implementing cost-effective solutions; comprehensive problem-solving abilities, excellent verbal and written communication skills, a willingness to learn and a team-facilitator mindset; able to think outside the box, make critical decisions and work effectively under pressure with changing priorities and deadlines; provides direction for development, migration and integration activities and sees projects through from inception to deployment.

Web development: involved in all phases of the Software Development Life Cycle (SDLC): requirements gathering, data modeling, analysis, architecture design and development. Designed UIs using JSP and HTML, validated with JavaScript, to provide the user interface and client-server communication; worked on XML schemas and SOAP messages; developed business logic in business-service classes. Web development with Ruby on Rails: worked with gems for authentication and building role-based systems; integrated third-party APIs such as Twitter, LinkedIn, Facebook and Instagram; integrated payment systems such as PayPal, Authorize.Net, Heartland and Braintree; experienced in deploying Rails apps to VPS providers such as Linode, DigitalOcean, Amazon EC2 and Rackspace using Apache Passenger and nginx; wrote deployment scripts and cron schedulers; worked with Redis and Elasticsearch.

$30.00 /hr
0 hours
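Among the skills in the profile above is extending Hive and Pig with custom UDFs. As an illustration of scale, a Java UDF for Hive can be as small as the sketch below; the class and function names are invented for the example, not taken from this freelancer's work.

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// A simple Hive UDF that lower-cases a string column.
@Description(name = "to_lower", value = "_FUNC_(str) - returns str in lower case")
public final class ToLowerUDF extends UDF {
  public Text evaluate(Text input) {
    if (input == null) {
      return null; // propagate SQL NULL
    }
    return new Text(input.toString().toLowerCase());
  }
}

Once compiled into a JAR, such a function is registered in a Hive session with ADD JAR and CREATE TEMPORARY FUNCTION, after which it can be called from HiveQL like any built-in.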

Alexei Kashirin Agency Contractor

Distributed Computing

Israel - Tests: 12 - Portfolio: 4

I have worked mainly with PHP and Zend Framework; a project of my own, developed with PHP and Zend Framework, is listed in the users section. Having taken PHP's capabilities as far as they go, I moved to working mainly with Python and developed my own fast, reliable Python-based web framework and web server, built primarily to work with the Hypertable database. Its web-serving performance benchmarks at the network's technical data-transfer limit. The Python infrastructure also includes an FTP server and a DNS server that can be configured for any need, such as load-balanced IP addresses. My main interest is database architecture and developing it to an operative state.

Associated with: PhotoShock

100% Job Success
$33.33 /hr
3,110 hours

Sachin T.

Sr. Java Hadoop Programmer / Analyst

United States - Tests: 2

SUMMARY:
• 14+ years of experience in the design, development, testing and support of high-performance, multi-threaded client-server, web-based and n-tier applications in Java and C++.
• 6+ years of exposure to fixed income, equities and derivatives, with 3+ years of experience in credit risk, market risk, operational risk, FIX, CCAR reporting, VaR and stress testing.
• 8+ years of server-side experience using Core Java, multithreading, Spring, XML, JDBC, JMS, web services, JUnit, Mockito, J2EE, JSP, Servlets, HTML5, JavaScript, jQuery and FreeMarker templates.
• 8+ years of experience with Sybase SQL, Oracle, MS SQL Server, shell, Perl, Windows and Linux.
• 8+ years of experience gathering requirements and translating them into functional as well as technical specifications.
• 8+ years of experience working with, as well as leading, offshore teams.
• Exposure to big data technologies such as Hadoop, Pig, Hive, Spark, Oozie and HBase.
• Strong experience in all phases of the Software Development Life Cycle (SDLC), object-oriented programming, OOD, UML concepts, J2EE design patterns and test-driven design.
• Hands-on experience with Tomcat, WebSphere, WebLogic Application Server, Autosys and Microsoft Excel.
• Excellent analytical and problem-solving skills, and good exposure to mathematical concepts in finance.
• Ability to quickly master new concepts, applications, technologies and vendor products.
• Strong communication and interpersonal skills, with experience in technical writing and documentation.
• Strongly committed to delivering timely, accurate and quality work.

$30.00 /hr
0 hours
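The summary above names JUnit and Mockito alongside test-driven design. The sketch below shows the style of unit test those tools enable: a collaborator is mocked so the class under test can be verified in isolation. Both PriceFeed and QuoteService are hypothetical stand-ins invented for this example, loosely echoing the profile's finance background.

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class QuoteServiceTest {

  // Hypothetical collaborator: fetches the last price for a ticker symbol.
  interface PriceFeed {
    double lastPrice(String symbol);
  }

  // Hypothetical unit under test: applies a fixed spread to the feed price.
  static class QuoteService {
    private final PriceFeed feed;
    QuoteService(PriceFeed feed) { this.feed = feed; }
    double quote(String symbol, double spread) {
      return feed.lastPrice(symbol) + spread;
    }
  }

  @Test
  public void quoteAddsSpreadToFeedPrice() {
    PriceFeed feed = mock(PriceFeed.class);       // Mockito stub, no real feed
    when(feed.lastPrice("IBM")).thenReturn(100.0);
    QuoteService service = new QuoteService(feed);
    assertEquals(100.25, service.quote("IBM", 0.25), 1e-9);
  }
}

Mocking the feed keeps the test fast and deterministic, which matters in the kind of market-data code this profile describes.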

Sovmiya R

System Admin, Hadoop Admin, Java Learner

India - Tests: 3 - Portfolio: 2

Around 15 years of experience in the IT industry, covering project management, team management, problem management, solution design, system administration, network administration, systems planning, and the coordination and commissioning of IT infrastructure. Core expertise is in new-infrastructure implementation and project maintenance across sectors such as telecom, data warehousing and finance, on a technical platform spanning several flavors of UNIX administration. My responsibilities have included:
> Installation and configuration of UNIX (Linux, AIX) servers
> Implementation of hardware and software RAID (e.g., SAN/NAS, LVM, cluster manager)
> Technical account management for clients and delivery of training
> Assisting customers with product integrations and giving technical presentations
> Preparation of installation dockets for assigned installation projects
Principal certifications: Sun Certified System Administrator (SCSA) Solaris 10; Sun Certified Network Administrator (SCNA); Sun Certified Security Administrator; MySQL Certified DBA 5.0; Sun Certified Cluster Administration 3.2; Foundation Certificate in ITSM/ITIL; Linux Administration (General), Brainbench; IBM Certified pSeries Support & AIX 5L v5.2 Administration; Cisco Certified Network Associate (CCNA); Microsoft Certified Professional (MCP).
Good knowledge of the Hadoop ecosystem, HDFS and Hadoop clusters against a UNIX background; I have personally set up a Hadoop environment and am self-practicing big data technologies.
• Experienced in installing Apache Hadoop and implementing Hadoop ecosystem projects.
• Hadoop cluster configuration and deployment, integrating with different system hardware.
• Commissioning and decommissioning of nodes; managing Hadoop services such as NameNode, DataNode, JobTracker and TaskTracker; configured core-site.xml, hdfs-site.xml and mapred-site.xml.
• Working knowledge of Pig, Hive, MySQL and HBase.
• Experienced in loading data into the cluster from dynamically generated files using Flume, as well as from the local file system.
• Troubleshooting, diagnosing, tuning and resolving Hadoop issues.

$5.00 /hr
0 hours
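The profile above mentions loading data from the local file system into a Hadoop cluster (alongside Flume for dynamically generated files). Below is a minimal sketch of that task in Java using the HDFS FileSystem API; the NameNode address and both paths are placeholders, and on a configured cluster fs.defaultFS would normally come from core-site.xml, one of the files the profile mentions configuring.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Copies a local file into HDFS using the FileSystem API.
public class HdfsLoad {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // fs.defaultFS normally comes from core-site.xml; set explicitly here
    // for illustration (host and port are placeholders).
    conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

    FileSystem fs = FileSystem.get(conf);
    Path local = new Path("/tmp/events.log");            // local source (example)
    Path remote = new Path("/data/incoming/events.log"); // HDFS destination (example)
    fs.copyFromLocalFile(local, remote);
    System.out.println("Copied to " + fs.makeQualified(remote));
    fs.close();
  }
}

An administrator would more often use the equivalent shell command (hdfs dfs -put), but the API version shows what such tooling does under the hood.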