Hire the best Hadoop developers & Programmers

Check out Hadoop developers & Programmers with the skills you need for your next job.
Clients rate Hadoop developers & Programmers 4.8/5, based on 266 client reviews.
  • $40 hourly
    I am a developer focused on providing highly efficient software solutions. - Full Stack Developer - Data Scientist
    Hadoop
    Apache Spark
    Cloudera
    CakePHP
    Apache HBase
    Apache Hadoop
    Laravel
    Python
    PHP
    MongoDB
    JavaScript
  • $38 hourly
    💡 If you want to turn data into actionable insights, plan to apply the 5 V's of big data, or want to turn your idea into a complete web product... I can help.
    👋 Hi. My name is Prashant and I'm a Computer Engineer. 💡 My true passion is creating robust, scalable, and cost-effective solutions, mainly using Java and open-source technologies.
    💡 During the last 11 years, I have worked with:
    💽 Big Data: Apache Spark, Hadoop, HBase, Hive, Impala, Flume, Sqoop
    🔍 Searching: ElasticSearch, Logstash, Kibana, Lucene, Apache Solr, Filebeat, Winlogbeat
    ☁️ Cloud services: AWS EMR, AWS S3, AWS EC2, AWS RDS, AWS ElasticSearch, AWS Lambda, AWS Redshift
    5-step approach 👣 Requirements Discussion + Prototyping + Visual Design + Backend Development + Support = Success! Usually, we customize that process depending on the project's needs and final goals.
    How to start? 🏁 Every product requires a clear roadmap and meaningful discussion to keep everything in check. But first, we need to understand your needs. Let's talk!
    💯 Working with me, you will receive a modern, good-looking application that meets all guidelines, with easy navigation, and of course unlimited revisions until you are 100% satisfied with the result.
    Keywords that you can use to find me: Java Developer, ElasticSearch Developer, Big Data Developer, Team Lead for Big Data applications, Corporate, IT, Tech, Technology.
    Hadoop
    Big Data
    ETL
    Data Visualization
    Amazon Web Services
    SQL
    Amazon EC2
    ETL Pipeline
    Data Integration
    Data Migration
    Logstash
    Apache Kafka
    Elasticsearch
    Apache Hadoop
    Apache Spark
    Core Java
  • $21 hourly
    As a Full-Stack Developer with 15 years of experience in related fields, I specialize in front-end development and social media marketing, including Amazon, Google Ads, Facebook, Instagram, eBay, etc. I'm a highly motivated, goal-oriented, hands-on senior software engineer with extensive technical skills in hosting server deployment, malware removal, code debugging, and designing premium-quality websites with WordPress, Shopify, CodeIgniter, Bootstrap, Foundation, HTML5/CSS3/PHP5/MySQL, Stripe, PayPal, and Intercom.io. I have worked on very large projects and startups and completed them successfully, with good feedback scores and recommendations. I have developed many applications from scratch and also fixed some messed-up projects. I always like to accept new technological challenges.
    My skill sets include:
    Frontend: JavaScript, jQuery, AJAX, gulp.js, Node.js, Apache Cordova (formerly PhoneGap); HTML5; CSS3; AMP (Accelerated Mobile Pages); adaptive layouts/frameworks (Bootstrap, Foundation, email templates)
    Backend: PHP; Node.js
    Databases: MySQL
    PHP frameworks: Laravel; CodeIgniter; Zend Framework 2; CakePHP
    PHP CMS: WordPress; Magento; Joomla
    Social media management and setup: Google Ads, Google Tag Manager; Facebook, Instagram; Amazon A+, Ads, content writing; eBay
    Graphics: Photoshop; Illustrator; GIMP; Fireworks
    APIs: Facebook Graph API; Twitter API; social media APIs; Intercom.io; Mandrill; AWS
    Payment gateways: PayPal/Braintree/PayPal Connect; Stripe.js/Stripe Connect; Authorize.Net; Recurly.js; custom bank payment gateways
    Regards, Dharminder
    Hadoop
    Ad Server
    Google
    PayPal
    Shopify Templates
    AngularJS
    Web Hosting
    Core PHP
    Liquid
    Design Enhancement
    Digital Marketing
    ETL
    Virtual Private Server
  • $120 hourly
    I have over 12 years of experience, of which about 8 years were spent working with different big data technologies (Hadoop, Spark); the remaining time I mostly wrote Python scrapers, scripts, and API services, and built iOS applications using Objective-C.
    - Experience building data pipelines that process petabyte-scale data and optimising them for cost and performance
    - Experience fine-tuning Spark jobs to the most optimal level, thereby cutting infrastructure costs by 50-80%
    - Experience building data lakes for major e-commerce and fintech companies
    - Worked at different startups throughout my career; highly adaptable to different working methodologies like Agile and Kanban
    Hadoop
    Apache Spark
    Big Data
    Apache Hadoop
    PySpark
    Scala
    Python
  • $40 hourly
    I have 6+ years of experience across software engineering, AI development, ML engineering, web development, data science, scriptwriting, and automation.
    Programming languages: Java, Python, Clojure, JavaScript, Kotlin, HTML, CSS, SQL
    • AI & ML expertise: NLP, GANs, recommendation systems, object recognition
    • AI & ML tools: Pandas, TensorFlow, PyTorch, NumPy
    • Python: Django, Flask, REST Framework
    • JavaScript: TypeScript, React, Redux, Node.js
    • Kotlin: MVVM design, Android Studio, dependency injection
    • Java: Spring, JavaFX
    • Other tools: Spark, Git, Linux, Bash
    I:
    ‣ Don't give up until the client is satisfied
    ‣ Complete everything before the deadline
    ‣ Respond quickly
    ‣ Am easy to communicate with
    I am at your service, so feel free to contact me any time; I will be happy to help you out. Thanks & Regards, Joy Longawis
    Hadoop
    DevOps
    UX & UI
    App Development
    Web Development
    Web Scraping
    API
    Artificial Intelligence
    JavaScript
    Neural Network
    Data Science
    AI Development
    Machine Learning
    Natural Language Processing
    Java
    Python
  • $10 hourly
    Technical Experience
    * Hands-on experience in the Hadoop ecosystem, including Hive, Sqoop, MapReduce, and the basics of Kafka
    * Excellent knowledge of Hadoop components such as HDFS, ResourceManager, NodeManager, NameNode, DataNode, and the MapReduce programming paradigm
    * Expertise in managing big data processing using Apache Spark and its various components
    * Loading and transforming large sets of structured, semi-structured, and unstructured data between relational database systems and HDFS using the Sqoop tool
    * Data ingestion and refresh from RDBMS to HDFS using Apache Sqoop, and processing data through Spark Core and Spark SQL
    * Proficiency in Scala and PySpark for high-level data processing, with end-to-end knowledge of project implementation
    * Designing and creating Hive external tables, using a shared metastore instead of Derby, and creating partitions and bucketing
    Hadoop
    Amazon Web Services
    Visualization
    Apache Spark
    Apache Kafka
    Apache Hive
    SQL
    Apache Hadoop
  • $55 hourly
    My name is Aman, and I am from North India. I have a master's degree in computer science and over 15 years of IT experience, including 7 years as a Senior Technical Architect handling onsite and offshore project teams.
    📞 Invite me to your job if you would like a FREE 15-minute initial consultation! 📞
    I believe in building products that are valuable for end users and make a positive difference for the product owners. Therefore, I put great emphasis on the product planning and design phases to select the right tools, technologies, and architecture.
    🚀 The areas of my tech expertise are:
    🔶 Frontend development
    🔶 Backend development
    🔶 Application architecture
    🔶 DevOps
    🔶 API integration
    🔶 Custom API development
    🔶 Heading up your engineering team
    🔶 Enterprise-level software application development
    🔶 Database design and architecture
    🔶 Technical documentation
    🔶 Advanced knowledge of CI/CD, Docker, and Jenkins
    🔶 Good knowledge of Git, GitLab, Jira, Trello, Monday.com, and ClickUp
    🚀 Key skills:
    ✔️ Strategic leadership: creating and implementing technology strategies that align with organisational goals and promote innovation
    ✔️ Team management: leading and mentoring high-performing teams to build a culture of collaboration, creativity, and continual progress
    ✔️ Technology stack mastery: extensive knowledge of PHP/Laravel, Ruby on Rails, Java, Spring Boot, Python, MEAN stack, MERN stack, Angular, React.js, Node.js, Nest.js, Redux, Next.js, Nuxt.js, Strapi, WordPress, big data, machine learning, AI, ChatGPT, RPA, ServiceNow, and DevOps
    ✔️ Product development: successfully taking products from concept to market while guaranteeing seamless integration of features and functionality
    ✔️ Agile methodologies: using Agile and DevOps approaches to improve development efficiency and time-to-market
    Buzz words: React Native Developer, Website Development, Website Developer, JavaScript, PHP, Amazon Web Services, API, Database Architecture, API Integration, CTO, Custom PHP, Hybrid Mobile Application Developer, PostgreSQL, Web Programming, Web Application, CRM, Web Design, Laravel Developer, PHP Developer, Web Developer, Web Dev, PHP Development, React.js Dev, RoR Dev, Python Dev, Frontend Dev, Backend Dev, WordPress Dev, .NET Dev, React Dev, Spree Dev, E-commerce Dev, SaaS App Dev, ERP Dev, Salesforce Dev, Machine Learning Dev, AI Dev, ChatGPT Dev, CMS Dev, iOS Dev, Kotlin Dev, Android Dev, IoT Dev, Flutter Dev, Software Architecture, AWS, Azure, Google Cloud, DevOps, and SEO.
    Feel free to message me for more details about my experience relevant to your specific project. I am always open to new opportunities! 📞 Reach out by inviting me to your project right now! 📞 Looking forward to working with you on amazing projects. Best regards, Aman, CTO/Full-Stack Developer
    Hadoop
    Fortune 500 Company
    Machine Learning
    Software Architecture & Design
    Technical Project Management
    Ruby on Rails
    Data Science
    Database Design
    Spring Boot
    Full-Stack Development
    DevOps
    PHP
    Core Java
    Software Development
    SaaS
    Chief Architect
  • $25 hourly
    • Certification in Big Data/Hadoop ecosystem
    • Big data environments: Google Cloud Platform, Cloudera, Hortonworks, AWS, Snowflake, Databricks, DC/OS
    • Big data tools: Apache Hadoop, Apache Spark, Apache Kafka, Apache NiFi, Apache Cassandra, YARN/Mesos, Oozie, Sqoop, Airflow, Glue, Athena, S3 buckets, Lambda, Redshift, DynamoDB, Delta Lake, Docker, Git, Bash scripts, Jenkins, Postgres, MongoDB, Elasticsearch, Kibana, Ignite, TiDB
    • Certifications in SQL Server, database development, and Crystal Reports
    • SQL Server tools: SQL Management Studio, BIDS, SSIS, SSAS, and SSRS
    • BI/dashboarding tools: Power BI, Tableau, Kibana
    • Big data development programming languages: Scala and Python
    Big Data Engineer
    • Hands-on experience with Google Cloud Platform, BigQuery, Google Data Studio, and Flow
    • Developing ETL pipelines for SQL Server using SSIS
    • Reporting and analysis using SSIS, SSRS, and SSAS cubes
    • Extensive experience with big data frameworks and open-source technologies (Apache NiFi, Kafka, Spark, Cassandra, HDFS, Hive, Docker, Postgres, Git, Bash scripts, Jenkins, MongoDB, Elasticsearch, Ignite, TiDB)
    • Managing data warehouse and big data cluster services and development of data flows
    • Writing big data/Spark ETL applications over different sources (SQL, Oracle, CSV, XML, JSON) to support analytics for different departments
    • Extensive work with Hive, Hadoop, Spark, Docker, and Apache NiFi
    • Built multiple end-to-end fraud-monitoring alert systems
    • Preferred languages: Scala and Python
    Big Data Engineer – Fraud Management at VEON
    • Developed an ETL pipeline from Kafka to Cassandra using Spark in Scala
    • Used big data tools on Hortonworks and AWS (Apache NiFi, Kafka, Spark, Cassandra, Elasticsearch)
    • Dashboard development in Tableau and Kibana
    • Writing complex SQL Server queries, procedures, and functions
    • Developing ETL pipelines for SQL Server using SSIS
    • Reporting and analysis using SSIS, SSRS, and SSAS cubes
    • Developing and designing automated email reports
    • Offline data analytics for fraud detection and setting up prevention controls
    • SQL database development
    • System support for fraud management
    Hadoop
    Google Cloud Platform
    SQL Programming
    Data Warehousing
    Database
    AWS Glue
    PySpark
    MongoDB
    Python Script
    Docker
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Apache Kafka
    Apache Hive
  • $110 hourly
    Distributed computing: Apache Spark, Flink, Beam, Hadoop, Dask
    Cloud computing: GCP (BigQuery, Dataproc, GFS, Dataflow, Pub/Sub), AWS EMR/EC2
    Containerization tools: Docker, Kubernetes
    Databases: MongoDB, Postgres-XL, PostgreSQL
    Languages: Java, Python, C/C++
    Hadoop
    MapReduce
    Apache Kafka
    Cloud Computing
    Apache Hadoop
    White Paper Writing
    Academic Writing
    Google Cloud Platform
    Dask
    Apache Spark
    Research Paper Writing
    Apache Flink
    Kubernetes
    Python
    Java
  • $30 hourly
    Innovative, passionate, and quick software developer and architect with deep knowledge of programming concepts and internals. 15+ years of hands-on industry experience in programming, designing, managing, and leading software projects and companies. Change agent and problem solver with a passion for technology; skilled in grasping the big picture, conceptualizing, and developing and implementing solutions by partnering closely with business leaders. Have worked in many types of industries, including healthcare informatics, lab informatics, MOOCs, online education, ERPs, Electronic Design Automation (EDA), semiconductors, heavy mechanical manufacturing, travel, pharmaceuticals, and e-commerce.
    Skilled in:
    Python | PHP | Apex | Objective-C | Java | C/C++ | Flex | ActionScript | JavaScript | Perl | VB6
    iPhone SDK | Android SDK | BlackBerry SDK | Flex SDK | Sencha
    Django | Django REST | Odoo | Web2Py | Zope | Plone | edX | CodeIgniter | .NET | Java EE
    SQL Server | MySQL | PostgreSQL | Sybase | DB2 | SQLite | MS Access
    Odoo | OpenERP | Salesforce CRM | MS SharePoint
    git | SVN | CVS | VSS | Unfuddle
    Scrum | UP | UML | CASE Tools | Poseidon | Rumbaugh OMT
    ASP.NET | ASP | Telerik
    Visual C++ | MFC | Win32 | COM | DCOM | OLE | Multithreading
    ANT | JDK | Digester | Struts | Servlets | JSP | EJB | WebSphere | Eclipse
    Web 2.0 | AJAX | CSS | XML | XSD
    Open Source | GitHub
    Knowledge Management | MediaWiki | PHP
    Healthcare Informatics | ICD-9 | ICD-10 | HIPAA | HL7
    SEMI Standards | EFEM | SEMI E87 | SEMI E90 | SEMI E40 | Robotics | SECS/GEM | SCADA
    Customs ERP | Landing Cost | HS Codes | WTO Standards
    Lab Informatics | Bika | OLiMS | LIMS | LIS
    Windows | macOS | Linux | Ubuntu | Unix | EC2
    Windows CE | Microcontroller Programming | Dynamic C
    Hadoop
    Ehealth
    Mapbox
    Apache Hadoop
    LIMS
    Android
    iOS
    Odoo
    web2py
    Django
    Laravel
    Python
    PHP
    WordPress
    React
    JavaScript
  • $30 hourly
    With close to 10 years of industry experience in the specialised field of software design and development, I possess proven capabilities to develop high-quality software applications. My aim is to obtain a challenging position that will utilise my skills and experience while providing the opportunity for growth and advancement.
    Languages: Java, Python, JavaScript
    Skills:
    • Core: data structures and algorithms
    • Data analysis: Hadoop MapReduce
    • Backend: Java, Spring, Spring Boot, microservices, Struts, design principles, design patterns, SQL, web services, SOA (REST and SOAP), JMS, Servlets, Swing, JSP, Maven, version control (SVN, Git), Jenkins
    • Frontend: HTML5, CSS3, JavaScript, jQuery, Bootstrap, React.js
    • IDEs/tools: Atom, Notepad++, Brackets, Eclipse, NetBeans, Excel, RapidSQL, SQuirreL, PyCharm
    • Databases: Oracle, DB2, MySQL, PostgreSQL
    Achievements:
    • Won Infosys' Quarterly Manufacturing Unit Level award for outstanding performance in Q4 2010
    • Won Royal Bank of Scotland's monthly award for outstanding performance between Aug '14 and July '15, a certificate recognizing commitment, hard work, and continued contribution to the business
    • Won Royal Bank of Scotland's Star Team of the Month award for supporting colleagues and making a positive contribution to the business
    Projects:
    1. User interface development
    2. Enterprise application development
    3. Website development
    4. Desktop software development
    5. Peer-to-peer application development
    6. Web services
    English exams: Pearson Test of English (PTE) Academic, overall score 76 with 90/90 in writing; IELTS General, overall band 7 with band 8.5 in listening
    Hadoop
    Big Data
    MapReduce
    API
    Database
    Spring Framework
    CSS
    Apache Tomcat
    Spring Boot
    Microservice
    Apache Hadoop
    Java
    Python
    JavaScript
  • $40 hourly
    Successful delivery of 10+ complex client-facing projects, with exposure to the telecom, retail, automobile, and banking industries; focused on data, analytics, and developing the right analytical and consulting skills to deliver in any challenging environment. Strong track record in data engineering, with hands-on experience successfully delivering challenging implementations. I offer data services and implementation to set up data warehouses and data solutions for analytics and development in retail, telecom, fintech, automobile, etc. I am a software and data developer; I earned a bachelor's degree in computer science and have 10+ years of experience in data engineering and cloud infrastructure.
    Tech stack:
    * Snowflake (certified)
    * Teradata (certified)
    * Informatica (certified)
    * WhereScape RED
    * Airflow
    * AWS Athena and EC2
    * Python, Pandas & NumPy
    * Data warehousing (certified)
    * Data scraping, data mining
    * Data modeling
    * Netezza, DB2
    * Oracle PL/SQL
    * C# .NET
    * Automation
    * SQL & NoSQL databases
    Hadoop
    PDF Conversion
    Web Crawling
    Data Integration
    Data Vault
    Python
    Informatica
    API
    Snowflake
    Data Warehousing
    Database Management
    ETL Pipeline
    Apache Airflow
    MySQL
  • $35 hourly
    5+ years of experience in big data technologies like Spark, Hadoop, Hive, Sqoop, ADF, and Databricks. 5+ years of experience in the ELK Stack (Elasticsearch, Logstash, and Kibana). Microsoft Azure Certified Data Engineer. Elasticsearch and Kibana certified. MongoDB Certified Developer.
    Hadoop
    Microsoft Azure
    Databricks Platform
    Apache Spark
    PySpark
    MongoDB
    Logstash
    Elasticsearch
    Grok Framework
    ELK Stack
    Apache Hadoop
    Hive
    Bash
    SQL
    Kibana
  • $50 hourly
    Development experience in information management solutions, ETL processes, database design, and storage systems; responsible, able to work and solve problems independently.
    Software Developer, Integration Process Architect, Envion Software: created a Hadoop cluster system to process heterogeneous data (ETL, Hadoop cluster, RDF/SPARQL, NoSQL DB, IBM DashDB); ETL processes for large volumes of data; data warehouse creation and support.
    Database Developer and Data Scientist, a software development company: programming, analytics, stream processing.
    Associate Professor, Saint Petersburg State University: member of the Database and Information Management Research Group.
    Hadoop
    Java
    DataTables
    Data Management
    Apache Spark
    Apache Hadoop
    Pentaho
    BigQuery
    Apache Airflow
    ETL Pipeline
    Python
    SQL
    Scala
    ETL
  • $100 hourly
    I have over 4 years of experience in data engineering, especially using Spark and PySpark to gain value from massive amounts of data. I have worked with analysts and data scientists, conducting workshops on working in Hadoop/Spark and resolving their issues with the big data ecosystem. I also have experience in Hadoop maintenance and building ETL, especially between Hadoop and Kafka. You can find my profile on Stack Overflow (link in the Portfolio section), where I help mostly with spark- and pyspark-tagged questions.
    Hadoop
    MongoDB
    Data Warehousing
    Data Scraping
    ETL
    Data Visualization
    PySpark
    Python
    Data Migration
    Apache Airflow
    Apache Spark
    Apache Kafka
    Apache Hadoop
  • $30 hourly
    Seasoned data engineer with over 11 years of experience building sophisticated and reliable ETL applications using big data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using big data technologies for the past 9 years. 🏆
    The main areas of expertise are:
    📍 Big data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
    📍 AWS cloud services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR
    📍 Azure cloud services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL
    📍 Google cloud services - GCP Dataproc
    📍 Search engine - Apache Solr
    📍 NoSQL - HBase, Cassandra, MongoDB
    📍 Platform - data warehousing, data lakes
    📍 Visualization - Power BI
    📍 Distributions - Cloudera
    📍 DevOps - Jenkins
    📍 Accelerators - data quality, data curation, data catalog
    Hadoop
    SQL
    AWS Glue
    PySpark
    Apache Cassandra
    ETL Pipeline
    Apache Hive
    Apache NiFi
    Apache Kafka
    Big Data
    Apache Hadoop
    Scala
    Apache Spark
  • $65 hourly
    A Full-Stack Developer experienced with Java, JavaScript, Hadoop, C/C++, Solidity, and Jasper Reports. Experienced with Solidity smart contracts and integrating DApps with different blockchain networks; also experienced with React and Express.js. Experienced with Java for Spring MVC and with big data using Hadoop and Spark. Experienced with report writing using Jaspersoft Studio.
    Hadoop
    Chatbot Development
    Dialogflow API
    Python
    ChatGPT
    API Development
    Hibernate
    Apache Hadoop
    Node.js
    React Native
    Solidity
    Java
    JavaScript
    React
  • $40 hourly
    🔍🚀 Welcome to a world of data-driven excellence! 🌐📊 Greetings, fellow professionals! I am thrilled to introduce myself as a dedicated Data Consultant / Engineer, leveraging years of honed expertise across a diverse spectrum of data stacks 🌍. My journey has been enriched by a wealth of experience, empowering me with a comprehensive skill set that spans Warehousing📦, ETL⚙, Analytics📈, and Cloud Services☁. Having earned the esteemed title of GCP Certified Professional Data Engineer 🛠, I am your partner in navigating the complex data landscape. My mission is to unearth actionable insights from raw data, shaping it into a strategic asset that fuels growth and innovation. With a deep-rooted passion for transforming data into valuable solutions, I am committed to crafting intelligent strategies that empower businesses to flourish. Let's embark on a collaborative journey to unlock the full potential of your data. Whether it's architecting robust data pipelines ⛓, optimizing storage solutions 🗃, or designing analytics frameworks 📊, I am dedicated to delivering excellence that transcends expectations. Reach out to me, and together, let's sculpt a future where data powers success. Thanks!
    Hadoop
    PySpark
    Machine Learning
    Natural Language Processing
    Informatica
    Data Science
    Data Warehousing
    Snowflake
    Data Analysis
    Big Data
    BigQuery
    ETL
    Apache Airflow
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Python
    Apache Hive
  • $80 hourly
    A backend software engineer with more than 6 years of experience. I have worked with large-scale backend/distributed systems and big data systems. Also a DevOps engineer with 4 years of experience, both on-premises and on AWS; experienced with K8s, Terraform, Ansible, and CI/CD. Currently working in a Principal Engineer / Solution Architect role.
    Hadoop
    Architectural Design
    GraphQL
    Serverless Computing
    Amazon Web Services
    DevOps
    API Development
    Elasticsearch
    Apache Kafka
    Scala
    Apache Spark
    Docker
    Apache Hadoop
    Kubernetes
  • $35 hourly
    I am a data engineering expert with more than 5 years of experience in data ingestion, integration, and manipulation. To date, I have completed many projects in data engineering and big data. I have worked on business analytics and telco analytics, using multiple data platforms and frameworks such as Cloudera Data Platform, NiFi, RStudio, Spark, Hadoop, Kafka... If this is what you need, get in touch with me.
    Hadoop
    Cloud Engineering
    Cloudera
    Apache Hadoop
    Data Warehousing
    Apache NiFi
    Linux
    Apache Spark
    Data Lake
    Data Analysis
    SQL
    Big Data
    Business Intelligence
    Scala
    Apache Hive
    Python
  • $48 hourly
    With 51 jobs completed, $200K earned, and a stellar 4.8/5 rating on Upwork, I bring a significant value proposition to your project. My experience represents countless hours spent mastering skills and solving complex problems, ensuring you don't have to navigate these challenges yourself. Hire me if you: ✅ Want a SWE with strong technical skills ✅ Need a Go, Rust, Python, or a Scala developer ✅ Want someone to technically lead a team of 10+ developers easily ✅ Desire a detail-oriented person who asks questions and figures out things on his own ✅ Even have a requirement in your mind but are not able to craft it into a technical format ✅ Want advice on what tools or tech you want to implement in your next big project ✅ Are stuck in a data modeling problem and need a solution architect ✅ Want to optimize a data pipeline ✅ Seek to leverage AI for predictive analytics, enhancing data-driven decision-making ✅ Require AI-based optimization of existing software for efficiency and scalability ✅ Wish to integrate AI and machine learning models to automate tasks and processes ✅ Need expert guidance in selecting and implementing the right AI technologies for your project Don't hire me if you: ❌ Have a project that needs to be done on a very tiny budget ❌ Require work in any language other than Go, Rust, Python, or Scala About me: ⭐️ A data engineer with proven experience in designing and implementing big data solutions ⭐️ A Go developer specialized in creating microservices ⭐️ Will optimize your code in every single commit without even mentioning or charging extra hours ⭐️ Diverse experience with start-ups and enterprises taught me how to work under pressure yet work professionally ⭐️ Skilled in integrating AI technologies to solve complex problems, improve efficiency, and innovate within projects
    Hadoop
    Web Scraping
    Microservice
    ETL Pipeline
    Big Data
    Apache Spark
    AI Bot
    OpenAI API
    Artificial Intelligence
    Generative AI
    Large Language Model
    Golang
    Python
  • $40 hourly
    Hi, I am Isha Taneja from Mohali, India, highly skilled in data analytics, engineering, and cloud computing. I am an expert in creating ETL data flows in Talend Studio, Databricks, and Python, using the best design patterns and practices to integrate data from multiple data sources. I have worked on multiple projects requiring data migration, data warehouse development, and API integration.
    Expertise:
    1. Migration - platform migration (legacy ETL to modern data pipelines / Talend / ERP migration / CRM migration); data migration (Salesforce migration / HubSpot migration / cloud migration / ERP migration)
    2. Data analytics - data lake consulting; data warehouse consulting; data modelling / data integration / data governance / ETL; data strategy; data compliance / data deduplication / data reconciliation / customized data-processing frameworks / data streaming / API implementation / DataOps; business intelligence; digital marketing analysis / e-commerce analytics / ERP reporting capabilities; big data; lakehouse implementation
    3. Software QA & testing
    4. Custom application development - UI/UX; frontend development; backend development
    5. Cloud - cloud-native services / AWS consulting / cloud migration / Azure consulting / Databricks / Salesforce
    6. Business process automation - bi-directional sync between applications / RPA
    A data professional and ETL developer with 10+ years of experience working with enterprises and clients globally to define their implementation approach with the right data platform strategy, data analytics, and business intelligence solutions. My domain expertise lies in e-commerce, healthcare, HR, media & advertising, and digital marketing.
    You have data? Great! I can help you analyze it using Python, including exploratory data analysis, hypothesis testing, and data visualization. You have big data? Even better! I can help you clean, transform, store, and analyze it using big data technologies, and productionize it using cloud services like AWS and Azure. You want to track business KPIs and metrics? No problem! I can develop reports using Tableau and Power BI to keep you ahead in your business.
    Specialities:
    Databases: Snowflake, Postgres, DynamoDB, graph DB (Neo4j), MongoDB, data warehouse concepts, MSSQL
    ETL tools: Talend Data Integration Suite, Matillion, Informatica, Databricks
    API integration: Salesforce / Google AdWords / Google Analytics / Marketo / Amazon MWS (Seller Central) / Shopify / HubSpot / Freshdesk / Xero
    Programming: Java, SQL, HTML, Unix, Python, Node.js, React.js
    Reporting tools: Yellowfin BI, Tableau, Power BI, SAP BO, Sisense, Google Data Studio
    AWS platform: S3, AWS Lambda, AWS Batch, ECS, EC2, Athena, AWS Glue, AWS Step Functions; Azure cloud platform
    Other tools: Airflow
    Expect integrity, excellent communication in English, technical proficiency, and long-term support.
    Hadoop
    Databricks MLflow
    Databricks Platform
    Tableau
    Microsoft Power BI
    Data Extraction
    Talend Data Integration
    Data Analysis
    Microsoft Azure
    Continuous Integration
    AWS Lambda
    API
    Database
    Python
    SQL
    ETL
  • $25 hourly
    Greetings! I'm Akhtar, a seasoned Python Solution Architect and Full-Stack Developer with over 7 years of expertise. My skillset includes Python-backed architectures, sophisticated web development in Javascript & Typescript, cloud mastery with AWS, and comprehensive full-stack development using both MEAN (MongoDB, Express.js, Angular, Node.js) and MERN (MongoDB, Express.js, React, Node.js) stacks. Why Me? ✅ Technical Expertise: Advanced proficiency in Python (Django, Flask, FastAPI), JavaScript/TypeScript, MEAN (MongoDB, Express.js, Angular, Node.js) & MERN (MongoDB, Express.js, React, Node.js) stacks, and AWS cloud services ✅ Full-Stack Development: Capable of delivering dynamic, responsive web applications from concept to deployment ✅ Cloud Mastery & Architectural Prowess: Skilled in serverless architectures, containerization, and designing scalable systems ✅ Securing Applications & DevOps Efficiency: Emphasizing security best practices and seamless development with CI/CD pipelines 🥇 Differentiating Value Proposition: ➤ Full-Stack Development: Mastery of both backend and frontend technologies enables me to deliver complete web applications from conception to deployment, ensuring consistency and high performance across the MEAN and MERN stacks. ➤ Holistic Approach: From conceptualizing an idea in Python to integrating frontend intricacies using Javascript/Typescript and full-stack capabilities with MEAN and MERN ➤ Cloud-Centric: Expertise in leveraging the power of cloud platforms to provide scalable and cost-effective solutions ➤ Performance-centric Solutions: Ensuring optimized architectures for swift response times and efficient operations ➤ Rigorous Quality Assurance: Implementing thorough testing strategies for impeccable deliverables 🤝 Effective Collaboration: I firmly believe that open communication and mutual respect form the bedrock of successful projects. Understanding your vision and goals while maintaining transparency is my utmost priority. 
💡 Your Vision, My Blueprint: Whether you're migrating to the cloud, crafting a new digital solution, or optimizing existing architectures, I'm here to translate your aspirations into tangible digital solutions. Let's connect now for a dynamic and efficient digital solution tailored to your needs!
    Hadoop
    Web Development
    Mobile Development Framework
    Amazon Web Services
    Next.js
    TypeScript
    Cloud Computing
    Python
    GraphQL
    JavaScript
    AWS Lambda
    API Integration
    Microsoft Azure
    Progressive Web App
    API Development
    NestJS
    MongoDB
    Node.js
    React
    ETL
  • $50 hourly
    I am a Machine Learning Engineer with over five years of experience building and scaling AI-powered products.
    ✅ HIGHLIGHTS: Top Rated & 100% Job Success.
    I have extensive experience in Natural Language Processing and Document Analysis, and have worked on high-profile projects for Amazon, KPMG, and McDonald's, among others. I can help you with every aspect of the technology stack while providing the following:
    - Extensive experience in production ML
    - Outstanding communication and responsiveness throughout our engagement
    - Endless focus on overdelivering and making your project successful
    My core expertise:
    - Python, Keras + PyTorch, Scikit-Learn, OpenAI, LangChain, Haystack, and Transformers
    - AWS SageMaker, Streamlit, and Flask
    - Software Architecture and System Design
    Hadoop
    Large Language Model
    Artificial Intelligence
    Data Scraping
    ChatGPT
    GPT-3
    Streamlit
    Sentiment Analysis
    Python
    Data Science
    Natural Language Processing
    Deep Learning
    Decision Tree
    Python Scikit-Learn
    Machine Learning
  • $40 hourly
    Experienced AWS-certified Data Engineer with around 4 years of experience in big data and its tooling.
    AWS | GCP
    Hadoop | HDFS | Hive | Sqoop
    Apache Airflow | Apache Spark | Apache Kafka | Apache NiFi | Apache Iceberg
    Python | Bash | SQL | PySpark | Scala | Delta Lake
    DataStage | Git | Jenkins | SnapLogic | Snowflake
    Hadoop
    Amazon API Gateway
    Apache Spark
    Google Cloud Platform
    Apache Kafka
    Apache Airflow
    Big Data
    Data Migration
    Apache NiFi
    Amazon Redshift
    Amazon Web Services
    PySpark
    AWS Lambda
    AWS Glue
    ETL
    Python
    SQL
  • $150 hourly
    A full-stack engineer with a background that lends itself to helping companies stay lean and connected while scaling up their customers and services. With 7 years' experience providing DevOps solutions and services to finance, Web3.0, and data analytics, I have been heavily involved in building scalable platforms for microservices on Kubernetes, migrating infrastructure and services to the cloud, and creating build environments for growing teams of developers.
    Skills:
    Cloud migrations - AWS, Azure, GCP, DigitalOcean, on-premise, Hetzner
    Container orchestration - Kubernetes (k8s), Rancher, Docker Swarm, OpenShift
    Infra-as-code - Pulumi, Terraform, CloudFormation, Sceptre
    Continuous delivery/integration - Jenkins, DroneIO, Helm, Kubernetes, GoCD, Tilt, Earthly
    Database - Elasticsearch, MongoDB, MySQL, MSSQL, Postgres
    Applications - Docker, Nginx, LAMP, CoreOS, Terraform, Tableau, Microsoft Exchange, Nutanix, VMware Horizon/vCenter, Kafka, Atlassian Jira/Confluence, Microsoft SQL Server, CloudFormation, Hugo
    Networking - DNS, DHCP, VLANs, NAT, Cisco Switch/Firewall
    Languages:
    Strong - PowerShell, Bash, Python, YAML, JSON
    Intermediate - GoLang, NodeJS, JavaScript, HTML, CSS, C#, T-SQL
    Basic - Haskell, OCaml, Rust
    Achievements (in the last 2 years):
    - Re-engineered SaaS architecture, migrating all production to microservices on Kubernetes and reducing the company's total software expenditure by 40%
    - Developed Terraform templates for an automated multi-cloud disaster recovery solution
    - Implemented build pipelines that let developers work with isolated and identical versions of dev, test, and prod
    - Product Owner and Scrum Master of an Agile software development project for an iPhone app
    - Advocated for a transparent business vision by employing OKRs and helped align cascading team OKRs down through the organisation
    - Automated the provisioning of on-premise Kubernetes clusters and build pipelines using Matchbox, Bash, and Helm templating
    Qualifications and education:
    2020 - Kubernetes Certified Application Developer
    2020 - Kubernetes Certified Administrator
    2018 - AWS Certified Developer Associate
    2018 - Agile Certified Practitioner
    2008 - 1st class degree in Electronic Engineering and Cybernetics
    Please get in touch if you think my background can be helpful to you.
    Hadoop
    Apache Spark
    Grafana
    Amazon ECS
    Kubernetes
    Docker Compose
    Amazon ECS for Kubernetes
    Continuous Integration
    Docker
    Jenkins
    Amazon Web Services
    DevOps
    Terraform
    Microsoft Azure
  • $30 hourly
    👋 Welcome to my profile! I'm a Senior Java Full Stack Developer with strong hands-on experience in crafting robust, scalable, and efficient software solutions. My expertise spans various technologies and domains, enabling me to deliver high-quality results for my clients.
    💻 Front-end Development: I excel at building intuitive, engaging user interfaces using technologies such as React and Angular. From responsive designs to interactive components, I ensure seamless user experiences that enhance usability and drive engagement.
    🌐 API Development: Proficient in designing and developing RESTful APIs using Java and the Spring Framework, ensuring efficient communication between front-end and back-end systems. I prioritize clean, maintainable code and adhere to RESTful principles for scalability and interoperability.
    🔍 Tech Stacks: Beyond the core technologies, I bring expertise in:
    - Backend Technologies: Java, Spring Framework (Spring Boot, Spring MVC, Spring Data), Hibernate, JPA, JSF, J2EE
    - Database Technologies: MongoDB, MySQL, PostgreSQL
    - Cloud Services: AWS (Amazon Web Services), GCP (Google Cloud Platform)
    - Version Control: Git, GitFlow, GitLab, GitHub, SVN
    - Microservices Architecture: Kubernetes, Docker Swarm
    - Frontend Frameworks: React, AngularJS, Angular 2+, Vue.js, Bootstrap
    🛢️ Database Management: Proficient in designing and optimizing database schemas with Hibernate, ensuring data integrity, performance, and scalability. Whether it's SQL or NoSQL databases, I have the skills to handle complex data structures and queries efficiently.
    🧪 Testing: Quality is paramount in software development, and I'm well versed in testing methodologies and frameworks such as JUnit and Selenium. I rigorously test my code to ensure it meets the highest standards of reliability and functionality.
    🔧 DevOps: I believe in streamlining the development process through automation and continuous integration/continuous deployment (CI/CD) pipelines. With expertise in tools like Jenkins and Docker, I facilitate seamless collaboration between development and operations teams, leading to faster delivery cycles and improved deployment reliability.
    🤖 AI Integration: As technology evolves, so do the possibilities. I have experience integrating AI solutions into applications, leveraging machine learning algorithms and frameworks to enhance functionality and provide intelligent insights.
    I am dedicated to delivering top-notch solutions that meet your business objectives while adhering to industry best practices and standards. Let's collaborate and bring your ideas to life!
    Hadoop
    Microservice
    Apache JMeter
    MongoDB
    SQL
    J2EE
    AWS Amplify
    AWS Lambda
    JUnit
    DevOps
    Angular
    AngularJS
    React
    Hibernate
    Spring Framework
    Java
  • Want to browse more freelancers?
    Sign up

How it works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


Hadoop Developers Hiring FAQs

What is a Hadoop developer?

Hadoop developers design and build applications on the open-source Apache Hadoop framework, which companies use primarily to store and process big data across clusters of machines.
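
At its core, a Hadoop application expresses its logic as a map phase (emit key-value pairs from raw records) and a reduce phase (aggregate values per key), and the framework distributes that work across a cluster. As a rough illustration only, here is a minimal single-machine Python sketch of the classic word-count pattern; a real Hadoop job would typically use the Java MapReduce API or Hadoop Streaming rather than code like this:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data needs big tools", "hadoop handles big data"]
print(reduce_phase(map_phase(lines)))
# → {'big': 3, 'data': 2, 'needs': 1, 'tools': 1, 'hadoop': 1, 'handles': 1}
```

The value of the framework is that the map and reduce functions stay this simple even when "lines" is terabytes of input split across hundreds of nodes.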

How do you hire a Hadoop developer?

You can source Hadoop developer talent on Upwork by following these three steps:

  1. Write a project description. You’ll want to determine your scope of work and the skills and requirements you are looking for in a Hadoop developer.
  2. Post it on Upwork. Once you’ve written a project description, post it to Upwork. Simply follow the prompts to help you input the information you collected to scope out your project.
  3. Shortlist and interview Hadoop developers. Once the proposals start coming in, create a shortlist of the professionals you want to interview. 

Of these three steps, your project description is where you will determine your scope of work and the specific type of Hadoop developer you need to complete your project. 

How much does it cost to hire a Hadoop developer?

Rates can vary due to many factors, including expertise and experience, location, and market conditions.

  • An experienced Hadoop developer may command higher fees but also work faster, have more-specialized areas of expertise, and deliver higher-quality work.
  • A contractor who is still in the process of building a client base may price their Hadoop developer services more competitively. 

How do you write a Hadoop developer job post?

Your job post is your chance to describe your project scope, budget, and talent needs. Although you don’t need a full job description as you would when hiring an employee, aim to provide enough detail for a contractor to know if they’re the right fit for the project.

Job post title

Create a simple title that describes exactly what you’re looking for. The idea is to target the keywords that your ideal candidate is likely to type into a job search bar to find your project. Here are some sample Hadoop developer job post titles:

  • Apache Hadoop developer needed to program data storage system for finance company
  • Java programmer to create scheduling system using Hadoop framework

Project description

An effective Hadoop developer job post should include: 

  • Scope of work: From programming in Apache Hadoop to understanding big data concepts, list all the deliverables you’ll need. 
  • Project length: Your job post should indicate whether this is a smaller or larger project. 
  • Background: If you prefer experience with certain industries, platforms, or company sizes, mention this here. 
  • Budget: Set a budget and note your preference for hourly rates vs. fixed-price contracts.

Hadoop developer job responsibilities

Here are some examples of Hadoop developer job responsibilities:

  • Create high-performing, scalable web services for data tracking
  • Perform data pre-processing using Hive and Pig
  • Develop and implement best practices and standards

Hadoop developer job requirements and qualifications

Be sure to include any requirements and qualifications you’re looking for in a Hadoop developer. Here are some examples:

  • Knowledge and experience in Hadoop
  • Excellent knowledge of back-end programming in Java, JavaScript, Node.js, and OOAD
  • Excellent understanding of database structures, principles and practices
  • Problem-solving skills related to managing big data
Hadoop Developers & Programmers Hiring Resources
Learn about cost factors
Hire talent
Schedule a call