Hire the best Hadoop Developers & Programmers in the United Kingdom

Check out Hadoop Developers & Programmers in the United Kingdom with the skills you need for your next job.
  • $40 hourly
    I am a seasoned Cloud and DevOps Engineer with over 10 years of experience in architecting, deploying, and managing robust cloud infrastructure and development operations. My career includes substantial work with leading technology firms and diverse clients. My approach is centred around providing holistic solutions that encompass every layer of cloud and DevOps practices.
    What I bring to your project:
    • Deep expertise in cloud infrastructure and DevOps: I offer comprehensive knowledge and practical experience in building and optimising cloud-based solutions, ensuring they are scalable, secure, and efficient.
    • Exceptional communication and responsiveness: Throughout our collaboration, you can expect top-tier communication and a steadfast commitment to meeting your project's needs and timelines.
    • Dedication to excellence: My primary goal is always to exceed expectations and contribute significantly to the success of your project through meticulous and forward-thinking strategies.
    My core specialisations include:
    • Cloud Technologies: Proficient in AWS, Azure, and Google Cloud Platform, with a focus on leveraging their full potential to create agile and scalable environments.
    • DevOps Tools and Practices: Extensive experience with automation and orchestration tools like Ansible, Terraform, and Kubernetes. Skilled in CI/CD pipeline integration, infrastructure as code (IaC), and microservices architecture.
    • System Architecture and Design: Expertise in designing and implementing robust, high-availability systems tailored to specific business needs.
    Featured Skill Hadoop
    Azure DevOps
    Apache Hadoop
    SQL
    Microsoft Message Queue Server
    RabbitMQ
    DevOps
    Linux
    Bash
    Python
    Ansible
    Jenkins
    Amazon Web Services
    Java
    Docker
    Kubernetes
  • $120 hourly
    World-class expert in cloud, storage, data platforms, and high-performance systems, having led teams and core projects at AWS S3 and Apple. US Inventor. Lately I have been focusing primarily on clouds, data platforms, and data governance, but I am fluent in the entire big data stack, including DevOps methodologies, high-performance JVM installations (including Spring), databases, and data science. I emphasize reliability, clearly written processes, and agile turnaround with all stakeholders. I am now bootstrapping my own B2B startup, where I handle customer development, product management, go-to-market strategy, complex UX, AI-powered enterprise search, and everything else. I am less proficient in these areas, but I am learning something new every day. As a hedge against startup turbulence, I am happy to share my core competencies with companies through Upwork!
    Featured Skill Hadoop
    Artificial Intelligence
    Cloud Architecture
    Big Data
    Data Analysis
    Apache Hadoop
    Amazon S3
    Distributed Computing
    AI Bot
    Python
    Apache Spark
    Web Application
    Java
    Amazon Web Services
  • $40 hourly
    I am an Accountant, Financial Modeller, and Automation Expert for Windows & Mac (Excel & Access, Outlook, PowerPoint), primarily using VBA, SQL, Google Sheets, Slides & Docs, and Python. I have produced hundreds of automated spreadsheets, databases, and IT solutions for clients since 2005. I've been a professional Excel Financial Modeller and VBA Developer since 2005, working with numerous accountants and actuaries to produce mainly VBA automation tools and integrated financial models. I have also written a number of business plans for start-ups over the years and secured funding for them. In 2009 I produced my own Excel-based financial forecasting software (FD4Cast), which allows users to quickly build dynamic integrated financial models driven by VBA, and which gained ICAEW approval as an accredited software platform. Since then I have built many dynamic, customised models for the Construction, Healthcare, FMCG, Automotive, Entertainment, and Financial sectors, and have also completed many varied (non-financial) automation solutions for clients with complex needs. I work well under minimal supervision and always provide excellent support for previous projects where needs may have changed and modifications are required.
    Featured Skill Hadoop
    Apache Hadoop
    Accounting
    Hive Technology
    Financial Modeling
    MySQL Programming
    Accounting Software
    MySQL
    Microsoft Excel
    SQL
    Scala
    Big Data
    R
  • $50 hourly
    As a seasoned Data Scientist and Technical Product Manager, I bring extensive experience in Financial Crime Risk and Credit Risk management, coupled with deep proficiency in Python, Spark, SAS (Base, EG, and DI Studio), Hadoop, and SQL. Transitioning into freelancing, I am eager to leverage my skills to contribute to diverse projects. While Upwork's guidelines restrict sharing direct links to external profiles, I am happy to provide a detailed portfolio from my LinkedIn upon request.
    Featured Skill Hadoop
    Data Mining
    Big Data
    Data Science
    Fraud Detection
    Data Analysis
    PySpark
    SAS
    Credit Scoring
    Apache Hadoop
    SQL
    Python
  • $100 hourly
    I have over 15 years' experience working with companies in a variety of industries, from small startups through to Fortune 500 companies. From the first website build, through adding interactivity and rebranding, to ultimately developing a multinational prominence, I have led my clients through a range of transitions. Throughout, I have kept abreast of best practices and trends so I can best advise on the right tools and technologies to fit their needs. Throughout my career I have endeavoured to be tech agnostic. While I have served as Microsoft subject matter expert for Deloitte and have extensive knowledge of Azure and Windows Server, I am equally at home working with Linux/macOS and on both Google Cloud and AWS. I have delivered successful projects in a wide range of languages including C#, Python, R, PHP, Go, Ruby, and JS. More recently I have helped clients transition to the cloud, completing several successful projects providing DevOps support for Azure & AWS as well as delivering headless architectures with both Angular & React. I am currently studying for my AWS Architect certification.
    Featured Skill Hadoop
    Machine Learning
    Apache Hadoop
    Database Maintenance
    Enterprise Architecture
    Business with 1-9 Employees
    Python
    PHP
    SQL
    Angular
    .NET Core
    HTML
    .NET Framework
    SCSS
    React
  • $45 hourly
    Data engineer with extensive commercial experience in designing and building cloud-native data solutions.
    Experience & Skills:
    Python application development
    ✅ Developing ETL and ELT applications
    ✅ Reading and writing files
    ✅ Data manipulation using PySpark and Pandas
    ✅ Developing applications on Docker
    ✅ Writing pytest test cases
    SQL/Data Warehousing
    ✅ Snowflake, BigQuery, RDS (PostgreSQL, MySQL, Aurora)
    ✅ Data warehousing and modelling
    ✅ Writing complex queries and metrics
    ✅ Creating dbt models
    Infrastructure
    ✅ Serverless architecture design
    ✅ Event-driven architecture design using SNS and SQS
    ✅ AWS Glue applications with event trigger or cron schedule, including crawlers and Athena table integration
    ✅ AWS ECS tasks and services to run dockerized applications
    ✅ AWS Batch jobs to run dockerized applications
    ✅ AWS Lambda functions with event trigger or cron schedule
    ✅ Static website hosting on S3 with CDN
    ✅ AWS EMR to run Apache Spark applications
    ✅ All infrastructure is provisioned using Terraform
    Monitoring and alerting
    ✅ Job monitoring and alerting using CloudWatch metrics and Grafana
    CI/CD
    ✅ CircleCI, GoCD, GitLab CI/CD
    Version Control
    ✅ GitHub, GitLab
    Featured Skill Hadoop
    AWS Lambda
    Terraform
    Snowflake
    Data Ingestion
    Grafana
    SQL
    AWS Glue
    Amazon ECS
    Python
    dbt
    CI/CD
    Apache Spark
    Data Modeling
    Apache Hadoop
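The ETL workflow sketched in the profile above (read files, transform records in Python, write results out) can be illustrated with a minimal standard-library example. This is a hypothetical sketch, not this freelancer's code; the `name`/`amount` fields and the `run_etl` helper are made up for illustration.

```python
import csv
import io

def transform(row):
    """Transform step: normalise a name and cast an amount (hypothetical fields)."""
    return {
        "name": row["name"].strip().title(),
        "amount": float(row["amount"]),
    }

def run_etl(source, sink):
    """Minimal extract-transform-load loop over CSV-shaped text streams."""
    reader = csv.DictReader(source)
    writer = csv.DictWriter(sink, fieldnames=["name", "amount"])
    writer.writeheader()
    for row in reader:
        writer.writerow(transform(row))

# Usage with in-memory streams; in practice these would be files or S3 objects.
raw = io.StringIO("name,amount\n alice ,10.50\n BOB ,2\n")
out = io.StringIO()
run_etl(raw, out)
print(out.getvalue())
```

In a real pipeline the same extract/transform/load shape would typically run inside PySpark or an AWS Glue job rather than the `csv` module, with pytest cases exercising `transform` directly.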
  • $38 hourly
    I have experience working on a range of data science projects across many sectors, from manufacturing to health tech and genomics. I am proficient in all elements of the data science pipeline and have worked on projects both big and small. I am confident delivering data analysis, warehousing, engineering, visualisation, and predictive analytics, including machine learning and AI. I am proficient in Python, R, Azure, and other cloud and big data technologies.
    Featured Skill Hadoop
    Industry 4.0
    Project Management
    Geographic Infographic
    Azure Machine Learning
    Science
    Genomic Data Analysis
    Apache Hadoop
    Communications
    R
    Microsoft Power BI
    Python
  • $35 hourly
    A versatile statistician and data professional skilled in R, Java, and Python, proficient in data visualisation for clear communication of statistical insights. Experienced in AI and ML for pattern extraction from complex datasets. Proficient in SQL, MongoDB, NLP, and Hadoop for database management and storage solutions. Adept at developing interactive dashboards using R-Shiny and TensorFlow, with proficiency in project management using Microsoft 365.
    Featured Skill Hadoop
    Statistical Programming
    Statistical Analysis
    Visualization
    NoSQL Database
    R
    Java
    Python
    Scientific Illustration
    Science
    Data Science
    Apache Hadoop
  • $60 hourly
    PhD in Computer Science with strong technical skills and a research track in Linked Data, Databases, and Large-Scale Machine Learning. Highly skilled in Apache Spark, PySpark, Python, SQL, Datalog, and designing scalable data processing pipelines. Passionate about enhancing input data quality in machine learning. For more details, please visit my GitHub or LinkedIn profile.
    Featured Skill Hadoop
    Neo4j
    NoSQL Database
    Query Optimization
    Java
    OWL
    SPARQL
    RDF
    Deep Neural Network
    Regression Analysis
    Machine Learning
    Apache Hadoop
    PySpark
    SQL
    Python
  • $95 hourly
    Data engineering professional with 10 years of experience in data solution development and pipeline optimization; expert in Apache Spark and SQL. Key achievements include a 40% reduction in system runtime through the development of efficient data flows, and a 30% reduction in query time through optimized database indexing and algorithms.
    Featured Skill Hadoop
    Apache Kafka
    Apache Hadoop
    Microsoft Azure
    Apache Spark
    ETL Pipeline
    ETL
  • $35 hourly
    Principal Data Engineer | Azure/Power BI Specialist | Big Data Expert
    With over 8 years of experience in data engineering and analytics, I specialize in designing and modernizing data infrastructure to drive actionable insights. As a Microsoft Certified Power BI Data Analyst and AWS Solution Architect, I excel at building scalable data pipelines (Azure Data Factory, Databricks), optimizing performance, and delivering executive-level Power BI dashboards for strategic decision-making.
    Core Expertise:
    ✔ Cloud Data Solutions: Azure (Fabric, Synapse, Functions), AWS (Glue, Athena), GCP
    ✔ Data Pipelines & Warehousing: Medallion Architecture, ETL/ELT, Delta Lake, Teradata
    ✔ Advanced Analytics: Power BI, Tableau, AI model performance tracking (Afiniti)
    ✔ Big Data Technologies: Hadoop, Spark, Kafka, Hive, Talend
    ✔ Agile Leadership: SCRUM, Azure DevOps, mentoring teams, end-to-end project ownership
    Recent Achievements:
    • Modernized data ecosystems using Microsoft Fabric for unified analytics (Subotrek)
    • Architected a serverless real-time data service with Azure Service Bus/Functions (Qordata)
    • Developed AI model performance dashboards (Power BI) and Kafka-based log pipelines (Afiniti)
    • Optimized cross-system data migrations and Teradata-to-Hadoop archives (Teradata)
    Featured Skill Hadoop
    Project Management
    Change Management
    AWS Application
    Azure DevOps
    Apache Hadoop
    Databricks Platform
    Microsoft Azure
    Microsoft Power BI
    Azure Service Fabric
    SQL
    Data Warehousing
    Artificial Intelligence
    ETL Pipeline
    Data Extraction
    Data Analysis
  • $40 hourly
    I’m a senior data engineer with over 10 years of experience building scalable, cloud-native data solutions across diverse industries. I specialize in designing and optimizing end-to-end data pipelines using modern tools like dbt, Apache Airflow, and Snowflake, and managing large-scale data infrastructure on AWS and GCP. Services Offered: • Modern data stack setup: dbt, Airflow, Snowflake • Cloud data migration: Redshift → BigQuery/Snowflake • ETL/ELT development: Scalable, testable pipelines with scheduling & alerting • SQL performance tuning: Speed up your reporting & dashboards I bring a consulting mindset to every project—asking the right questions, documenting clearly, and delivering quality results. Let’s collaborate to build robust data solutions tailored to your needs.
    Featured Skill Hadoop
    Machine Learning
    PostgreSQL Programming
    Terraform
    Google Cloud Platform
    Snowflake
    BigQuery
    Apache Airflow
    Apache Spark
    Apache Kafka
    ETL Pipeline
    Python
    Apache Hadoop
  • $38 hourly
    Business Intelligence architect and developer with more than 10 years of experience in IT. What started as a hobby – programming small games and tools – turned into a profession: programming POS systems, customising generic ETL systems to fit customer needs, and turning business user stories into processes and dynamic reports. I have international experience in finding fraudulent activity, saving companies' revenue, assisting marketing activities, and delivering important insights by presenting data in a user-friendly way. My recent academic experience in Big Data Science has also given me solid experience in cloud technologies (Azure, Amazon EC2), big data systems (Hadoop, Apache Spark), and analysis tools (Python – scikit-learn).
    Featured Skill Hadoop
    Microsoft Power BI
    SQL
    Terraform
    AWS Amplify
    AWS Glue
    Amazon Redshift
    AWS Lambda
    Amazon EC2
    SQL Server Integration Services
    Java
    C#
    Python
    Apache Hadoop
    Apache Spark
    Tableau
  • $30 hourly
    I manage a software development company with 30+ permanent and sub-contract staff. My pool of resources is stable and highly experienced. I manage customer requirements validation and verification of software product deliverables, and remain hands-on with in-house software development. Price and hourly rate are negotiable depending on the task.
    Featured Skill Hadoop
    User Interface Design
    User Experience Design
    PostgreSQL Programming
    Agile Software Development
    Enterprise Software Development
    Multiprotocol Label Switching
    Apache Hadoop
    Enhanced Interior Gateway Routing Protocol
    Software Development
    Multithreaded, Parallel, & Distributed Programming Language
    C#
    Certified Associate in Python Programming
    Microsoft Hyper-V Server
    SQL Programming
  • $15 hourly
    🔹 About Me
    I am a data scientist with a master's degree in data science, specializing in data analysis, machine learning, cloud computing, and IoT. My expertise lies in transforming complex data into actionable insights and developing innovative solutions to real-world problems.
    🔹 Projects & Experience
    1. UK Property Price Prediction: Developed predictive models to analyze and forecast real estate prices across the UK.
    2. Accident Severity Analysis: Conducted comprehensive analysis and prediction of accident severity using machine learning techniques.
    3. IoT Smart Lock System: Designed and implemented a WiFi-enabled smart lock system, integrating hardware and software components for enhanced security.
    4. Data Anonymization Application: Built a robust application using Java and Hadoop to ensure data privacy and compliance.
    🔹 Technical Proficiencies
    1. Programming Languages: Python, Java
    2. Data Analysis & Machine Learning: Pandas, NumPy, Scikit-learn, Matplotlib, Seaborn
    3. Big Data Technologies: Hadoop
    4. Databases: MySQL
    5. Data Visualization & BI Tools: Power BI, Excel
    I am committed to delivering high-quality, data-driven solutions that drive business success. Let's collaborate to turn your data into strategic assets.
    Featured Skill Hadoop
    Data Analytics
    Apache Hadoop
    Seaborn
    NumPy
    Matplotlib
    Python Scikit-Learn
    Java
    PostgreSQL
    Machine Learning Model
    Python
    SQL
    Data Analysis
    Machine Learning
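The data-anonymization project in the profile above is described only at a high level (Java and Hadoop). As a rough illustration of one common approach to the same problem — keyed pseudonymisation of identifier columns so records stay joinable but PII is unrecoverable without the key — here is a short Python sketch. The field names, sample record, and secret key are all hypothetical, and this is not the freelancer's implementation.

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-real-secret"  # hypothetical key; store securely in practice

def pseudonymise(value: str) -> str:
    """Replace an identifier with a stable keyed digest: the same input always
    maps to the same token, but the original cannot be recovered without the key."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def anonymise_record(record: dict, pii_fields: set) -> dict:
    """Pseudonymise the PII fields of a record and pass the rest through unchanged."""
    return {
        k: pseudonymise(v) if k in pii_fields else v
        for k, v in record.items()
    }

# Usage on a single made-up record; a Hadoop job would map this over every row.
record = {"email": "jane@example.com", "city": "London", "spend": "42"}
clean = anonymise_record(record, {"email"})
print(clean)
```

Because the digest is deterministic per key, pseudonymised datasets can still be joined on the tokenised column; rotating the key breaks linkability across releases.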
  • $15 hourly
    Results-driven Data Engineer with 4+ years of experience in designing, optimizing, and managing complex data pipelines across various industries. Skilled in building and maintaining scalable ETL/ELT pipelines, implementing Medallion architectures, and managing data lakes on ADLS Gen2 and S3. Proficient in PySpark, Databricks, Delta Lake, and Spark-based pipelines on AWS EMR using Airflow. Experienced in CI/CD, OLTP/OLAP data modeling, Power BI dashboarding, and AI-driven projects. Adept at enhancing data workflows and ensuring seamless production releases.
    🔹 ETL/ELT Development: Expertise in batch and streaming pipelines using PySpark, Airflow, and Azure Data Factory (ADF).
    🔹 Big Data Processing: Proficient in Spark, Databricks, and AWS EMR, with hands-on experience handling terabytes of data daily.
    🔹 Data Management: Skilled in maintaining Delta Lake, ADLS Gen2, and S3, and designing optimized OLTP/OLAP models.
    🔹 CI/CD Automation: Experienced in building and managing CI/CD pipelines with Azure DevOps for seamless releases.
    🔹 Real-Time Streaming: Strong knowledge of Spark Structured Streaming and Kafka for real-time data processing.
    I'm passionate about solving complex data challenges and continuously improving pipeline performance.
    Featured Skill Hadoop
    Amazon EC2
    Amazon Redshift
    Apache Kafka
    Apache Airflow
    Apache Hadoop
    Azure DevOps
    Databricks Platform
    SQL Programming
    Python
    PySpark
    Data Engineering
    Data Extraction
    Data Mining
    ETL Pipeline
    Data Analysis
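The Medallion architecture mentioned in the profile above layers data as bronze (raw), silver (validated), and gold (business aggregates). Real implementations use PySpark and Delta Lake; the plain-Python sketch below, with made-up records, only illustrates the layering idea, not any production pipeline.

```python
# Conceptual Medallion layering: bronze (raw) -> silver (cleaned) -> gold (aggregated).
bronze = [  # raw ingested events, possibly dirty (hypothetical data)
    {"user": "a", "amount": "10"},
    {"user": "b", "amount": "oops"},  # malformed record
    {"user": "a", "amount": "5"},
]

def to_silver(rows):
    """Silver layer: validate and type-cast, dropping unparseable records."""
    silver = []
    for r in rows:
        try:
            silver.append({"user": r["user"], "amount": float(r["amount"])})
        except ValueError:
            continue  # in production this would land in a quarantine table
    return silver

def to_gold(rows):
    """Gold layer: business-level aggregate (total amount per user)."""
    totals = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)
```

Each layer is persisted separately in practice (e.g. as Delta tables on ADLS Gen2 or S3), so downstream consumers read the gold layer while the bronze layer preserves the raw history for reprocessing.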
  • $15 hourly
    DevOps & Data Engineering Expert | AWS | Hadoop | Kubernetes | Automation
    I'm a senior DevOps and Data Engineer with over 17 years of hands-on experience helping large enterprises, public sector clients, and fast-growing teams automate infrastructure, build scalable data platforms, and streamline software delivery pipelines.
    🔧 What I Bring to Your Project:
    • End-to-end CI/CD automation with Jenkins, GitLab CI, and Azure DevOps
    • Infrastructure as Code using Terraform, Ansible, and Puppet
    • Containerization & orchestration using Docker and Kubernetes
    • Big Data platforms: Cloudera, HDP, Spark, Hive, Kafka, Informatica
    • Secure, scalable cloud infrastructure on AWS & Azure
    • Automation of data pipelines, monitoring with Prometheus & Zabbix
    • Deep OS-level expertise (RHEL, CentOS) and virtualization (VMware, RHV)
    I've led complex platform upgrades, built reusable DevOps pipelines for multi-team environments, and delivered stable, high-performance data workflows – saving time, reducing manual errors, and improving system reliability.
    ✅ Let's connect if you need a reliable, detail-oriented expert to build, scale, or optimize your DevOps or data platform.
    Featured Skill Hadoop
    Grafana
    Prometheus
    Terraform
    Puppet
    Ansible
    CI/CD
    Docker
    Kubernetes
    Apache Hadoop
    Azure DevOps
    Computer Network
    AWS Development
    AWS CodeBuild
    DevOps
    Database
  • $14 hourly
    Electronics and Computer Engineering graduate with expertise in FPGA design, IoT, and cloud technologies (AWS, Azure). Hands-on experience with Hadoop, Spark, Kafka, and Databricks from my Big Data Engineer Trainee role. Skilled in solving complex engineering problems through real-time systems development and performance optimization. Passionate about leveraging technical skills to drive innovation and enhance developer productivity.
    Featured Skill Hadoop
    SQL
    PySpark
    Apache Hadoop
    C++
    MATLAB
    AWS CodePipeline
    Embedded C
    Software Debugging
    Data Analysis
    Data Extraction
    ETL
    ETL Pipeline
  • $100 hourly
    Neeraj Bhadani is currently working as a Principal Data Engineer. He has 16+ years of experience in the data engineering, data science, and machine learning fields. He has delivered various training sessions and workshops, both internally and externally, across his career, and has published many technical blog posts on Medium. He has worked on various machine learning and big data projects, dealt directly with clients as a technical specialist, and migrated various ETL pipelines to Apache Spark. He also received a gold medal for securing first place in his batch during his undergraduate studies.
    Featured Skill Hadoop
    Recommendation System
    Apache Kafka
    SQL
    Python
    Apache Hadoop
    MLflow
    Databricks Platform
    Streaming Software
    Apache Spark
    Data Analysis
    Artificial Intelligence
    Machine Learning
    ETL Pipeline
    ETL
    Data Extraction
  • $25 hourly
    Contact info: ✉️ gulati9635@gmail.com 📞 +44-7459020486
    • Technical professional (Cloud Data Engineer) with 9 years of experience in the software industry, primarily in Big Data and Azure data engineering.
    • 5 years of Azure cloud data engineering experience.
    • Experience in developing ETL data pipelines on Azure using Azure Data Factory, Azure Databricks, PySpark, Azure Key Vault, Azure SQL Server, ADLS Gen2, and Azure Blob Storage.
    • Cleared exams DP-201: Designing an Azure Data Solution, DP-200: Implementing an Azure Data Solution, and AZ-900: Microsoft Azure Fundamentals.
    • Experience in Big Data technologies such as Hadoop, Hive, Impala, Spark (with Scala and Python), and Sqoop.
    • Experience in ETL tools such as Informatica DEI/BDM (10.4.1 and 10.2.1), PowerCenter (10.4.1 and 10.1.1), and PowerExchange (10.4.1 and 10.1.1).
    • Experience in legacy-to-big-data (HDFS) and cloud migration projects.
    • Certified Microsoft Azure Data Engineer Associate.
    • Additional knowledge of SQL, Unix, shell scripting, Python scripting, Git, JIRA, Jenkins, and the Autosys scheduling tool.
    Featured Skill Hadoop
    Bash Programming
    Databricks Platform
    Apache Spark
    Apache Hive
    PySpark
    Apache Hadoop
    Big Data
    Microsoft Azure
    IBM Db2
    Python

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.