Hire the best PySpark Developers in London, ENG

Check out PySpark Developers in London, ENG with the skills you need for your next job.
  • $60 hourly
    I'm a data scientist with a Master's in Analytics and 3 years of industry experience. I have experience in all areas of data science but specialise in:
    - Developing and deploying machine learning models
    - Natural language processing
    - Analysing and visualising data with interactive dashboards
    - Creating clear, well-documented, and reusable Python code
    I'm also an AWS Certified Cloud Practitioner. Get in touch and find out how I can help!
    Data Analytics
    GitHub
    Algorithm Development
    Network Analysis
    Analytics
    PySpark
    SQL
    Tableau
    Data Science
    Python
    Machine Learning Model
    Deep Learning
    Machine Learning
    Natural Language Processing
    Amazon SageMaker
  • $50 hourly
    Ex-Facebook/Meta Data Engineer providing fully managed analytics solutions for sales, marketing, and finance teams.
    Is this you?
    😢 "Every month, I spend hours manually pulling reports instead of focusing on our strategy"
    😢 "I just want to see my team's performance without having to juggle different spreadsheets"
    😢 "End-of-month reporting shouldn't feel like assembling a thousand-piece jigsaw puzzle"
    😢 "I just need the figures to reconcile. Why is it such a hassle to get consistent data?"
    𝗜𝗡𝗧𝗥𝗢
    With more than seven years of experience in data and analytics, I have collaborated with notable organisations including Facebook/Meta, HelloFresh, Capgemini, and several thriving startups. My expertise is focused on two primary services: data engineering and dashboard development.
    𝗦𝗘𝗥𝗩𝗜𝗖𝗘𝗦 𝗢𝗙𝗙𝗘𝗥𝗘𝗗 🧑‍🔧
    #1 Data Engineering: build, test, monitor, and automate a robust flow of data to power your business.
    #2 Dashboard Development: build dashboards that improve engagement and give you the best chance of driving real action within your organisation. Tools include Tableau, Power BI, and/or Google Sheets.
    𝗘𝗫𝗣𝗘𝗥𝗜𝗘𝗡𝗖𝗘 𝗔𝗡𝗗 𝗘𝗫𝗣𝗘𝗥𝗧𝗜𝗦𝗘 🥇
    • Over 7 years of experience in the data and analytics field as a data analyst, data engineer, analytics engineer, and data architect, working with cloud technologies such as Amazon Web Services (AWS) and Google Cloud Platform (GCP).
    • Worked with a number of companies such as Facebook/Meta, HelloFresh, Capgemini, Thomas Cook Airlines, and Tractable, each with their own unique set of projects.
    • Helped sales, marketing, and finance teams automate a range of once manual and time-consuming reports, such as marketing channel performance, customer acquisition, financial performance, and customer support performance.
    • Technologies I've worked with: SQL, Python, Tableau, Power BI, Google Data Studio, Google Sheets, AWS, GCP, and others.
𝗖𝗔𝗦𝗘 𝗦𝗧𝗨𝗗𝗜𝗘𝗦 📗
Online consumer services business: Worked closely with senior management to gather reporting requirements and developed a suite of Tableau reports following data visualisation best practices. These dashboards finally allowed everyone in the business to automate and track business KPIs with ease.
⭐️ Testimonial: "Ayub is exemplary in his work and delivery. He is quick in understanding the exact requirement, his planning is meticulous and he has an eye for details. He is very good with data visualization and his dashboards have made it easy for our organization to make sense of numbers. I enjoyed working with Ayub and would love to work with him in future as well."
E-commerce agency: Built a data pipeline to extract and load live tracking and price history data, and built dashboards in Tableau, Power BI, Google Data Studio, and Klipfolio. These dashboards are used by the business as an analytics offering for their clients, consolidating and presenting each client's data in a compact, easy-to-digest set of dashboards.
⭐️ Testimonial: "I've worked with Ayub for over a year on some complex data and data visualisation projects in Tableau, Power BI and Klipfolio. I've found him to be very competent and an excellent problem solver, as well as responsive and efficient. Looking forward to working with him again in the future!"
📞 Get in touch for a free 30-minute consultation - that's $0!
    Data Management
    Amazon Redshift
    ETL
    PySpark
    Amazon S3
    BigQuery
    PostgreSQL
    Data Vault
    Data Modeling
    Apache Airflow
    Apache Spark
    Data Warehousing
    dbt
    Amazon Web Services
    Google Cloud Platform
    Terraform
    Cloud Engineering
    Snowflake
    SQL
    Python
    Data Engineering
  • $50 hourly
    As a seasoned Data Scientist and Technical Product Manager, I bring extensive experience in Financial Crime Risk and Credit Risk management, coupled with deep proficiency in Python, Spark, SAS (Base, EG, and DI Studio), Hadoop, and SQL. Transitioning into freelancing, I am eager to leverage my skills to contribute to diverse projects. While Upwork's guidelines restrict sharing direct links to external profiles, I am happy to provide a detailed portfolio from my LinkedIn upon request.
    Data Mining
    Big Data
    Data Science
    Fraud Detection
    Data Analysis
    PySpark
    SAS
    Credit Scoring
    Apache Hadoop
    SQL
    Python
  • $30 hourly
    Adaptable Data Engineer | Building Efficient Data Pipelines and Infrastructure
    An adaptable data engineer with experience in building efficient data pipelines and infrastructure. With a Ph.D. in modeling and hands-on experience in software implementation for GE Aviation, I offer a unique combination of technical skills and strong communication abilities.
    Services: I specialize in designing and developing end-to-end data pipelines, optimizing databases, implementing data warehousing solutions, and integrating diverse data sources.
    Value: I deliver scalable data solutions for informed decision-making and business growth. My strong communication skills ensure effective collaboration with both technical and non-technical stakeholders.
    Let's connect and leverage your data assets to drive success.
    Database Design
    Report Writing
    Database Management
    Content Writing
    Email Copywriting
    Physics Tutoring
    Business Mathematics
    Content Creation
    Mathematics
    ETL Pipeline
    Scientific Writing
    Mathematics Tutoring
    Data Engineering
    Sales Copywriting
    PySpark
    Database Management System
    Data Extraction
    Distributed Database
    Physics
    Automated Deployment Pipeline
    Technical Writing
    Business Intelligence
    Database Query
    Data Warehousing & ETL Software
    Website Copywriting
    Data Wrangling
  • $30 hourly
    Hi there! I have over 4 years of experience in data engineering and data analytics. I use Python as my daily driver, and I regularly work with technologies and frameworks like SQL, Azure Databricks, Azure Data Factory, Azure Synapse Analytics, and Power BI. I can help you with tasks like data extraction, data cleaning, data transformation, data analysis, and data visualisation. Feel free to reach out if you'd like to discuss your project with me!
    Languages - Python, SQL
    Cloud Tools - Azure Databricks, Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage
    Data Processing, Transformation and Analysis - Apache Spark, PySpark, pandas
    Data Visualisation - Power BI
    Data Storage Formats - CSV, Microsoft Excel, Google Sheets, Parquet
    Others - Jupyter Notebook, ipynb
    Algorithm Development
    Data Management
    Java
    Data Analysis
    Data Structures
    Resume
    Interview Preparation
    Candidate Interviewing
    Machine Learning
    Data Science
    Career Coaching
    PySpark
    Apache Spark
    Python
    SQL
  • $15 hourly
    🚀 Data Professional
    8+ years of hands-on experience in data engineering, architecture, data modelling, software quality assurance, machine learning, business intelligence, data reporting, and application development.
    💡 Certified Data Engineer with 8+ Years of Experience 💡
    With more than eight years of expertise in propelling businesses forward through data-driven strategies, I specialise in providing comprehensive data consultancy and services aimed at unlocking the full potential of your data. My primary focus is on empowering businesses to make strategic decisions and enhance operational efficiency.
    🔍 Forward-Thinking Analytical Approach 🔍
    With a creative and analytical mindset, I excel at translating business challenges into technical solutions. My diverse skill set is focused on driving growth and improving ROI, ensuring that your data initiatives yield tangible results.
    📊 Research Data Analyst Expertise 📊
    I specialise in conducting historical and diagnostic data analyses to uncover meaningful insights and trends. With a keen eye for detail and a deep understanding of data patterns, I translate complex data into actionable recommendations for informed decision-making.
    💡 Let's Transform Your Data Landscape 💡
    Whether you're looking to optimise your data infrastructure, harness the power of predictive modelling, or gain deeper insights through advanced analytics, I'm here to help. With a track record of success in data engineering, analytics, and research, I'm committed to driving your business forward through data-driven innovation.
    💼 Data Professional with a Personal Touch 💼
    Are you searching for top-notch data services to elevate your business to new heights? Look no further! With a comprehensive range of personal skills and expertise, I offer tailored solutions to meet your data needs and drive success.
🛠️ Skills and Expertise 🛠️
🥇 Domain Experience ⚡ Insurance / Healthcare / Banking / Telecom / Microfinance / Fintech / Retail / E-commerce
✅ Amazon Web Services ⚡ Redshift / RDS / S3 / Athena / Glue / Lambda / QuickSight / EC2
✅ Microsoft Azure ⚡ Azure Data Lake / Azure Databricks / Azure Data Factory / Synapse
✅ Big Data Stack ⚡ Apache Spark / Apache Hadoop / Apache Airflow / Apache Hive / Apache Kafka / Cloudera / Databricks
✅ Data Visualization ⚡ MS Power BI / Tableau / Azure Synapse / AWS QuickSight / MS Excel / Google Data Studio / IBM Cognos
✅ Languages & Libs ⚡ SQL / Python / R / Java / PL/SQL / C++ / .NET / JavaScript / JSON / Node.js / PySpark / pandas / NumPy / Matplotlib / ReactJS / HTML / CSS / Django / scikit-learn / TensorFlow
✅ ETL & Other Tools ⚡ Teradata / Talend Data Integration / IBM DataStage / Informatica / Alteryx / Apache Airflow / Docker / Jenkins / Confluence / Jira
✅ Databases ⚡ Vertica / IBM DB2 / Teradata / Greenplum / PostgreSQL / Oracle / MySQL / MS SQL / Oracle GoldenGate / Cassandra
💡 Why Choose Me?
With a proven track record of success, a passion for innovation, and a commitment to excellence, I'm your go-to partner for all your data needs. Let's streamline your data infrastructure, harness the power of business intelligence, and unlock new opportunities through data-driven insights.
    Big Data
    Tableau
    Data Science
    ETL
    Data Visualization
    Apache Spark
    PySpark
    Database
    Python
    SQL
    Data Management
    Data Analysis
    Amazon Web Services
    Microsoft Azure
    Business Intelligence
  • $30 hourly
    Data Engineer with over 4 years of professional experience developing ETL/ML pipelines, APIs, app backends, and SQL databases using Python. I also have experience developing machine learning models.
    PySpark
    Artificial Intelligence
    Flask
    PyTorch
    Keras
    Python
  • $30 hourly
    I’m all about writing content that’s not just seen, but remembered. Whether we’re talking about giving your brand a voice with some smart Content Marketing, or getting you noticed on the web with SEO that clicks, I’ve got you covered. Check out my portfolio for some sample work that I've written for fictional clients, showing just how I can switch it up to suit your style and speak to your crowd.
    Web Development
    R
    Python Script
    Django
    App Development
    Computer Science
    HubSpot
    pandas
    Software
    PySpark
    Firebase
    C++
    Python
    Flutter
    C
  • $20 hourly
    I have 12+ years of experience in database development, including 4 years of experience in AWS Cloud technology with ETL and data modelling for OLTP, OLAP, and data warehouse systems. I have exposure to data streaming and to designing applications using 3NF, dimensional, and snowflake modelling techniques and ETL, as well as data engineering using data pipelines in AWS Cloud technology. I have 4 years of experience leading technical teams delivering PL/SQL, ETL, and reporting solutions, environment setup, source code setup, code integration, release automation using Jenkins and Liquibase, Jira task management, PL/SQL training, and database performance tuning. My development experience spans Oracle, Microsoft SQL Server, MongoDB, PostgreSQL, Python, shell scripting, and AWS Cloud. I am looking for a career leading a team and providing technical solutions, where I can turn innovations into effective products with the aid of my technical and functional abilities.
    Data Warehousing & ETL Software
    Python Script
    PySpark
    Apache Kafka
    Amazon Web Services
    SQL
    Python
  • $18 hourly
    As a recent Intern at Bright Network, my focus was on applying object-oriented programming languages like Python to real-world data analytics challenges. My academic journey at Brunel University London, where I am working towards an MSc in Data Science and Analytics, has equipped me with a solid foundation in both theoretical and practical aspects of this dynamic field. With competencies in Python, Django, and a REST API certification, I am adept at building robust backend solutions. Our team at Bright Network leveraged these skills to enhance project outcomes, demonstrating my commitment to contributing to data-driven decision-making processes. As I continue my education, my goal is to merge academic insights with industry experience to deliver innovative analytics solutions.
    Neural Network
    Machine Learning Model
    Machine Learning Algorithm
    Analytics
    NumPy
    pandas
    PySpark
    Data Science
    Data Analytics
    Tableau
    R Hadoop
    R
    Python
    Machine Learning
  • $25 hourly
    I am a data engineer with experience in big data and the computer and information industry, leveraging data and business principles to solve large-scale data infrastructure problems. I have been involved in various projects working across the different stages of the data pipeline, including acquisition, integration, and real-time data marts. I have developed hands-on experience in configuring and managing computing infrastructure using cloud-based platforms like AWS and Azure, while deploying and managing ETL pipelines.
    Unix
    Linux
    Data Modeling
    Database Optimization
    ETL Pipeline
    Database
    Data Analysis
    PySpark
    Cloud Computing
    Python
    SQL
  • $10 hourly
    I am an analytical and highly insightful individual with experience in big data, cloud computing, machine learning, data science, and consulting. I have a proven ability to develop and maintain relationships at all levels, the initiative to take on responsibility, and business analysis skills, making use of design thinking techniques for problem breakdown, understanding, and solution creation. I bring strong enthusiasm for new technologies and entrepreneurship across numerous industries, including technical sales, and a keen interest in working with diverse teams.
    Git
    Tableau
    Microsoft Power BI
    Python
    PySpark
    Azure DevOps
    Databricks Platform
    Information Analysis
    Tech & IT
    Visualization
    Technical Project Management
    Microsoft Office
    Analytics
    SQL
    Data Visualization
  • $25 hourly
    Senior Data Engineer
    8 years of experience working in IT for the banking, manufacturing, and energy industries, with a strong background in designing, building, and maintaining large-scale data systems and pipelines on GCP, Hadoop, and Azure.
    Responsibilities as a data engineer in my past roles:
    1. Design, build, and manage data pipelines that collect metrics from heterogeneous data sources, including batch and live streams; store and process large volumes of data; and ensure quality, secure data delivery to end users by following data governance principles.
    2. Design, build, and maintain the data infrastructure and pipelines that enable organisations to collect, store, process, and analyse large amounts of data.
    3. Develop and maintain data architectures, data pipelines, data lakes, and data warehouses.
    4. Manage data security and privacy, ensuring compliance with relevant regulations.
    5. Collaborate with data scientists, analysts, and other stakeholders to define data requirements and ensure data quality.
    6. Monitor data infrastructure and pipeline performance and troubleshoot issues as they arise.
    7. Keep up to date with the latest data technologies and best practices and incorporate them into the organisation's data infrastructure.
    8. Perform statistical analysis of large, complex datasets and troubleshoot complex data-related issues.
    9. Build frameworks for cutting-edge data ingestion and analytics solutions that solve secure data delivery and data privacy/compliance problems in a modern, evolving cloud for a large financial services organisation.
    10. Build and test CI/CD pipelines to deploy and run containerised big data Spark workloads, JupyterHub, and Kafka stream data pipelines on Kubernetes for a banking project.
Skill set and expertise: GCP, Azure, Hadoop, Apache Spark, PySpark, Kafka, Hive, Cloud Storage, Dataproc, Dataflow, Pub/Sub, BigQuery, Databricks, Data Factory, Java, Python, Scala, SQL, NoSQL, Jenkins, Docker, Kubernetes
    Data Lake
    PySpark
    Data Engineering
    Google Dataflow
    Google Cloud Platform
    Agile Software Development
    Kubernetes
    Docker
    Hive
    Apache Spark
    Apache Kafka
    Apache Hive
    Apache Cassandra
    Databricks Platform
    Apache Hadoop
  • $13 hourly
    I am an accomplished AI software engineer with extensive experience in building and deploying advanced AI and machine learning solutions. With a robust background in data engineering and cloud technologies, I specialize in creating scalable, efficient, and high-performing systems. My expertise spans various domains, including natural language processing, data pipeline orchestration, and cloud-based application development.
    Skills and Expertise:
    Machine Learning & AI: Proficient in scikit-learn, TensorFlow, PyTorch, and LangChain.
    Big Data Tools & Programming: Experienced with Apache Spark (PySpark), Apache Kafka, Databricks, Python, and SQL.
    Cloud Platforms: Expertise in AWS, GCP, Azure, and DigitalOcean.
    Databases: Skilled in PostgreSQL, MySQL, MongoDB, and MSSQL.
    Back-End Development: Integrated Django/Flask backends with frontends and third-party APIs for enhanced user experiences.
    Machine Learning
    Data Science
    API Integration
    API Development
    ETL Pipeline
    Flask
    pandas
    Data Analytics Framework
    PySpark
    Python
  • $20 hourly
    Data professional specialized in data engineering, analytics, and visualization, using PySpark, Python, SQL, SAS, and Tableau. Additional experience with ML, NLP, Google Cloud, etc.
    Microsoft Excel
    PySpark
    Microsoft Power BI
    Tableau
    Python
  • $29 hourly
    I am an innovative and results-oriented data and business intelligence analyst with 7+ years of all-round, international, and cross-cultural industry experience, coming from a software engineering background. I possess a seamless mix of business intelligence, data science, machine learning, software development, and leadership skills, enabling me to leverage theory and practice for value creation and organizational success. I can help you! Call me now!
    Enterprise Resource Planning
    Web Application
    Mobile App
    AWS Development
    Dashboard
    Machine Learning
    SQL
    LaMDA
    J2EE
    Java
    GitHub
    pandas
    Python
    PySpark
    Microsoft Power BI

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire a PySpark Developer near London, ENG on Upwork?

You can hire a PySpark Developer near London, ENG on Upwork in four simple steps:

  • Create a job post tailored to your PySpark Developer project scope. We’ll walk you through the process step by step.
  • Browse top PySpark Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top PySpark Developer profiles and interview them.
  • Hire the right PySpark Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire a PySpark Developer?

Rates charged by PySpark Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire a PySpark Developer near London, ENG on Upwork?

As the world’s work marketplace, we connect highly skilled freelance PySpark Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream PySpark Developer team you need to succeed.

Can I hire a PySpark Developer near London, ENG within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive PySpark Developer proposals within 24 hours of posting a job description.