Hire the best Apache Spark Engineers in Missouri

Check out Apache Spark Engineers in Missouri with the skills you need for your next job.
  • $70 hourly
    I am a Business Analyst and Data Engineer with experience in Python, R, SAS, SQL, T-SQL, PL/SQL, C#, Spark/Hadoop, and Power BI. I can model data and create databases, reports, and dashboards. I have significant experience creating and managing cloud services in Azure, including Databricks, Data Factory, and Data Lake Storage. I have worked with various relational databases and warehouses such as MSSQL, MySQL, Oracle, and Snowflake to build out warehousing and reporting solutions. I hold a Graduate Certificate in Data Science from Harvard University. I can architect a full cloud ETL solution or simply provide code for your existing environment or projects (a minimal pipeline sketch of this kind of work follows this profile).
    Apache Spark
    ETL Pipeline
    C#
    Databricks Platform
    Apache Hadoop
    Microsoft Azure SQL Database
    Microsoft Azure
    Data Science
    Python
    SQL
    SAS
    R
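    Below is a minimal, illustrative PySpark sketch of the kind of Azure Databricks ETL step described in this profile. The storage account, container, column names, and target table are placeholders, not details taken from any actual project.

      # Hypothetical Azure Databricks ETL step: read raw CSVs from Azure Data Lake
      # Storage Gen2, clean them lightly, and persist a table for reporting.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("adls-etl-sketch").getOrCreate()

      # Placeholder abfss path; a real job would point at your own storage account.
      raw = (
          spark.read
          .option("header", "true")
          .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/")
      )

      # Type the amount column and drop rows missing the join key.
      clean = (
          raw.withColumn("amount", F.col("amount").cast("double"))
             .dropna(subset=["order_id"])
      )

      # Save as a managed Delta table that dashboards and reports can query.
      clean.write.format("delta").mode("overwrite").saveAsTable("reporting.sales_clean")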
  • $500 hourly
    I excel at analyzing and manipulating data, from megabytes to petabytes, to help you complete your task or gain a competitive edge. My first and only language is English. My favorite tools: Tableau, Alteryx, Spark (EMR & Databricks), Presto, Nginx/OpenResty, Snowflake, and any Amazon Web Services tool/service (S3, Athena, Glue, RDS/Aurora, Redshift Spectrum). A small Spark-on-AWS sketch follows this profile.
    I hold these third-party certifications:
    - Alteryx Advanced Certified
    - Amazon Web Services (AWS) Certified Solutions Architect - Professional
    - Amazon Web Services (AWS) Certified Big Data - Specialty
    - Amazon Web Services (AWS) Certified Advanced Networking - Specialty
    - Amazon Web Services (AWS) Certified Machine Learning - Specialty
    - Databricks Certified Developer: Apache Spark™ 2.X
    - Tableau Desktop Qualified Associate
    I'm looking for one-time and ongoing projects. I especially enjoy working with large datasets in the finance, healthcare, ad tech, and business operations industries. I bring a combination of analytic, machine learning, data mining, and statistical skills, along with experience in algorithms and software development. Perhaps the most important skill I possess is the ability to explain the significance of data in a way that others can easily understand.
    Types of work I do:
    - Consulting: How to solve a problem without actually solving it.
    - Doing: Solving your problem based on your existing understanding of how to solve it.
    - Concept: Exploring how to get the result you are interested in.
    - Research: Finding out what is possible, given a limited scope (time, money) and your resources.
    - Validation: Guiding your existing or new team as it solves your problem.
    My development environment: I generally use a dual-computer, quad-monitor setup to access my various virtualized environments over my office fiber connection. This allows me to use any OS needed (macOS, Windows, *nix), rent any AWS hardware needed for faster project execution, and simulate clients' production environments as needed. I also have all tools installed in the environments where they make the most sense.
    I'm authorized to work in the USA. I can provide signed nondisclosure, noncompete, and invention assignment agreements above and beyond the Upwork terms if needed. However, I prefer to use the pre-written Optional Service Contract Terms: www [dot] upwork [dot] com/legal#optional-service-contract-terms.
    Apache Spark
    CI/CD
    Systems Engineering
    Google Cloud Platform
    DevOps
    BigQuery
    Amazon Web Services
    Web Service
    Amazon Redshift
    ETL
    Docker
    Predictive Analytics
    Data Science
    SQL
    Tableau
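    A compact sketch of the Spark-on-AWS pattern this profile mentions: reading raw data from S3 with Spark on EMR and writing partitioned Parquet that Athena or Glue can query. The bucket names, columns, and partitioning scheme are assumptions for illustration only.

      # Hypothetical EMR PySpark job: JSON events in S3 -> partitioned Parquet out.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("s3-events-to-parquet").getOrCreate()

      # Placeholder S3 locations; substitute your own buckets and prefixes.
      events = spark.read.json("s3://example-raw-bucket/events/")

      # Derive a date partition column and drop malformed rows.
      daily = (
          events.withColumn("event_date", F.to_date("event_timestamp"))
                .filter(F.col("event_type").isNotNull())
      )

      # Partitioned Parquet is cheap to scan from Athena via a Glue catalog table.
      (
          daily.write
               .mode("overwrite")
               .partitionBy("event_date")
               .parquet("s3://example-curated-bucket/events/")
      )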
  • $30 hourly
    Currently a Cloud Engineer for a large corporation, working mostly with Microsoft Azure and Python to build out a Big Data Platform that enables smarter, better-informed business decisions (a brief aggregation sketch follows this profile). • I'm experienced in Python, Microsoft Azure, Databricks, SQL, Apache Spark, and PySpark. • I believe that every job is done better with proper communication, so I always keep in touch to verify that everything is being done correctly.
    Apache Spark
    Azure DevOps
    Solidity
    PySpark
    Git
    Databricks Platform
    Microsoft Azure
    PLC Programming
    C++
    Python
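    As a rough illustration of the reporting-layer work described above, here is a small PySpark aggregation of the sort used on a Databricks-based big data platform; the table and column names are invented for the example.

      # Hypothetical summary build: aggregate an orders table for dashboards.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("orders-summary-sketch").getOrCreate()

      # Placeholder source table registered in the metastore.
      orders = spark.table("platform.orders")

      summary = (
          orders.groupBy("region", "product_category")
                .agg(
                    F.count("*").alias("order_count"),
                    F.sum("order_total").alias("revenue"),
                )
      )

      # Overwrite the summary table on each run; incremental logic is omitted here.
      summary.write.mode("overwrite").saveAsTable("platform.orders_summary")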
  • $100 hourly
    DATA ENGINEER & BIG DATA DEVELOPER: Resourceful data engineer with experience designing stable and scalable solutions that enable business decisions within the information technology, real estate, and retail industries. Skilled at building ETL pipelines. Advocate of the lakehouse architecture (a short bronze-to-silver sketch follows this profile).
    Apache Spark
    Computer Science
    Databricks Platform
    Java
    SQL
    Cloud Computing
    ETL Pipeline
    Amazon Web Services
    Scala
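    The lakehouse advocacy above refers to the common medallion (bronze/silver/gold) layering on Delta Lake. The sketch below shows one bronze-to-silver refinement step with made-up paths and columns, assuming Delta Lake is available on the cluster.

      # Hypothetical bronze-to-silver refinement in a lakehouse layout.
      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("bronze-to-silver-sketch").getOrCreate()

      # Bronze: raw ingested records kept as-is in Delta format (placeholder path).
      bronze = spark.read.format("delta").load("/lake/bronze/customers")

      # Silver: deduplicated, lightly conformed records ready for analytics.
      silver = (
          bronze.dropDuplicates(["customer_id"])
                .withColumn("email", F.lower(F.col("email")))
                .filter(F.col("customer_id").isNotNull())
      )

      silver.write.format("delta").mode("overwrite").save("/lake/silver/customers")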
  • $200 hourly
    Experienced Data Engineer with a demonstrated history of working in the information technology and services industry. Skilled in Apache Spark, Hadoop, AWS, GCP, and data pipeline development. Strong engineering professional with a Bachelor of Science (BS) focused on Software Engineering from Punjab University College of Information Technology. Passionate about leveraging cutting-edge technologies to drive impactful data-driven solutions.
    Apache Spark
    GIS
    Amazon EC2
    Amazon S3
    Apache HBase
    Elasticsearch
    MongoDB
    Apache Hadoop
    Java
    Scala

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

Trusted by 5M+ businesses