Hire the best Apache Spark Engineers in Oregon

Check out Apache Spark Engineers in Oregon with the skills you need for your next job.
  • $150 hourly
    As a Data and Business Intelligence Engineer, I deliver consulting and freelance data engineering services, with a focus on overseeing and executing projects in alignment with customer needs. With services covering the full data journey, I design and implement robust data foundations that streamline process development and enable leaders to make rapid business decisions. My three categories of service are:
    • Consulting: Data Strategy Development, Data & Reporting Solution Architecture, and Process Development.
    • Data Products/Engineering: Dashboard Development & Reporting, Data Pipelines (ETL), Process Automation, and Data Collection.
    • Analytics: Key Performance Indicators (KPIs), Metrics, Data Analysis, and Business Process Analysis.
    Leveraging over eight years of experience in business intelligence, data visualization, business analysis, and requirements analysis, I build data pipelines and translate data into actionable insights that give you a competitive edge. My tools of choice include Amazon Web Services (AWS), Databricks, Snowflake, Kafka, Snowpipe Streams, Airflow, Tableau/Power BI, SQL, NoSQL, APIs, Python, and Spark/PySpark. A brief PySpark sketch of this kind of ETL pipeline appears after the skill list below. Let me know what I can do for YOU!
    Apache Spark
    API
    Data Analysis
    Database
    Amazon Web Services
    Business Analysis
    Snowflake
    Databricks Platform
    ETL Pipeline
    Python
    Apache Airflow
    Dashboard
    Tableau
    SQL
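    For illustration only, a minimal sketch of the kind of Spark/PySpark ETL pipeline described in this profile. The bucket paths, column names, and schema are hypothetical and not taken from any client project.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Extract: read raw order events (source path and schema are illustrative).
    spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()
    orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

    # Transform: cast types and derive a daily revenue metric per customer.
    daily_revenue = (
        orders
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("customer_id", "order_date")
        .agg(F.sum("amount").alias("daily_revenue"))
    )

    # Load: write partitioned Parquet for downstream dashboards (Tableau/Power BI).
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-bucket/marts/daily_revenue/"
    )

    Partitioning the output by date is one common choice here; it keeps downstream dashboard queries that filter on recent days from scanning the full history.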
  • $175 hourly
    I am an expert in solving complex engineering problems using open-source technologies on the cloud. I am an advocate for infrastructure as code, containerization, API microservices, continuous integration and deployment, and proper use of version control systems. I can quickly analyze high-level system architectures as well as dive deep into the actual code. My preferred languages are Python, Java, SQL, JavaScript, and Bash, but I also have industry experience with C, C++, C#, Objective-C, Ruby, Scala, Kotlin, R, MATLAB, and more. Through my past roles, I have developed particular expertise in patent analytics, time series analytics, the AWS and GCP ecosystems, Kubernetes, and big data processing with Apache Spark; a short Spark sketch along these lines appears after the skill list below. I am excited and passionate about working with startups and founders to solve difficult problems that deliver value to the market.
    Apache Spark
    Data Scraping
    Microsoft Azure
    DevOps
    Jenkins
    Classification
    PyTorch
    GitHub
    TensorFlow
    Kubernetes
    Amazon Web Services
    Google Cloud Platform
    Machine Learning
    CI/CD
    Python
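    For illustration only, a brief sketch of the kind of time series analytics with Apache Spark mentioned in this profile. The dataset, paths, and column names are hypothetical.

    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("timeseries_sketch").getOrCreate()
    readings = spark.read.parquet("gs://example-bucket/sensor_readings/")

    # Rolling 7-day average per sensor, using a range window over a unix timestamp.
    seven_days = 7 * 24 * 3600
    w = (
        Window.partitionBy("sensor_id")
        .orderBy(F.col("ts").cast("long"))
        .rangeBetween(-seven_days, 0)
    )

    smoothed = readings.withColumn("value_avg_7d", F.avg("value").over(w))
    smoothed.write.mode("overwrite").parquet("gs://example-bucket/curated/sensor_readings_smoothed/")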

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

Trusted by 5M+ businesses