Hire the best Apache Spark Engineers in Kathmandu, NP

Check out Apache Spark Engineers in Kathmandu, NP with the skills you need for your next job.
  • $50 hourly
    With over five years of experience, I specialize in the following areas:
    - Architecting distributed database clusters and data pipelines for big data analytics and data warehousing, using tech stacks that include (but are not limited to) Citus Data, Redshift, Spark, Kinesis, Trino/PrestoSQL, Athena, Glue, Hadoop, Hive, and S3 data lakes.
    - Database administration, setup, maintenance, data migration, backup and recovery, monitoring, replication, performance tuning, and query optimization for Postgres, MySQL, Oracle, and MongoDB databases.
    - Python, Bash, and SQL scripting for database management and automation.
    - Architecting your next enterprise-level software solution.
    - Web application and API development with Python/Django/Flask or Node.js/Express.js.
    - Configuring and maintaining AWS services such as EC2, ECS, Lambda, S3, and CodePipeline.
    - Linux server administration for setup and maintenance of services on cloud and on-premise servers.
    - Creating scripts for task automation, web scraping, and more.
    Apache Spark
    Django
    Database Architecture
    Kubernetes
    Linux System Administration
    DevOps
    Big Data
    Database Administration
    Python
    MySQL
    ETL
    PostgreSQL
    SQL
    Database Optimization
    Database Design
    Data Migration
  • $40 hourly
    Experienced Big Data Engineer and Development Manager focused on building BI infrastructure. With proficiency in Python and Scala, I have developed ETL pipelines to automate data ingestion for analysis, configured big data technologies such as Apache Spark on Kubernetes and EMR and Trino/Presto, and fine-tuned processes for performance at scale. My current responsibilities include building data infrastructure, developing data pipelines, establishing coding conventions, defining processes, and mentoring team members to follow them.
    Apache Spark
    React
    Business Intelligence
    Data Warehousing & ETL Software
    JasperReports
    PHP
    Amazon Redshift
    PostgreSQL
    Data Lake
    Mystery
    Talend Open Studio
    Scala
    Python
    ETL Pipeline
  • $25 hourly
    Data Engineer specializing in Python and big data: data warehousing, data lakes, ETL, and automation using scripting. Skills and expertise:
    - Programming languages: Python, Scala, SQL.
    - Frameworks: Flask, FastAPI, Spark.
    - Databases: MySQL, PostgreSQL, AWS Redshift, AWS RDS.
    - Cloud: AWS.
    - Scripting and REST APIs.
    - ETL and ML.
    - Computer vision with a specialization in deep learning.
    I will be very glad to work with you. Let's collaborate.
    Apache Spark
    API
    AWS Glue
    Amazon Athena
    Django
    Amazon Redshift
    Data Scraping
    Amazon S3
    SQL
    Flask
    Python
    Computer Vision
    Deep Learning
  • $20 hourly
    Backend software developer with experience in Java, Hadoop, Elasticsearch, and Spark. With strong analytical skills and a keen interest in learning more, I can complete any task in a timely manner.
    Apache Spark
    Computing & Networking
    Mathematics
    Apache Hadoop
    Elasticsearch
    Java
    Python
  • $10 hourly
    I am a data engineer with skills and experience in:
    - Designing, developing, and maintaining data pipelines to extract, transform, and load data from various sources.
    - Implementing data quality checks, transformations, analysis, and prediction models.
    - AWS services such as S3, EC2, RDS, Redshift, and DynamoDB.
    - Working knowledge of Python, SQL (MySQL, PostgreSQL, Oracle), and machine learning algorithms.
    Apache Spark
    Model Deployment
    Machine Learning Algorithm
    SQL
    Amazon Redshift
    Amazon RDS
    Amazon DynamoDB
    Amazon Athena
    Apache Hadoop
    Big Data
    Statistics
    Machine Learning
    Python
    ETL
    Data Analysis
  • $10 hourly
    I am from Nepal and new to Upwork; you can test me and I will deliver quality results. I have a bachelor's degree in computer engineering. I am fluent in Python, SQL (MySQL, MSSQL, MariaDB, SQLite), Snowflake, dbt, and Kafka. I also have experience with Laravel and PHP for web development.
    Apache Spark
    Data Warehousing & ETL Software
    Pentaho
    Microsoft Power BI
    Tableau
    Apache Kafka
    Snowflake
    Databricks Platform
    SQLite
    Microsoft SQL Server
    MySQL
    Laravel
    PHP
    Python
    C
  • $12 hourly
    * I am an MSc IT graduate with a focus on data analysis from London Metropolitan University, with 3+ years of experience in data analysis.
    * I am well versed in big data tools and technologies such as Hadoop, Spark, Flume, and MapReduce, along with SQL.
    * I have completed various data analysis and visualization projects using Python and its libraries, including Pandas, Matplotlib, Seaborn, and NumPy.
    * I have done data analysis and visualization using R and its libraries, including dplyr, ggplot2, knitr, and caret.
    Apache Spark
    Data Extraction
    Data Warehousing
    Data Cleaning
    Big Data
    Machine Learning Model
    Data Analysis
    Python
    Data Model
    Data Science
    Statistics
    Machine Learning
    Web Application
    Analytics Dashboard
    Data Analytics & Visualization Software
    Apache Hadoop
    Data Entry
    Data Collection
    R
    Data Scraping
    Interactive Data Visualization
    Data Analytics
    Data Visualization
    SQL
  • $20 hourly
    As a data enthusiast, I have always enjoyed working in challenging, learning-oriented environments, applying my skills and knowledge to add value to the organization I represent and serve while continually upgrading those skills.
    Apache Spark
    AWS Glue
    AWS Lambda
    ETL Pipeline
    dbt
    Data Warehousing
    Databricks Platform
    Microsoft Azure
    Data Modeling
    Apache Airflow
    Snowflake
    Python

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Spark Engineer near Kathmandu on Upwork?

You can hire an Apache Spark Engineer near Kathmandu on Upwork in four simple steps:

  • Create a job post tailored to your Apache Spark Engineer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Spark Engineer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Spark Engineer profiles and interview them.
  • Hire the right Apache Spark Engineer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Spark Engineer?

Rates charged by Apache Spark Engineers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Spark Engineer near Kathmandu on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Spark Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Spark Engineer team you need to succeed.

Can I hire an Apache Spark Engineer near Kathmandu within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Spark Engineer proposals within 24 hours of posting a job description.