Hire the best Apache Spark Engineers in Ghaziabad, IN

Check out Apache Spark Engineers in Ghaziabad, IN with the skills you need for your next job.
Clients rate Apache Spark Engineers 4.7 out of 5, based on 283 client reviews.
  • $30 hourly
    I am a Java developer with more than 14 years of experience in the design, development, and implementation of small, medium, and large software projects, plus more than 5 years of experience in project leadership and Agile methodology.
    Technical skills:
    - Languages: Java, Python
    - Web services: SOAP/REST, Apache CXF, JAX-WS, JAX-RS, Unirest, SoapUI
    - Containers: Docker, Kubernetes
    - EIP technologies: Apache Camel
    - Caching: Redis, Infinispan
    - Spring Framework: Spring DI/IoC, Spring Data, Spring Scheduler, Spring MVC
    - JMS: Apache ActiveMQ
    - ORM: JPA, Hibernate
    - Web technologies: JSP, Servlets, EL, JSTL, JavaScript, jQuery, Struts, HTML, CSS
    - Application servers: Apache Tomcat, WildFly
    - Continuous integration: Jenkins
    - Reporting: JasperReports
    - Operating systems: Linux, Windows XP/NT/2000
    - RDBMS: MySQL, SQL Server 2005, PostgreSQL
    - Big data / NoSQL: Hadoop, Apache Flume, HBase, MongoDB, Spark, ZooKeeper
    - IDEs: IntelliJ, Eclipse
    - Source control: Mercurial, Git, SVN
    - Build tools: Maven, Apache Ant 1.6
    - Indexing frameworks: Apache Solr, Elasticsearch
    Apache Spark
    Python Script
    AWS Lambda
    Amazon API Gateway
    RESTful Architecture
    Data Structures
    AWS Glue
    Spring Framework
    Apache Hadoop
    Amazon DynamoDB
  • $8 hourly
    I am an experienced professional in data engineering and analytics. I have worked in domains such as BFSI, telecom, and consulting (Big 4). I am skilled in Databricks, PySpark, Spark SQL, Python (pandas, NumPy, scikit-learn), Excel, SQL, Tableau, AWS (SageMaker, Athena, Redshift, S3, Lambda), ML, NLP, data engineering and analytics, data visualization, ETL, and business analysis, as well as OpenAI and conversational AI (Lex, Dialogflow).
    Projects:
    1. Created end-to-end data pipelines on Databricks + Unity Catalog for a telecom client (PySpark, SQL).
    2. Conversational AI: helped improve a chatbot implementation on AWS Lex (Python).
    3. Helped a banking client build a k-means segmentation model on collections data and mapped the segments to regions for a better understanding of customer behaviour.
    4. Migrated 150+ SAS programs and 20+ credit risk reports from the SAS EG platform to SAS Viya (an AWS-cloud-based platform), then converted them to Python.
    5. Developed an ETL pipeline in Python that ingests data from various sources to power Tableau visualizations.
    6. Built end-to-end late-fee reporting pipelines for a renowned bank (PySpark, Python, big data, Hive).
    Apache Spark
    Big Data
    Data Warehousing
    AWS Glue
    Data Analysis
    Databricks Platform
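    The profile above describes PySpark filter-and-aggregate pipelines (for example, late-fee reporting grouped by region). As a rough illustration of what such a step does, here is a sketch in plain Python; the records, field names, and totals are invented for the example, and a real pipeline would use the PySpark DataFrame API instead.

    ```python
    # Hypothetical illustration of a filter-and-aggregate step a PySpark
    # pipeline might perform, written in plain Python so it runs anywhere.
    from collections import defaultdict

    # Invented sample records; a real job would read these from a data lake.
    records = [
        {"region": "north", "late_fee": 25.0},
        {"region": "north", "late_fee": 10.0},
        {"region": "south", "late_fee": 0.0},
        {"region": "south", "late_fee": 40.0},
    ]

    # Keep rows with a positive late fee, then sum fees per region -- roughly
    # df.filter(col("late_fee") > 0).groupBy("region").sum("late_fee") in PySpark.
    totals = defaultdict(float)
    for row in records:
        if row["late_fee"] > 0:
            totals[row["region"]] += row["late_fee"]

    print(dict(totals))  # {'north': 35.0, 'south': 40.0}
    ```

    The difference in practice is scale: Spark distributes exactly this kind of filter/group/sum work across a cluster instead of a single loop.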
  • $25 hourly
    Experienced data engineer with a demonstrated history of working in the information technology and services industry. Skilled in data and ML engineering. Strong engineering professional with a Master's degree focused in Software
    Apache Spark
    Amazon SageMaker
    Amazon S3
    Data Warehousing
    Google AutoML
    AWS Glue
    Apache Kafka
    Data Migration
    Data Lake
    Amazon Redshift
  • $15 hourly
    I am a seasoned senior Java and JavaScript developer with over 8 years of hands-on experience, including 3+ years in a team-lead role. My expertise spans multiple domains, and I am particularly confident handling big data, designing low-latency microservices, managing networking, and working on code generation and compilation.
    Key strengths:
    ✅ Technical leadership: led and managed a diverse team for more than 5 years, following Agile methodologies; developed solutions with a keen focus on robustness, scalability, and efficiency.
    ✅ Domain expertise: extensive experience in the telecommunications sector, including a year-long tenure as a Level 3 support engineer working directly with large operators.
    Technological proficiency:
    ✅ Networking and protocols: HTTP/1.1, HTTP/2, WebSocket, SNMP, AMQP, DNS; Nginx with Lua scripting experience.
    ✅ Big data technologies: Kafka, RabbitMQ, ActiveMQ, ZooKeeper, Spark, Hadoop; proficient in OLAP for analytical processing.
    ✅ Cloud services: AWS (EC2, Lambda, Redshift, S3/Glacier), serverless architecture.
    ✅ Java development: Spring stack, security, Hibernate, Servlet, Jetty, Netty, OSGi, JAX-RS, JMS, GWT, Guava, Guice, JUnit; build automation with Maven, Ant, Gradle.
    ✅ JavaScript development: Node.js, Serverless, ES6, Ramda.
    ✅ Frontend development: Angular, React, Bootstrap, Ionic; build tools: Webpack, Parcel.
    ✅ Database management: Oracle, PostgreSQL, SQL Server, MySQL, Teradata, Vertica, SAP HANA, ClickHouse.
    ✅ NoSQL databases: MongoDB, Elasticsearch, Couchbase.
    How do we communicate and report? We prefer to keep our work transparent, so we stay in touch with our valued clients on a daily, weekly, and monthly basis. Depending on the requirements, we use a range of project management tools for communication on development projects: the Upwork message board, Jira, Azure DevOps Server, Asana, Trello, and Zoho Projects.
    Apache Spark
    AWS Development
    AWS Lambda
    Serverless Stack
    Apache Hadoop
    Eclipse Jetty
    Java Servlet API
    Apache Kafka
    Spring Boot
    Java Collections Framework
    Solution Architecture

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Spark Engineer near Ghaziabad on Upwork?

You can hire an Apache Spark Engineer near Ghaziabad on Upwork in four simple steps:

  • Create a job post tailored to your Apache Spark Engineer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Spark Engineer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Spark Engineer profiles and schedule interviews.
  • Hire the right Apache Spark Engineer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Spark Engineer?

Rates charged by Apache Spark Engineers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Spark Engineer near Ghaziabad on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Spark Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Spark Engineer team you need to succeed.

Can I hire an Apache Spark Engineer near Ghaziabad within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Spark Engineer proposals within 24 hours of posting a job description.