Hire the best Apache Spark Engineers in Jaipur, IN

Check out Apache Spark Engineers in Jaipur, IN with the skills you need for your next job.
Clients rate Apache Spark Engineers 4.7 out of 5, based on 283 client reviews.
  • $30 hourly
    Highly experienced IT professional with 10+ years in the sector, including lead roles, following a Master of Computer Applications (MCA).
    Strengths:
    * Excellent code developer
    * Solution leader
    * Enthusiastic problem detector and solver
    * Proactive in advising clients and committed to keeping my word
    * Driven for speed, optimization, bug fixing, and project scalability
    * With me, your company gains creativity and an extra edge over the competition.
    General experience:
    * 7 years as a software developer
    * 3 years as a senior developer
    * 2 years as a team leader
    Skills: Java, PHP, Angular, Vue, React, WordPress, Laravel, Hadoop. Master of Computer Applications with full-stack knowledge of the industry.
    Companies and projects:
    * Samsung - Team Lead
    * RBS (Royal Bank of Scotland) - Team Lead
    * NCR Corporation - Team Lead
    * Accenture - Developer
    * Honda Insurance - Sr. Developer
    Featured Skill Apache Spark
    Oracle Database
    Agile Software Development
    Apache Hive
    Hibernate
    MongoDB
    Scrum
    Apache Hadoop
    J2EE
    Machine Learning
    Git
    Apache Struts 2
    Web Service
    Apache Kafka
    Spring Framework
  • $60 hourly
    Experienced Software Solutions Architect with a demonstrated history of working in the sports industry. Skilled in Amazon Web Services (AWS), DevOps, data structures, Apache Spark, blockchain, the Elastic Stack (ELK), Android app development, and data streaming. Strong engineering professional with a Master of Computer Applications (MCA) focused on computer software engineering.
    Featured Skill Apache Spark
    Amazon S3
    Amazon EC2
    AWS Elemental
    Docker
    Containerization
    PostgreSQL
    Terraform
    React
    NodeJS Framework
    CI/CD
    Blockchain Architecture
    DevOps
    Engineering & Architecture
    Amazon Web Services
  • $35 hourly
    I'm a seasoned Principal Engineer with extensive experience building high-performance, distributed systems for industry leaders like Mastercard, D. E. Shaw, and other top-tier MNCs in the Finance, Healthcare, and Pharma domains. Over the years, I’ve architected and developed scalable solutions across real-time data pipelines, ultra-low latency systems, and large-scale B2B/B2C applications—all optimized for high throughput and reliability.
    🛠 Core Skills & Technologies:
    Languages: Python, Java
    Big Data: Hadoop, Spark, Spark Streaming
    Cloud Platforms: AWS, Azure, GCP
    DevOps: Docker, Kubernetes, PCF
    Others: Data modeling, analytics tooling, Android app development
    🚀 Key Achievements:
    Built a real-time stream processing pipeline handling 30+ billion events per month
    Developed a sensor ingestion system processing millions of events per minute
    Engineered APIs capable of supporting millions of concurrent users
    Created domain-specific data models and analytics tools using Python and modern data tech
    Developed and launched Android apps on the Play Store with strong user reviews and ratings
    I specialize in taking complex business requirements and turning them into scalable, cloud-native systems. If you're looking for someone who can both lead technically and deliver hands-on—let’s connect and make it happen.
    Featured Skill Apache Spark
    PostgreSQL
    FastAPI
    JavaScript
    React
    Big Data
    Apache Kafka
    Software Design
    Docker
    MongoDB
    Akka
    Amazon Web Services
    SQL
    Python
    Java
  • $15 hourly
    Hello, I have more than 5 years of experience architecting robust, large-scale data pipelines, optimizing data workflows, and delivering actionable insights that drive business growth. I am comfortable with the major data tools and cloud platforms, with expertise in PySpark, SQL, Python, Snowflake, dbt, Airflow, AWS, GCP, Azure, Databricks, and data modeling.
    What I offer:
    Data Pipeline Development: From ingestion to transformation and storage, I design efficient pipelines tailored to your specific needs.
    Database Management: Expertise in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB) for seamless data management.
    Big Data Technologies: Proficiency in Hadoop, Spark, and other big data frameworks for processing large-scale datasets.
    Data Warehousing: Building and maintaining data warehouses (e.g., Amazon Redshift, Google BigQuery) for advanced analytics.
    ETL Processes: Implementing Extract, Transform, Load processes to ensure data integrity and reliability.
    Why choose me?
    Proven Expertise: Over 5 years solving complex data challenges across various industries.
    Quality Assurance: High-quality deliverables that meet your expectations and business goals.
    Client Satisfaction: Clear communication, with projects delivered on time and within budget.
    Featured Skill Apache Spark
    Apache Airflow
    Amazon S3
    AWS Glue
    Java
    Databricks Platform
    AWS Lambda
    Docker
    Load Testing
    PostgreSQL
    Big Data
    ETL
    Python
    ETL Pipeline
    SQL
  • $20 hourly
    Big Data Engineer with strong knowledge of Java, Python, Apache Spark, Hive, HDFS, SQL, REST APIs, and more. My experience includes working with Databricks and different cloud platforms, with a focus on data modelling, warehousing, and Delta Lake implementations. I excel at designing and developing scalable data solutions, optimising ETL pipelines, and managing a variety of data sources and formats. My background includes developing efficient data storage solutions, integrating complex systems, and delivering high-performance data processing and transformation in both cloud and on-premises environments.
    Featured Skill Apache Spark
    Microsoft Azure
    Databricks Platform
    Data Profiling
    Apache Kafka
    JSON
    CSV
    Parquet
    Hive
    PySpark
    Data Engineering
    Apache Hadoop
    SQL
    Python
    Java
  • $50 hourly
    Experienced DW/BI Technical Lead with 5+ years in Azure ETL and data modeling across cloud and open-framework OLTP/OLAP environments. I have previously worked on:
    Azure Data Factory
    Azure Databricks
    PySpark
    T-SQL
    Building lakehouse architecture
    Orchestrating ETL pipelines
    Enhancing existing data platforms with data governance using Azure Purview and Unity Catalog
    Featured Skill Apache Spark
    Microsoft Azure SQL Database
    Data Lake
    PySpark
    Data Migration
    ETL Pipeline
    Data Warehousing & ETL Software
    Transact-SQL
    Microsoft Azure
    Databricks Platform
  • $39 hourly
    You can set up a free consultation at: calendly.com/gaurav-soni226/gaurav-consultation-1-1
    Hello, I am a Data Architect and Big Data Engineer with extensive experience building large-scale analytics solutions, from solution architecture design through implementation and subsequent maintenance, including building and managing cloud infrastructure.
    EXPERIENCE
    9+ years working in data warehousing, ETL, cloud computing (Google Cloud Platform & AWS), and real-time streaming.
    MY TOP SKILLS
    - Python, Java, Scala, SQL, T-SQL, HQL
    - Apache Spark, Flink, Kafka, NiFi, Hive, Presto, Apache Beam (Dataflow)
    - Azure: Azure Databricks, Azure Data Factory, Azure Synapse, Azure Data Warehouse
    - GCP: Google Dataproc, BigQuery, Bigtable, Cloud Storage, Cloud Pub/Sub
    - AWS: EMR, Redshift, DynamoDB, AWS Glue, AWS Athena, Kinesis Streams, S3
    - File formats: Parquet, Avro, CSV, JSON
    - Other: Data migration, Snowflake, Pandas, PyArrow, Delta Lake
    - Cloud infra: Kubernetes, GKE, Azure Kubernetes Service, EC2, Lambda functions
    House of Apache Spark:
    - Spark job tuning: executors, cores, memory, shuffle partitions, data skew
    - Spark SQL: Catalyst optimizer and Tungsten optimizer
    - Spark MLlib: machine learning with PySpark
    - Streaming: Spark Structured Streaming (DataFrames), Spark Streaming (RDDs)
    Data stores:
    - SQL: PostgreSQL, MySQL, Oracle, Azure SQL, DynamoDB
    - NoSQL: Cassandra, Elasticsearch ILM, OpenSearch ISM, MongoDB, HBase
    - File systems: HDFS, object storage, block storage (Azure Blob, AWS S3)
    Data orchestrators:
    - Apache Airflow, Apache Oozie Workflow, Azkaban
    Authentication:
    - Azure Active Directory
    - LDAP
    - Kerberos
    - SAML
    Next Steps 👣 Requirements Discussion + Prototyping + Visual Design + Backend Development + Support = Success!
    Featured Skill Apache Spark
    Data Migration
    Amazon Redshift
    Apache Hadoop
    PySpark
    Microsoft Azure SQL Database
    AWS Glue
    Azure Synapse
    Apache Airflow
    Data Modeling
    NoSQL Database
    Databricks Platform
    Scala
    Apache Hive
    Azure IoT Hub
    Elasticsearch
    SQL
    Python
  • $30 hourly
    ⭐️⭐️⭐️⭐️⭐️ Prepare to meet your Salesforce Developer expert! 🏆 In the top 1% of all Upwork talent, Expert-Vetted through a screening process 🏆
    With over 8 years of mastering Salesforce development, I specialize in crafting seamless solutions that harmonize with the Salesforce ecosystem, spanning the Sales, Service, Experience, Non-Profit, NPSP, and Marketing Clouds. My skill set extends to Lightning Web Components development, where I bring efficiency and innovation to your projects. I can assist with everything from simple configurations to complex Salesforce buildouts and integrations. My experience spans companies of all sizes, from 2 users to over 12,000. Expert-Vetted Salesforce freelancer on Upwork with 8 years of Salesforce experience, 10,000+ hours, and 50+ projects.
    ✅ Current location: Jaipur, India | Specializing in Salesforce Sales Cloud, Service Cloud, Pardot, Community Cloud, Experience Cloud, Non-Profit Cloud, CPQ, QuickBooks, Marketing Cloud, Einstein Analytics, and many more.
    ✅ Salesforce Development Services:
    - Apex classes
    - Apex triggers
    - Flows
    - Visualforce pages
    - LWC pages
    - Lightning Aura
    - Integrations (WSDL / JSON / XML)
    - REST APIs
    - Web services
    - Developing AppExchange products
    - jQuery, JavaScript
    ✅ Salesforce Consulting Services:
    - AppExchange tools
    - Administrator and end-user training documents
    - Consulting on moving to Salesforce.com
    - Go-live documents
    ✅ Salesforce Administration Services:
    - Configuring vanilla Salesforce.com
    - Object/field creation
    - Setting up users, roles & profiles
    - Creating flows
    - Loading data via Data Loader
    - Various reports & dashboards
    - Chatter functionality
    - Data cleansing
    - Setting up campaigns & email marketing tools
    Why me:
    ✅ Client-centered: I prioritize client satisfaction, value, and trust.
    ✅ Over-delivering: I consistently exceed client expectations.
    ✅ Responsive: I maintain open communication for efficient service.
    ✅ Resilient: I tackle and resolve client issues effectively.
    ✅ Kindness: I treat everyone with respect and aim to improve their situations.
    Featured Skill Apache Spark
    Salesforce CPQ
    Java
    Scala
    Salesforce Marketing Cloud
    Salesforce Sales Cloud
    Visualforce
    Apex
    Big Data
    Salesforce Wave Analytics
    R
    Salesforce CRM
    Salesforce Lightning
    Salesforce App Development
    Salesforce Service Cloud
  • $30 hourly
    I'm Gulshan, an Associate Data Engineer with over 3 years of experience building and managing data pipelines. I specialize in ETL processes, data ingestion, and transformation using Databricks and Azure services. My work focuses on ensuring data is reliable, secure, and accessible so businesses can make informed decisions. I'm also skilled in data governance with Unity Catalog on Databricks, managing data access and security. With expertise in SQL and Python, I automate workflows and optimize data systems, helping organizations get the most out of their data.
    Featured Skill Apache Spark
    Data Modeling
    Data Migration
    Data Lake
    Azure Service Fabric
    SQL
    Databricks Platform
    Azure Cosmos DB
    Algorithms
    Data Structures
    pandas
    PySpark
    Data Analysis
    Data Engineering
    Python
  • $30 hourly
    🚀 Data Engineering Expert | ETL, Big Data, Cloud (Azure & AWS)
    I am a Senior Data Engineer with a proven track record in designing scalable ETL pipelines, optimizing big data workflows, and implementing cloud-based data solutions. With expertise in Azure data services (Databricks, Data Factory, Synapse Analytics) and ETL tools like IBM DataStage and Apache Airflow, I help businesses turn raw data into meaningful insights.
    Key Skills:
    ✅ ETL Development & Data Pipelines (Azure Data Factory, Apache Airflow, IBM DataStage)
    ✅ Big Data Processing (Spark, Hadoop, Databricks)
    ✅ Cloud Data Solutions (Azure, AWS)
    ✅ Data Warehousing & Modeling (Star Schema, Snowflake Schema)
    ✅ Database Management (SQL)
    ✅ CI/CD & Automation (Jenkins, Ansible, Docker, Kubernetes)
    If you’re looking for a reliable, efficient, and highly skilled data engineer to build or optimize your data infrastructure, let’s connect! 🚀
    Featured Skill Apache Spark
    Ansible
    Docker
    Python
    SQL
    Apache Airflow
    Apache Hadoop
    PySpark
    IBM DataStage
    Data Warehousing
    Data Engineering
    Databricks Platform
    ETL Pipeline
    Microsoft Azure
    ETL
  • $30 hourly
    I am a highly proficient Java developer and software engineer with proven expertise in object-oriented analysis and design and an exceptional record overseeing all facets of the software development life cycle, from analysis and design to implementation and maintenance. I have 8.3 years of experience in requirements gathering, analysis, conceptualization, design, development, coding technical solutions using Java/J2EE technology stacks and Java web services, deployment, testing, documentation, and maintenance of Java/J2EE web applications.
    Strong programming skills in designing and implementing multi-tier applications using Java, J2EE, the ELK stack, Kafka, JDBC, JSP, JSTL, HTML, Spring 3/4/5 (annotation and MVC framework), Hibernate 4/5, ORM frameworks, Java design patterns, Ext JS, AngularJS, jQuery, jQuery Mobile, JavaScript, Servlets, and CSS.
    Strong experience designing, building, deploying, maintaining, and enhancing ELK platforms. I have experience using Elasticsearch indices, Elasticsearch APIs, Kibana dashboards, Logstash, and Beats, and using or creating plugins for ELK such as authentication and authorization plugins. Good knowledge of Shell, PowerShell, etc.
    Diverse experience utilizing Java tools in business, web, and client-server environments, including Java Platform, Enterprise Edition (Java EE), JSP, Java Servlets, Spring, Hibernate, and Java Database Connectivity (JDBC) technologies.
    Strong documentation skills: I have created technical design documents, project specification documents, and unit test plan documents to ISO 9001 quality standards.
    Effective communicator who works well with people in different positions, from business partners and technical specialists to end users. Strong analytical, technical, and logical ability to work independently or in a team, with the ability to coordinate and provide timely business solutions. Strong communication skills, enthusiastic and self-driven, a short learning curve, and a strong teamwork spirit with a high degree of commitment.
    Featured Skill Apache Spark
    Linux
    Angular 6
    ELK Stack
    Splunk
    Apache Kafka
    Web Service
    Java
    Apache Camel
    Spring Boot
    Hibernate
    Docker
  • $12 hourly
    I have data analysis experience building creative dashboards to present data. I know Python, SQL, Excel, Power BI, NumPy, pandas, and Apache Spark.
    Featured Skill Apache Spark
    Seaborn
    Matplotlib
    PandasAI
    Python Numpy FastAI
    Apache Hadoop
    Microsoft Excel PowerPivot
    MySQL
    Microsoft Power BI Data Visualization
    Data Mining
    Data Analysis

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Spark Engineer near Jaipur, IN on Upwork?

You can hire an Apache Spark Engineer near Jaipur, IN on Upwork in four simple steps:

  • Create a job post tailored to your Apache Spark Engineer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Spark Engineer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Spark Engineer profiles and interview.
  • Hire the right Apache Spark Engineer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Spark Engineer?

Rates charged by Apache Spark Engineers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Spark Engineer near Jaipur, IN on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Spark Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Spark Engineer team you need to succeed.

Can I hire an Apache Spark Engineer near Jaipur, IN within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Spark Engineer proposals within 24 hours of posting a job description.