Hire the best Apache Spark Engineers in Jaipur, IN
Check out Apache Spark Engineers in Jaipur, IN with the skills you need for your next job.
- $30 hourly
- 4.6/5
- (22 jobs)
Highly experienced in the IT sector, including lead roles (10+ years in IT after a Master of Computer Applications (MCA)).
Strengths:
* Excellent code developer and solution leader.
* Enthusiastic about detecting and solving problems.
* Proactive in advising clients and committed to keeping my word.
* Driven by speed, optimization, bug fixing, and project scalability.
* With me, your company gains creativity and an extra edge over the competition.
General experience:
* 7 years as a software developer.
* 3 years as a senior developer.
* 2 years as a team leader.
Skills: Java, PHP, Angular, Vue, React, WordPress, Laravel, Hadoop. Master of Computer Applications; full-stack industry knowledge.
Companies and projects: Samsung (Team Lead), RBS (Royal Bank of Scotland) (Team Lead), NCR Corporation (Team Lead), Accenture (Developer), Honda Insurance (Sr. Developer).
Apache Spark, Oracle Database, Agile Software Development, Apache Hive, Hibernate, MongoDB, Scrum, Apache Hadoop, J2EE, Machine Learning, Git, Apache Struts 2, Web Service, Apache Kafka, Spring Framework
- $60 hourly
- 5.0/5
- (11 jobs)
Experienced Software Solutions Architect with a demonstrated history of working in the sports industry. Skilled in Amazon Web Services (AWS), DevOps, data structures, Apache Spark, blockchain, Elastic Stack (ELK), Android app development, and data streaming. Strong engineering professional with a Master of Computer Applications (MCA) focused in Computer Software Engineering.
Apache Spark, Amazon S3, Amazon EC2, AWS Elemental, Docker, Containerization, PostgreSQL, Terraform, React, NodeJS Framework, CI/CD, Blockchain Architecture, DevOps, Engineering & Architecture, Amazon Web Services
- $35 hourly
- 5.0/5
- (2 jobs)
I'm a seasoned Principal Engineer with extensive experience building high-performance, distributed systems for industry leaders like Mastercard, D. E. Shaw, and other top-tier MNCs in the Finance, Healthcare, and Pharma domains. Over the years, I've architected and developed scalable solutions across real-time data pipelines, ultra-low-latency systems, and large-scale B2B/B2C applications, all optimized for high throughput and reliability.
🛠 Core Skills & Technologies:
- Languages: Python, Java
- Big Data: Hadoop, Spark, Spark Streaming
- Cloud Platforms: AWS, Azure, GCP
- DevOps: Docker, Kubernetes, PCF
- Others: Data modeling, analytics tooling, Android app development
🚀 Key Achievements:
- Built a real-time stream processing pipeline handling 30+ billion events per month
- Developed a sensor ingestion system processing millions of events per minute
- Engineered APIs capable of supporting millions of concurrent users
- Created domain-specific data models and analytics tools using Python and modern data tech
- Developed and launched Android apps on the Play Store with strong user reviews and ratings
I specialize in taking complex business requirements and turning them into scalable, cloud-native systems. If you're looking for someone who can both lead technically and deliver hands-on, let's connect and make it happen.
Apache Spark, PostgreSQL, FastAPI, JavaScript, React, Big Data, Apache Kafka, Software Design, Docker, MongoDB, Akka, Amazon Web Services, SQL, Python, Java
- $15 hourly
- 5.0/5
- (6 jobs)
Hello, I have more than 5 years of experience architecting robust, large-scale data pipelines, optimizing data workflows, and delivering actionable insights that drive business growth. I am comfortable across the major data tools and cloud platforms, with expertise in PySpark, SQL, Python, Snowflake, dbt, Airflow, AWS, GCP, Azure, Databricks, and data modeling.
What I offer:
- Data pipeline development: from ingestion to transformation and storage, I design efficient pipelines tailored to your specific needs.
- Database management: expertise in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB) for seamless data management.
- Big data technologies: proficiency in Hadoop, Spark, and other big data frameworks for processing large-scale datasets.
- Data warehousing: building and maintaining data warehouses (e.g., Amazon Redshift, Google BigQuery) for advanced analytics.
- ETL processes: implementing Extract, Transform, Load processes to ensure data integrity and reliability.
Why choose me?
- Proven expertise: over 5 years solving complex data challenges across various industries.
- Quality assurance: high-quality deliverables that meet your expectations and business goals.
- Client satisfaction: clear communication and projects delivered on time and within budget.
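As a rough, plain-Python sketch of the Extract-Transform-Load pattern described above (the record fields and cleaning rules here are hypothetical; a real pipeline would run on Spark or be orchestrated by Airflow rather than iterate in-memory lists):

```python
# Minimal ETL sketch. Field names ("user", "amount") and the
# cleaning rules are hypothetical, chosen only to illustrate the pattern.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: drop incomplete rows and normalize types and casing."""
    for r in records:
        if r.get("amount") is None:
            continue  # data-quality rule: skip incomplete records
        yield {"user": r["user"].strip().lower(), "amount": float(r["amount"])}

def load(records):
    """Load: aggregate into a destination store (a dict standing in for a table)."""
    table = {}
    for r in records:
        table[r["user"]] = table.get(r["user"], 0.0) + r["amount"]
    return table

raw = [{"user": " Alice ", "amount": "10"},
       {"user": "bob", "amount": None},
       {"user": "alice", "amount": "5"}]
warehouse = load(transform(extract(raw)))
print(warehouse)  # {'alice': 15.0}
```

The three stages are kept as separate generators so each can be swapped out independently, which mirrors how production pipelines separate ingestion, transformation, and storage.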
Apache Spark, Apache Airflow, Amazon S3, AWS Glue, Java, Databricks Platform, AWS Lambda, Docker, Load Testing, PostgreSQL, Big Data, ETL, Python, ETL Pipeline, SQL
- $20 hourly
- 4.7/5
- (6 jobs)
Big Data Engineer with strong knowledge of Java, Python, Apache Spark, Hive, HDFS, SQL, REST APIs, and more. My experience includes working with Databricks and several cloud platforms, with a focus on data modelling, warehousing, and Delta Lake implementations. I excel at designing and developing scalable data solutions, optimising ETL pipelines, and managing a variety of data sources and formats. My background involves developing efficient data storage solutions, integrating complex systems, and providing high-performance data processing and transformation in both cloud and on-premises environments.
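Behind the high-performance data processing mentioned above, engines like Spark route each record to a partition by key hash during a shuffle, so per-key work can run in parallel. A minimal plain-Python illustration of that routing rule (Spark's actual partitioner differs in its hash function, but the idea is the same):

```python
# Sketch of Spark-style shuffle routing: partition = hash(key) % numPartitions.
# Plain Python, for illustration only.

def partition_for(key, num_partitions):
    # Route a key deterministically to one of num_partitions buckets.
    return hash(key) % num_partitions

def shuffle(records, num_partitions):
    """Group (key, value) pairs into partitions by key hash."""
    parts = [[] for _ in range(num_partitions)]
    for key, value in records:
        parts[partition_for(key, num_partitions)].append((key, value))
    return parts

records = [("a", 1), ("b", 2), ("a", 3), ("c", 4)]
parts = shuffle(records, 4)
# Every record with the same key lands in the same partition, so a
# per-key aggregation or join can then run partition-locally.
```

This is why the shuffle-partition count is a central tuning knob in Spark: too few partitions underuse the cluster, too many add scheduling overhead.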
Apache Spark, Microsoft Azure, Databricks Platform, Data Profiling, Apache Kafka, JSON, CSV, Parquet, Hive, PySpark, Data Engineering, Apache Hadoop, SQL, Python, Java
- $50 hourly
- 0.0/5
- (0 jobs)
Experienced DW/BI Technical Lead with 5+ years in Azure ETL and data modeling across cloud and open-framework OLTP/OLAP environments. I have previously worked on:
- Azure Data Factory
- Azure Databricks
- PySpark
- T-SQL
- Building lakehouse architectures
- Orchestrating ETL pipelines
- Enhancing existing data platforms with data governance using Azure Purview and Unity Catalog
Apache Spark, Microsoft Azure SQL Database, Data Lake, PySpark, Data Migration, ETL Pipeline, Data Warehousing & ETL Software, Transact-SQL, Microsoft Azure, Databricks Platform
- $39 hourly
- 4.9/5
- (21 jobs)
You can set up a free consultation at: calendly.com/gaurav-soni226/gaurav-consultation-1-1
Hello, I am a Data Architect and Big Data Engineer with extensive experience building large-scale analytics solutions, from solution architecture design through implementation and subsequent maintenance, including building and managing cloud infrastructure.
EXPERIENCE
9+ years working in data warehousing, ETL, cloud computing (Google Cloud Platform & AWS), and real-time streaming.
MY TOP SKILLS
- Python, Java, Scala, SQL, T-SQL, HQL
- Apache Spark, Flink, Kafka, NiFi, Hive, Presto, Apache Beam (Dataflow)
- Azure: Azure Databricks, Azure Data Factory, Azure Synapse, Azure Data Warehouse
- GCP: Google Dataproc, BigQuery, BigTable, Cloud Storage, Cloud Pub/Sub
- AWS: EMR, Redshift, DynamoDB, AWS Glue, AWS Athena, Kinesis Streams, S3
- File formats: Parquet, Avro, CSV, JSON
- Other: Data migration, Snowflake, Pandas, PyArrow, Delta Lake
- Cloud infra: Kubernetes, GKE, Azure Kubernetes Service, EC2, Lambda functions
House of Apache Spark:
- Spark job tuning: executors, cores, memory, shuffle partitions, data skewness
- Spark SQL: Catalyst optimizer and Tungsten optimizer
- Spark MLlib: machine learning with PySpark
- Streaming: Spark Structured Streaming (DataFrames), Spark Streaming (RDDs)
Data stores:
- SQL: PostgreSQL, MySQL, Oracle, Azure SQL, DynamoDB
- NoSQL: Cassandra, Elasticsearch ILM, OpenSearch ISM, MongoDB, HBase
- File systems: HDFS, object storage, block storage (Azure Blob, AWS S3)
Data orchestrators:
- Apache Airflow, Apache Oozie Workflow, Azkaban
Authentication:
- Azure Active Directory
- LDAP
- Kerberos
- SAML
Next Steps 👣
Requirements Discussion + Prototyping + Visual Design + Backend Development + Support = Success!
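The data-skewness item under Spark job tuning above usually means one hot key overloading a single task; key salting is a standard mitigation. A plain-Python sketch of the two-stage (salted) aggregation idea, with hypothetical data:

```python
# Key-salting sketch: spread one hot key across N sub-keys so partial
# sums can be computed in parallel, then merge the sub-keys.
# Plain Python; data and salt count are hypothetical.
import random

SALTS = 4

def salted_key(key):
    # Stage 1: fan a hot key out over SALTS sub-keys at random.
    return (key, random.randrange(SALTS))

def aggregate(records):
    """Two-stage sum: partial sums per salted key, then a final merge per key."""
    partial = {}
    for key, value in records:
        sk = salted_key(key)
        partial[sk] = partial.get(sk, 0) + value  # stage-1 partial aggregate
    final = {}
    for (key, _salt), subtotal in partial.items():
        final[key] = final.get(key, 0) + subtotal  # stage-2 merge across salts
    return final

hot = [("hot_key", 1)] * 1000 + [("cold_key", 1)]
totals = aggregate(hot)
print(totals)  # {'hot_key': 1000, 'cold_key': 1}
```

In Spark itself, the same effect is typically achieved by appending a random salt column before the first groupBy and aggregating a second time without it.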
Apache Spark, Data Migration, Amazon Redshift, Apache Hadoop, PySpark, Microsoft Azure SQL Database, AWS Glue, Azure Synapse, Apache Airflow, Data Modeling, NoSQL Database, Databricks Platform, Scala, Apache Hive, Azure IoT Hub, Elasticsearch, SQL, Python
- $30 hourly
- 4.9/5
- (17 jobs)
⭐️⭐️⭐️⭐️⭐️ Prepare to meet your Salesforce Developer expert!
🏆 In the top 1% of all Upwork talent, Expert-Vetted through a screening process 🏆
With over 8 years of mastering Salesforce development, I specialize in crafting seamless solutions that harmonize with the Salesforce ecosystem, spanning Sales, Service, Experience, Non-Profit, NPSP, and Marketing Clouds. My skill set extends to Lightning Web Components development, where I infuse efficiency and innovation into your projects. I can assist with everything from simple configurations to complex Salesforce buildouts and integrations. My experience spans companies of all sizes, from 2 users to over 12,000.
Expert-Vetted Salesforce freelancer on Upwork with 8 years of experience in Salesforce, 10,000 hours, and 50+ projects.
✅ Current location: Jaipur, India | Specializing in Salesforce Sales Cloud, Service Cloud, Pardot, Community Cloud, Experience Cloud, Non-Profit Cloud, CPQ, QuickBooks, Marketing Cloud, Einstein Analytics, and many more.
✅ Salesforce Development Services:
- Apex classes
- Apex triggers
- Flows
- Visualforce pages
- LWC pages
- Lightning Aura
- Integrations (WSDL / JSON / XML)
- REST API
- Web services
- Developing AppExchange products
- jQuery, JavaScript
✅ Salesforce Consulting Services:
- AppExchange tools
- Administrator and end-user training documents
- Consulting on moving to Salesforce.com
- Go-live documents
✅ Salesforce Administration Services:
- Configuring vanilla Salesforce.com
- Object/field creation
- Setting up users, roles & profiles
- Creating flows
- Loading data via Data Loader
- Various reports & dashboards
- Chatter functionality
- Data cleansing
- Setting up campaigns & email marketing tools
Why me:
✅ Client-centered: I prioritize client satisfaction, value, and trust.
✅ Over-delivering: I consistently exceed client expectations.
✅ Responsive: I maintain open communication for efficient service.
✅ Resilient: I tackle and resolve client issues effectively.
✅ Kind: I treat everyone with respect and aim to improve their situations.
#salesforce #sfdc #lightning #marketingcloud #einsteinai #api #certifieddeveloper #salescloud #pardot #upworktopfreelancer #mostseller #toprated #servicecloud
Apache Spark, Salesforce CPQ, Java, Scala, Salesforce Marketing Cloud, Salesforce Sales Cloud, Visualforce, Apex, Big Data, Salesforce Wave Analytics, R, Salesforce CRM, Salesforce Lightning, Salesforce App Development, Salesforce Service Cloud
- $30 hourly
- 0.0/5
- (0 jobs)
I'm Gulshan, an Associate Data Engineer with over 3 years of experience building and managing data pipelines. I specialize in ETL processes, data ingestion, and transformation using Databricks and Azure services. My work focuses on ensuring data is reliable, secure, and accessible so businesses can make informed decisions. I'm also skilled in data governance with Unity Catalog on Databricks, managing data access and security. With expertise in SQL and Python, I automate workflows and optimize data systems, helping organizations get the most out of their data.
Apache Spark, Data Modeling, Data Migration, Data Lake, Azure Service Fabric, SQL, Databricks Platform, Azure Cosmos DB, Algorithms, Data Structures, pandas, PySpark, Data Analysis, Data Engineering, Python
- $30 hourly
- 0.0/5
- (0 jobs)
🚀 Data Engineering Expert | ETL, Big Data, Cloud (Azure & AWS)
I am a Senior Data Engineer with a proven track record in designing scalable ETL pipelines, optimizing big data workflows, and implementing cloud-based data solutions. With expertise in Azure data services (Databricks, Data Factory, Synapse Analytics) and ETL tools such as IBM DataStage and Apache Airflow, I help businesses turn raw data into meaningful insights.
Key Skills:
✅ ETL Development & Data Pipelines (Azure Data Factory, Apache Airflow, IBM DataStage)
✅ Big Data Processing (Spark, Hadoop, Databricks)
✅ Cloud Data Solutions (Azure, AWS)
✅ Data Warehousing & Modeling (Star Schema, Snowflake Schema)
✅ Database Management (SQL)
✅ CI/CD & Automation (Jenkins, Ansible, Docker, Kubernetes)
If you're looking for a reliable, efficient, and highly skilled data engineer to build or optimize your data infrastructure, let's connect! 🚀
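The star-schema modeling mentioned above centers on a fact table joined to dimension tables. A minimal, self-contained sketch using SQLite from the Python standard library (table and column names are hypothetical):

```python
# Star-schema sketch: one fact table (sales events) keyed to one
# dimension table (products). Names are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (product_id INTEGER REFERENCES dim_product(product_id),
                          amount REAL);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
# The typical star-schema query shape: join the fact table to its
# dimension, then aggregate by a dimension attribute.
rows = con.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

Keeping descriptive attributes in dimensions and measures in the fact table is what lets these queries stay simple as the schema grows.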
Apache Spark, Ansible, Docker, Python, SQL, Apache Airflow, Apache Hadoop, PySpark, IBM DataStage, Data Warehousing, Data Engineering, Databricks Platform, ETL Pipeline, Microsoft Azure, ETL
- $30 hourly
- 4.4/5
- (37 jobs)
I am a highly proficient Java developer and software engineer with proven expertise in object-oriented analysis and design, and an exceptional record overseeing all facets of the software development life cycle, from analysis and design to implementation and maintenance. I have 8.3 years of experience in requirement gathering, analysis, conceptualization, design, development, coding technical solutions using Java/J2EE technology stacks, Java web services, deployment, testing, documentation, and maintenance of Java/J2EE web applications.
Strong programming skills in designing and implementing multi-tier applications using Java, J2EE, the ELK stack, Kafka, JDBC, JSP, JSTL, HTML, Spring 3/4/5 (annotations and the MVC framework), Hibernate 4/5, ORM frameworks, Java design patterns, Ext JS, AngularJS, jQuery, jQuery Mobile, JavaScript, Servlets, and CSS.
Strong experience designing, building, deploying, maintaining, and enhancing ELK platforms. Experienced with Elasticsearch indices and APIs, Kibana dashboards, Logstash, and Beats, and with using or creating ELK plugins such as authentication and authorization plugins. Good knowledge of Shell, PowerShell, etc.
Diverse experience utilizing Java tools in business, web, and client-server environments, including Java Platform, Enterprise Edition (Java EE), JSP, Java Servlets, Spring, Hibernate, and Java Database Connectivity (JDBC) technologies.
Strong documentation skills; I have created technical design documents, project specification documents, and unit test plan documents to ISO 9001 quality standards.
Effective communicator who works well with people in different roles, from business partners and technical specialists to end users. Strong analytical, technical, and logical ability to work independently or in a team, with the ability to coordinate and provide timely business solutions.
Strong communication skills, enthusiastic and self-driven, with a short learning curve and a strong team spirit, along with a high degree of commitment.
Apache Spark, Linux, Angular 6, ELK Stack, Splunk, Apache Kafka, Web Service, Java, Apache Camel, Spring Boot, Hibernate, Docker
- $12 hourly
- 0.0/5
- (0 jobs)
I have data analysis experience building creative dashboards to present data. I know Python, SQL, Excel, Power BI, NumPy, pandas, and Apache Spark.
Apache Spark, Seaborn, Matplotlib, Pandas, AI, Python NumPy, FastAI, Apache Hadoop, Microsoft Excel PowerPivot, MySQL, Microsoft Power BI Data Visualization, Data Mining, Data Analysis
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.
How do I hire an Apache Spark Engineer near Jaipur, IN on Upwork?
You can hire an Apache Spark Engineer near Jaipur, IN on Upwork in four simple steps:
- Create a job post tailored to your Apache Spark Engineer project scope. We’ll walk you through the process step by step.
- Browse top Apache Spark Engineer talent on Upwork and invite them to your project.
- Once the proposals start flowing in, create a shortlist of top Apache Spark Engineer profiles and interview.
- Hire the right Apache Spark Engineer for your project from Upwork, the world’s largest work marketplace.
At Upwork, we believe talent staffing should be easy.
How much does it cost to hire an Apache Spark Engineer?
Rates charged by Apache Spark Engineers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.
Why hire an Apache Spark Engineer near Jaipur, IN on Upwork?
As the world’s work marketplace, we connect highly skilled freelance Apache Spark Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Spark Engineer team you need to succeed.
Can I hire an Apache Spark Engineer near Jaipur, IN within 24 hours on Upwork?
Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Spark Engineer proposals within 24 hours of posting a job description.