Hire the best Apache Spark Engineers in Ahmedabad, IN

Check out Apache Spark Engineers in Ahmedabad, IN with the skills you need for your next job.
Clients rate Apache Spark Engineers 4.7 out of 5, based on 283 client reviews.
  • $70 hourly
    I have 12+ years of hands-on experience in big data technologies built on the Scala ecosystem, and I have designed the architecture for several projects. I'm proficient in Akka and Spark, which are currently a priority in my areas of interest, and I have experience across multiple domains. My skill set includes Scala, AWS, and Angular 2+.
    Scala stack:
    - Akka
    - Play Framework
    - Spray (Akka IO)
    - Spark
    Environment tools:
    - PostgreSQL / MongoDB / ElasticSearch
    - RabbitMQ
    - Kafka
    Front-end technologies:
    - Angular 2+
    I am GCP certified and have hands-on experience with multiple cloud providers, including GCP, AWS, and OpenShift. I have been working with AWS services for several years, and my recent projects include building Terraform modules for CI/CD pipelines with AWS CodePipeline, as well as:
    1. Designing and developing software applications
    2. Automating and managing infrastructure with Terraform
    3. Deploying and operating applications in the cloud
    4. Working with a variety of cloud technologies, including AWS, GCP, and OpenShift
    I am passionate about security and cost optimization, and I believe this sets me apart from other consultants. I am always looking for ways to improve the security of my clients' systems and reduce their costs. I am a highly skilled and experienced IT professional with a passion for cloud computing, and I am confident I have the skills and knowledge to be a valuable asset to your team. My focus is on building practical business solutions, expanding my knowledge, and taking part in challenging projects. I am dedicated to exceeding your expectations with the highest-quality solutions, delivered on time and to your precise needs.
    Apache Spark
    Angular 4
    Scala
    Akka
    Apache Cassandra
    Apache Kafka
    MongoDB
  • $40 hourly
    I am Aliabbas Bhojani, a Data Engineer with deep knowledge and experience in the core areas of data engineering, big data processing, and cloud data architecture. I completed my Bachelor of Engineering with a specialisation in Computer Engineering, which has helped me tackle complex data problems, and I have proven my expertise by proposing high-performance cloud data architectures that help businesses scale. I'm familiar with a wide variety of web platforms and infrastructure, so don't be afraid to run something by me for technologies like Apache Spark, Apache NiFi, Kafka, Apache Accumulo, Apache HBase, ZooKeeper, REST APIs, Java, Python, Scala, and JavaScript. I can work on your on-premises or cloud-deployed solution, whether that means setting up Kubernetes, Docker, or VMs on Azure, Amazon Web Services (AWS), or Google Cloud Platform (GCP).
    Wide spectrum of offerings:
    - Data engineering core values
    - Data-driven business intelligence
    - Automated real-time data pipelines
    - Advanced machine learning-based data analytics
    - Relational and non-relational data modelling
    - Cloud-native data products
    - Big data handling with Apache Spark and Apache NiFi
    - Open-source data tools usage and mindset
    - AWS cloud data architecture and engineering
    - Azure cloud data architecture and engineering
    - GCP cloud data architecture and engineering
    - Scaling data pipelines with Kubernetes and Docker
    - No-downtime data pipelines using a cloud-agnostic approach
    Feel free to reach out with any inquiries or for project discussion. Aliabbas Bhojani
    Apache Spark
    Snowflake
    Cloud Architecture
    Data Lake
    Apache Accumulo
    ETL
    DevOps
    Machine Learning
    PySpark
    Apache NiFi
    Python
    Java
    SQL
    Data Engineering
    Apache Hadoop
  • $30 hourly
    I love data and statistics. I came to the data science and analysis field while doing my B.Tech in computer science, where I found it fascinating how data can help organizations make better decisions about their business operations. Ever since I was a kid, I have tried to find patterns in things, be it a simple card game or a complex video game. Following this interest, I came across the field of data science, which I found a perfect fit. I pursued my B.Tech in Computer Science with a specialization in Big Data Analytics offered by IBM. My strongest areas of work are data engineering and data analytics.
    Tools and technologies in which I am proficient:
    Data exploration and preprocessing
    * Python, Scala
    * PySpark, Apache Spark
    * Linux command-line tools like CSVKit
    * SQL, MongoDB, Redis
    Business intelligence tools
    * Power BI
    * Tableau
    * Google Data Studio
    * IBM Cognos BI
    * Apache Superset
    Domains I have worked with:
    * Finance
    * Logistics
    * Pharma
    I have worked on 40+ data science and engineering projects. My contributions to those projects mainly included:
    - Building the data architecture for each project with the right mix of open-source and cloud tools.
    - Designing data warehouses and developing robust ETL processes for continuous data availability.
    - Building scalable machine learning models (mainly for user behavioral analysis).
    - Building NLP models for problems like daily news insights, chatbots, and prescriptive analytics reports.
    - Developing visualization reports in different BI tools that support decision-making while adding value to daily operations and business metrics.
    Apache Spark
    Microsoft Power BI
    Machine Learning
    Data Engineering
    Linux
    Apache Spark MLlib
    Big Data
    MongoDB
    Apache OpenNLP
    PySpark
    Web Scraping
    ETL Pipeline
  • $12 hourly
    AWS Certified Data Engineer | PySpark, Glue, ETL Pipelines, and Data Warehousing Expert
    Hello! I'm a certified AWS Data Analytics professional with expertise in designing and building scalable ETL pipelines using PySpark, AWS Glue, and Databricks. I specialize in data engineering across various domains, leveraging powerful data warehouses like Redshift and relational databases such as MySQL and PostgreSQL. With strong programming skills in Python, I help businesses turn raw data into actionable insights.
    Key skills:
    - PySpark and Glue for seamless big data processing
    - Redshift, MySQL, and PostgreSQL for data storage and optimization
    - Python for data transformation and automation
    - Advanced ETL pipeline development
    - Data migration using AWS DMS
    With hands-on experience across industries and a focus on efficiency and scalability, I am ready to help you achieve your data goals. Let's work together!
    Apache Spark
    Apache Airflow
    Amazon Athena
    Amazon S3
    Data Engineering
    Amazon QuickSight
    Data Migration
    Databricks Platform
    PySpark
    Amazon Redshift
    Python
    PostgreSQL
    Data Analysis
    SQL
    AWS Glue
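The extract-transform-load work that profiles like the one above describe follows a common three-stage shape. As a hypothetical sketch only, in plain Python with invented records and rules (not the actual Glue or PySpark APIs), a pipeline chains the stages like this:

```python
# Hypothetical ETL sketch. All record names and cleaning rules are invented
# for illustration; a real pipeline would read from and write to S3, MySQL,
# Redshift, etc. rather than in-memory lists and dicts.

def extract():
    # Stand-in for reading raw rows from a source system.
    return [
        {"id": 1, "amount": "120.50", "country": "IN"},
        {"id": 2, "amount": None, "country": "IN"},   # bad row: missing amount
        {"id": 3, "amount": "75.00", "country": "US"},
    ]

def transform(rows):
    # Drop rows with missing amounts and cast amount strings to floats.
    return [
        {**row, "amount": float(row["amount"])}
        for row in rows
        if row["amount"] is not None
    ]

def load(rows, warehouse):
    # Stand-in for writing cleaned rows to a warehouse table, keyed by id.
    for row in rows:
        warehouse[row["id"]] = row
    return warehouse

warehouse = load(transform(extract()), {})
```

After the run, the warehouse holds only the two clean rows; the row with the missing amount was filtered out in the transform stage.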
  • $15 hourly
    Key skills & expertise:
    # Data engineering: proficient in ETL pipeline development, data modeling, and real-time analytics using PySpark, Apache Spark, and Scala.
    # Cloud solutions: skilled in AWS (S3, Glue, Redshift, Lambda, CloudWatch) and Microsoft Azure for building robust, cloud-based data architectures.
    # Big data technologies: hands-on experience with Hadoop, HDFS, Apache Hive, and Kafka for managing high-volume data.
    # Database management: expert in SQL, MongoDB, and Aurora with strong optimization skills.
    # Project delivery: adept at end-to-end project management, including client requirements gathering, agile methodologies, and unit testing.
    Notable achievements:
    # Designed and implemented a real-time cost analysis system for a petrochemical client, optimizing over 450 data models using Hive and PySpark.
    # Automated data pipelines for insurance billing eligibility, increasing processing efficiency by 40%.
    # Architected a comprehensive ETL framework for a leading Vietnamese bank, enabling seamless data transformation and validation in AWS.
    Certifications:
    - AWS Cloud Practitioner
    - AWS Data Engineer Associate
    Why work with me? I am committed to delivering high-quality solutions tailored to your business needs. With a proven track record of improving data workflows, enhancing reporting accuracy, and scaling data platforms, I'm ready to help you turn your data challenges into opportunities.
    Let's collaborate! Are you looking for a data expert who can seamlessly integrate technical proficiency with business acumen? Let's discuss how I can add value to your project!
    Apache Spark
    ETL Pipeline
    Machine Learning
    NoSQL Database
    Jira
    AWS Lambda
    Amazon S3
    Amazon QuickSight
    Terraform
    AWS Glue
    Amazon Athena
    PySpark
    Hive Technology
    Python
    SQL
  • $30 hourly
    I am a software developer interested in distributed computing, and I love solving big data problems using Spark, PySpark, Scala, Databricks, and other modern data technologies.
    Apache Spark
    Amazon S3
    PySpark
    Apache Airflow
    AWS Glue
    Scala
    Databricks Platform
    Hive
    Python
    SQL
    ETL Pipeline
  • $20 hourly
    Hi there! I'm a passionate and results-driven Data Engineer and Data Consultant with a knack for transforming raw data into actionable insights. With a blend of technical expertise and analytical prowess, I thrive on solving complex data challenges and empowering businesses to make data-driven decisions. I specialize in designing, building, and maintaining data pipelines, ensuring that your data is collected, processed, and stored efficiently. Whether it's structured or unstructured data, I'm your go-to expert in creating robust data architectures.
    Apache Spark
    React
    Rust
    Flask
    Data Warehousing & ETL Software
    Data Modeling
    Data Analytics
    Python
    Scala
    Apache Kafka
    AWS Glue
    Databricks Platform
    Apache Airflow
    ETL Pipeline
    SQL
  • $20 hourly
    Cloud Data Architect with 10+ years of experience
    * Programming: Scala, Java, Python
    * Web scraping: Selenium + Beautiful Soup
    * Big data technologies: Hadoop, Spark, Hive, Impala, HBase
    * Streaming technologies: Kafka, NiFi, Spark Streaming, Kafka Connect, Kafka Streams, Kafka SQL, Kafka REST Proxy, IBM MQ, Kafka monitoring
    * Reporting: Tableau, Kibana, Grafana
    * DB technologies: Teradata, Greenplum, SQL Server, MySQL, MongoDB, Elasticsearch (ELK)
    * Accomplishments:
    - Implemented a data warehousing solution that enables fast, accurate retrieval of data for business intelligence and analytics.
    - Developed and deployed data analytics and machine learning models in production.
    - Gold medalist
    Apache Spark
    AWS Lambda
    ETL Pipeline
    Hive
    Apache Druid
    Data Engineering
    Apache NiFi
    Amazon Redshift
    Kubernetes
    AWS Glue
    Apache Hadoop
    Elasticsearch
    SQL
    Apache Kafka
    Python
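Streaming stacks like the Kafka and Spark Streaming tools listed above mostly reduce to one core idea: grouping an unbounded event stream into time windows and aggregating each window. As a minimal, library-free illustration (event data and window size invented, not any real Kafka or Spark API), tumbling-window counting looks like this:

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed-size tumbling windows
    and count occurrences of each key per window."""
    counts = {}
    for ts, key in events:
        # Each event belongs to exactly one window, identified by its start time.
        window_start = (ts // window_seconds) * window_seconds
        counts.setdefault(window_start, Counter())[key] += 1
    return counts

# Invented sample events: (seconds-since-start, event type).
events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
result = tumbling_window_counts(events, 10)
# Window [0, 10) holds the first three events; window [10, 20) holds the last.
```

Real engines add the hard parts this sketch omits: late and out-of-order events, watermarks, and fault-tolerant state.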
  • $5 hourly
    Data cleaning: Spark and PySpark
    Data ETL: Spark and PySpark; I have also worked with Azure Data Factory for ETL processes
    Data profiling: I can create a structure for the available data based on the requirements of the project
    NoSQL databases: Accumulo; also exploring HBase and Hive
    Data migration: transferring data from one database to another with proper logic and without any data loss
    SQL databases: currently working with Microsoft SQL Server; I can also work with MySQL
    ML: I have completed a course in ML on Udemy and have worked on several models
    Apache Spark
    Java
    NoSQL Database
    Machine Learning
    PySpark
    Big Data
    Scala
    Python
    SQL
    ETL
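The "migration without any data loss" promise in the profile above usually comes down to copying rows and then verifying counts and contents on both sides. As a toy sketch (plain Python dicts standing in for source and target databases; all names invented), the verification step is the important part:

```python
def migrate(source, target):
    """Copy every row from source to target, then verify nothing
    was lost or altered. Dicts stand in for database tables here."""
    for key, row in source.items():
        target[key] = dict(row)  # copy, so later source edits don't leak in

    # Post-migration checks: row counts match and contents are identical.
    assert len(target) >= len(source), "row count mismatch after migration"
    assert all(target[k] == source[k] for k in source), "row contents differ"
    return target

source_db = {1: {"name": "Asha"}, 2: {"name": "Ravi"}}
target_db = migrate(source_db, {})
```

Real migrations do the same checks with row counts and checksums per table, plus spot comparisons of sampled rows.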
  • $20 hourly
    I am an experienced Odoo developer with more than 5 years of experience in Odoo development, customization, implementation, Odoo Studio, and other Odoo-related services. I am passionate about development, enjoy interacting with clients, and successfully deliver projects.
    👉 My expertise:
    - More than 5 years as a senior Odoo developer (v12 to v17.0)
    - Odoo Community and Enterprise editions
    - Odoo functional settings and configuration
    - Odoo.sh, AWS, Azure, Google Cloud, OVH, and other self-hosted services
    - Odoo Sales & CRM implementation
    - Odoo Point of Sale implementation
    - Odoo Manufacturing implementation
    - Odoo Purchase Management implementation
    - Odoo Inventory Management implementation
    - Odoo Accounting implementation
    - Odoo eLearning implementation
    - Odoo eCommerce implementation
    - Odoo HR & Payroll implementation
    - Odoo custom module development
    - Odoo-integrated Android applications
    - Odoo-integrated iOS applications
    👉 API integration with various platforms:
    - Facebook Leads integration
    - Google Leads integration
    - Amazon integration
    - Payment gateways: Stripe, Mangopay, Razorpay
    👉 Shipping methods:
    - FedEx
    - DHL
    - UPS
    👉 Odoo data migration and implementation:
    - Migration from Microsoft Dynamics 365 Business Central to Odoo
    - Migration from Sage to Odoo
    - Migration from QuickBooks to Odoo
    - Migration from Xero to Odoo
    - Migration from SugarCRM to Odoo
    - Migration from Salesforce to Odoo
    - Database migration
    - Odoo electronic invoice integration
    - Creating/modifying email templates and web templates
    - Odoo Shopify connector development
    - Odoo website portal
    - Designing, coding, testing, debugging, and documenting software according to functional requirements
    - Server management: installing Odoo and creating configuration and service files
    - GitHub repo management for better code review
    - PostgreSQL, Odoo databases and queries, Odoo multi-company settings and management
    - Integration with third-party apps, external systems, and mobile applications
    - Translation of the Odoo system with PO files
    - Training for end users
    - Odoo consultations
    For Odoo projects, I'm excited to provide the expertise your business needs to succeed. I believe this will result in increased sales and expand your company's reach. Let's start with a free consultation about your project without delay.
    Apache Spark
    AWS Cloud9
    Ubuntu
    Odoo Administration
    HTML5
    jQuery
    JavaScript
    SQL
    Python
    Odoo Development
    Odoo

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Spark Engineer near Ahmedabad on Upwork?

You can hire an Apache Spark Engineer near Ahmedabad on Upwork in four simple steps:

  • Create a job post tailored to your Apache Spark Engineer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Spark Engineer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Spark Engineer profiles and interview your favorites.
  • Hire the right Apache Spark Engineer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Spark Engineer?

Rates charged by Apache Spark Engineers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Spark Engineer near Ahmedabad on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Spark Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Spark Engineer team you need to succeed.

Can I hire an Apache Spark Engineer near Ahmedabad within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Spark Engineer proposals within 24 hours of posting a job description.