Hire the best Apache Spark Engineers in Delhi, IN

Check out Apache Spark Engineers in Delhi, IN with the skills you need for your next job.
Clients rate Apache Spark Engineers 4.7/5, based on 283 client reviews.
  • $30 hourly
    A software professional with 5 years of experience in design, coding, testing and production management using agile methodology. ● Expertise in multiple big data technologies, i.e. Apache Spark with Scala, data pipelines, Hortonworks, Azure Databricks, Azure Data Factory, Azure HDInsight, Apache Hadoop HDFS, MapReduce and Hive. ● Working knowledge of Azure SQL DW, Azure Analysis Services and reporting tools. ● Extensive work on data extraction, wrangling, transformation, cleansing, loading, real-time analytics and reporting of output data. ● Experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings and sessions. ● Experience in performance optimization, tuning, memory management and process improvements and enhancements at various levels. ● 8 months of on-site experience. ● AZ-900: Microsoft Azure Fundamentals certified. ● DP-200, DP-201: Microsoft Azure Data Engineer Associate certified. (An illustrative Spark ETL sketch follows this profile's skill tags.)
    Apache Spark
    Data Warehousing
    Chatbot
    Dashboard
    Big Data
    Mixpanel
    Azure Cosmos DB
    Microsoft Power BI
    Databricks Platform
    PySpark
    ETL
    ETL Pipeline
    SQL
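The profile above mentions ETL work on Apache Spark and Azure Databricks. Purely as an illustrative sketch (not this freelancer's code), the snippet below shows what a minimal PySpark extract-transform-load job of that kind can look like; the paths, column names and "sales" dataset are hypothetical placeholders.

```python
# Minimal illustrative PySpark batch ETL: extract raw CSV, cleanse, load as Parquet.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-etl-sketch").getOrCreate()

# Extract: read raw CSV files from a landing area
raw = spark.read.option("header", True).csv("/mnt/raw/sales/*.csv")

# Transform/cleanse: deduplicate, cast types, drop invalid rows
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Load: write date-partitioned Parquet for downstream reporting
clean.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/sales")

spark.stop()
```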
  • $100 hourly
    Dev completed his bachelor's degree in Electronics and Communication Engineering at SRM Institute of Science and Technology, Chennai. He has always been a data enthusiast, and his attention to detail and orderliness suit the field of analytics. He has worked in pharma analytics, forecasting and projecting drug sales and usage, and now works at Jio as a Data Engineer. He divides his time between work, reading, family, social welfare and his young Golden Retriever, Winter.
    Apache Spark
    Tableau
    Microsoft Power BI
    Microsoft Azure
    Apache Hadoop
    Google Cloud Platform
    AWS Development
    Apache NiFi
    Python
    SQL
    Apache Kafka
    Microsoft Excel
    Scala
    Visualization
    Data Analytics
  • $100 hourly
    I am a seasoned Senior Data Engineer with a passion for crafting robust data solutions that drive actionable insights and facilitate informed decision-making. With a solid background in data engineering, I excel in designing, developing, and maintaining scalable data pipelines, ensuring the smooth flow of data from various sources to its destination. Key skills: Python, PySpark, Spark, Hadoop, Hive, HDFS, Cloudera, data engineering, ETL development, data warehousing, big data technologies (Hadoop, Spark), database management (SQL, NoSQL), data modeling and optimization, cloud platforms (AWS, Azure, GCP), stream processing, data quality assurance, and data governance.
    Apache Spark
    Cloudera
    Apache Impala
    YARN
    Amazon S3
    Teradata
    Informatica
    Apache Hadoop
    Hive
    PostgreSQL
    Microsoft SQL Server
    SQL
    Apache Solr
    PySpark
    Python
  • $20 hourly
    DevOps Engineer with 3 years of experience in AWS/Azure, a professional attitude, dedication and a readiness to accept challenges. I have worked on various projects with international clients and diverse teams in a one-team model, following agile methodologies like Scrum and Kanban. My aim is to lead innovative projects and deliver the best cloud computing services globally. I have experience in:
    1. Writing infrastructure as code and provisioning it in the AWS cloud using CI/CD practices.
    2. Building CI/CD pipelines to deploy code artifacts from different code-versioning tools, e.g. GitHub.
    3. Building and deploying Docker images to the AWS/Azure cloud.
    4. Deploying Lambda functions and API Gateways using the Serverless Framework.
    5. Deploying infrastructure across multiple AWS/Azure accounts using a Service Catalog portfolio-and-products strategy and assume-role functionality via CI/CD pipeline stages.
    6. Performing custom processing in infrastructure scripts.
    AWS cloud skill set:
    • Creation of IAM users, groups, roles, cross-account assume roles, policies and fine-grained access control
    • EC2, bastion hosts and security groups
    • S3 buckets, bucket policies and triggers, cross-account bucket access
    • Lambda, serverless computing, Lambda packages and deployment
    • ECS clusters, task definitions, services, ECS Fargate
    • CloudFormation templates, stack creation and updates
    • Creation of VPCs, subnets, route tables, NACLs, NAT gateways, internet gateways, VPC endpoints and peering
    • RDS cluster and instance creation, schema deployment into RDS
    • CloudWatch metrics, events, rules and alarms
    • SNS notifications, topics and subscriptions
    • SQS queue creation and trigger policies
    • Auto Scaling with simple and step scaling policies
    • Creation of Elastic, Application and Network Load Balancers and end-to-end secure communication with SSL certificates
    • CodePipeline, CodeCommit, CodeBuild, CodeDeploy
    Azure cloud skill set:
    • Azure Active Directory (AAD) management, including the creation of users, groups, roles, conditional access policies, and application registrations
    • Virtual Machine (VM) deployment, Azure Bastion configuration, and Network Security Groups (NSGs)
    • Azure Storage, including Blob storage, Blob storage access control, and storage account configurations
    • Azure Functions for serverless computing, function packages, and deployment strategies
    • Azure Container Service (ACS) and Azure Kubernetes Service (AKS) for container orchestration, including cluster management, pod deployment, and service configurations
    • Azure Resource Manager (ARM) templates for infrastructure as code, creating and updating resource stacks
    • Virtual Network (VNet) setup, subnet creation, route tables, Network Security Groups (NSGs), NAT gateways, Azure Bastion, VNet peering, and ExpressRoute
    • Azure SQL Database creation, management of database clusters and instances, and schema deployment
    • Azure Monitor for monitoring metrics, configuring alert rules, and creating action groups
    • Azure Service Bus for messaging, including topics, subscriptions, and queue creation
    • Azure Autoscale for automatic scaling based on defined policies
    • Azure Load Balancer configurations for both internal and external load balancing, including SSL certificate setup for secure communication
    • Azure DevOps for continuous integration and continuous deployment (CI/CD), utilizing Azure Repos, Azure Pipelines, and Azure Artifacts
    DevOps tool proficiencies:
    • Infrastructure automation: Terraform, ARM, CloudFormation, Serverless Framework
    • CI/CD implementation: Terraform Cloud, Jenkins, CodePipeline, GitLab, CircleCI
    • Containerization & orchestration: Docker, ECS, Kubernetes
    • Monitoring & logging: CloudWatch, Datadog, Log Analytics, Application Insights
    • Scripting & automation: Python, Bash
    Agile methodologies and tools:
    • Git
    • Scrum
    • Jira / Kanban
    • Slack
    (An illustrative AWS infrastructure-provisioning sketch follows this profile's skill tags.)
    Apache Spark
    Azure DevOps
    Vault by HashiCorp
    DevOps
    AWS Development
    Prometheus
    Terraform
    Apache Airflow
    Kubernetes
    CI/CD
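The profile above describes provisioning AWS infrastructure as code through CI/CD pipelines. As a hedged illustration only, the sketch below uses boto3 to create a CloudFormation stack from a template and wait for completion; the stack name, template file, parameter and region are hypothetical placeholders, not this freelancer's setup.

```python
# Minimal illustrative sketch: provision infrastructure from a CloudFormation
# template with boto3. Stack name, template path, parameter and region are
# hypothetical placeholders.
import boto3

cfn = boto3.client("cloudformation", region_name="ap-south-1")

with open("template.yaml") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="demo-network-stack",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # required if the template creates IAM resources
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "dev"}],
)

# Block until the stack has finished creating
cfn.get_waiter("stack_create_complete").wait(StackName="demo-network-stack")
print("Stack created")
```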
  • $45 hourly
    Are you looking for a cooperative, happy, hard-working and reliable contractor on Upwork? If yes, then you are in the right place. I have hands-on experience with the latest internet marketing techniques, social media optimization and online reputation management (ORM). I am always ready to provide ethical and dedicated services to my clients, and I am seeking opportunities to enhance your business profits by applying the following online marketing skills:
    • SEO (on-page)
    • Link building
    • Google Keyword Planner
    • Website SEO audit
    • Google organic ranking
    • Keyword analysis
    • White-hat SEO
    • Landing page analysis
    • Social media optimization
    • Google Business listing setup and promotion
    • Setting up Google Analytics & Webmaster Tools
    • Adding products and descriptions
    • Competition research & analysis
    Happy to assist you.
    Apache Spark
    SEO Keyword Research
    Mathematical Modeling
    WordPress
    SEO Backlinking
    Data Analysis
    Visual Basic for Applications
    Statistical Computing
    SEO Audit
    Flask
    Data Science
    Machine Learning
    Data Mining
    pandas
    Python
    Chatbot
  • $5 hourly
    A thinker, problem solver and developer with skills and expertise in software development (mainly HTML, CSS, JavaScript, Angular, Node.js and .NET) and data engineering (with Python, Spark, AWS, GCP, Snowflake, dbt and Fivetran).
    Apache Spark
    Node.js
    HTML
    CSS 3
    Amazon Web Services
    Software Development
    CSS
    Angular
    C++
    JavaScript
    ETL
    Python
  • $20 hourly
    Presently a Data Engineer with more than 4 years of hands-on experience building data pipeline solutions for complex business problems involving large-scale data processing and real-time analytics. Certified Google Cloud Data Engineer with a demonstrated history of working in the information technology and services industry. Skilled in data engineering, Hadoop, Sqoop, Hive, HBase, Spark, GCP (Dataflow, BigQuery, Pub/Sub, Cloud Storage), Python, Scala, SQL and Databricks. Involved in designing and developing big data systems that crunch large volumes of sales data into insights that drive decision-making, using different tools and technologies. I would welcome the opportunity to interview and discuss further, and I look forward to hearing from you! (An illustrative Dataflow/Beam streaming sketch follows this profile's skill tags.)
    Apache Spark
    Databricks Platform
    BigQuery
    Google Dataflow
    Apache Beam
    Scala
    Sqoop
    Hive
    Apache Hadoop
    Google Cloud Platform
    SQL
    Java
    Data Engineering
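The profile above lists GCP Dataflow, Pub/Sub and BigQuery. As a rough sketch only, here is a minimal Apache Beam streaming pipeline (runnable on Dataflow) that reads JSON events from a Pub/Sub subscription and appends them to an existing BigQuery table; the project, subscription and table names are hypothetical placeholders.

```python
# Minimal illustrative Beam streaming pipeline: Pub/Sub JSON -> BigQuery.
# Project, subscription and table names are hypothetical; the BigQuery table
# is assumed to already exist with a matching schema.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/sales-events")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.sales_events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )
```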
  • $15 hourly
    Profile summary: ● Working as a Senior Software Developer at Walmart, with 10+ years of overall experience in the IT industry, including 5+ years of extensive experience in big data development. ● Experience in the Hadoop ecosystem: HDFS, Spark, Scala, Hive, Sqoop, Spark SQL, etc. ● Excellent team player; a quick, self-directed learner with a detail-oriented personality and excellent communication skills. ● Experience in Spark and the Scala programming language, with good knowledge of the Spark architecture. ● Experience with CI/CD and job orchestration using Jenkins, Artifactory and Airflow; worked on Python and Bash scripts for Airflow. ● Managed data ingestion from RDBMS to the Hadoop file system using Sqoop. ● Used partitioning, bucketing, etc. in Hive for better performance. ● Worked on NoSQL databases such as Elasticsearch (ELK stack) and HBase. ● Worked on Spark streaming using the Spark SQL engine. ● Worked on a POC for Apache Kafka with Spark streaming. (An illustrative Kafka-to-Spark streaming sketch follows this profile's skill tags.)
    Apache Spark
    Tableau
    Apache Kafka
    Kibana
    Elasticsearch
    Java
    Google Cloud Platform
    HDFS
    Apache Hadoop
    Microsoft Azure SQL Database
    Hive
    Scala
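The profile above mentions Spark streaming with the Spark SQL engine and a Kafka POC. As an illustrative sketch only, the snippet below shows a minimal Structured Streaming job that reads a Kafka topic and writes date-partitioned Parquet; it assumes the spark-sql-kafka connector package is available, and the broker, topic and paths are hypothetical placeholders.

```python
# Minimal illustrative Spark Structured Streaming job: Kafka -> partitioned Parquet.
# Broker, topic and paths are hypothetical; requires the spark-sql-kafka package.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Read the Kafka topic as a streaming DataFrame and derive a date column
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")
         .option("subscribe", "clickstream")
         .load()
         .selectExpr("CAST(value AS STRING) AS value", "timestamp")
         .withColumn("event_date", F.to_date("timestamp"))
)

# Continuously append date-partitioned Parquet files with checkpointing
query = (
    events.writeStream.format("parquet")
          .option("path", "/data/landing/clickstream")
          .option("checkpointLocation", "/data/checkpoints/clickstream")
          .partitionBy("event_date")
          .start()
)
query.awaitTermination()
```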
  • $20 hourly
    Professional summary: Dedicated and versatile professional with expertise in natural language processing (NLP), machine learning, and software development. Skilled in building and deploying AI-powered solutions to solve complex business problems. Known for a proactive approach to problem-solving and a strong commitment to delivering high-quality results. Key skills: natural language processing (NLP), machine learning, deep learning, software development, Python, TensorFlow, PyTorch, data analysis and visualization, cloud computing (AWS, Azure, Google Cloud), and agile methodologies.
    Apache Spark
    ELK Stack
    Natural Language Processing
    Deep Learning
    Machine Learning
    Data Science
    Docker
    Apache Kafka
    Python
  • $48 hourly
    Expert in Apache Spark, Apache Spark MLlib, Scala, Apache Hadoop, Python, Java, PySpark, Hive, Apache Kafka, SQL, FPGA and big data.
    Apache Spark
    Big Data
    FPGA
    SQL
    Apache Kafka
    Hive
    PySpark
    Java
    Python
    Apache Hadoop
    Scala
    Apache Spark MLlib
  • $5 hourly
    Software Developer. I'm a software engineer with 3 years of professional experience in backend engineering and working experience on the frontend. Proficient in problem solving with Java/J2EE, with foundational skills in data engineering and cloud services. Apart from this, I like to accept challenges and am eager to explore technical topics. Personal projects: ● Architected a job-portal application using a traditional Java EE approach, with JSP for the UI, MySQL for the database and a Java EE backend on a Tomcat server, enabling job seekers to apply for jobs and providers to create company accounts and post job listings. ● Developed a student management web app to manage student metadata, using React, MUI and TypeScript for the UI and a Spring Boot API secured with Spring Security. ● Developed a Tic-Tac-Toe game app using React with MUI, showcasing frontend development proficiency.
    Apache Spark
    Product Development
    Software
    Full-Stack Development
    Cloud Application
    Real Time Stream Processing
    Chat & Messaging Software
    MySQL
    Spring Boot
    Java
    React
  • $10 hourly
    As a seasoned Data Engineer, I specialize in crafting diverse big data solutions tailored to medium and large-scale operations. My expertise spans across various domains, including Sales/Booking, Inventory management, and Clickstream analysis. Additionally, I possess hands-on proficiency in developing comprehensive employee tracking systems leveraging CRM data. Furthermore, my skill set extends to encompass a deep understanding of geographical data and its applications in businesses, along with a knack for integrating brands post-acquisitions.
    Apache Spark
    Apache Hive
    SQL
    Data Warehousing & ETL Software
    Data Analytics & Visualization Software
    API Development
    PySpark
    Python
    Java
    Apache Airflow
    Data Engineering
    ETL
    Big Data
  • $17 hourly
    Working in the field of data engineering, I have 5 years of total experience. My tech stack includes Python, Azure cloud, Databricks, Apache Spark, big data, pandas, SQL and Excel.
    Apache Spark
    Microsoft Power BI
    Git
    Excel Formula
    Azure App Service
    SQL
    Big Data
    Databricks Platform
    Python
  • $20 hourly
    I am a highly skilled and results-driven ML engineer specializing in Natural Language Processing (NLP). With a strong foundation in spaCy and expertise in data engineering, I excel in designing and implementing robust data pipelines. My proficiency extends to DevOps practices, ensuring seamless integration of machine learning models into production environments. Skills and Strengths: NLP Expertise: Proven experience in developing and deploying NLP models using spaCy, achieving accurate and efficient natural language understanding. Data Engineering: Proficient in building end-to-end data pipelines, from data collection and preprocessing to model training and deployment, ensuring a streamlined and scalable workflow. DevOps Integration: Adept at incorporating machine learning models into production systems through effective collaboration with DevOps teams, ensuring smooth deployment and maintenance. Problem Solving: Demonstrated ability to analyze complex problems, devise innovative solutions, and implement them effectively to meet project objectives. Accomplishments: Implemented NLP Solutions: Successfully developed and deployed NLP solutions for various applications, improving language understanding and enhancing user experiences. Optimized Data Pipelines: Streamlined data pipelines, resulting in improved efficiency, reduced latency, and enhanced overall system performance. Collaborative DevOps: Worked closely with DevOps teams to integrate machine learning models into CI/CD pipelines, ensuring rapid and reliable deployment. Named Entity Recognition (NER) System: Developed a robust NER system using spaCy, achieving state-of-the-art results in entity extraction. In summary, I bring a unique blend of NLP expertise, data engineering skills, and DevOps integration experience. My proven track record in delivering successful projects and my commitment to staying at the forefront of industry advancements make me a valuable asset for any team or project. (An illustrative spaCy NER sketch follows this profile's skill tags.)
    Apache Spark
    Magento
    MySQL Programming
    Linux
    Apache Airflow
    Kibana
    Machine Learning
    MLflow
    Natural Language Processing
    Apache NiFi
    Kubernetes
    Docker
    Redis
    Azure DevOps
    Git
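The profile above highlights named-entity recognition with spaCy. As a minimal, hedged sketch only, the snippet below runs spaCy's small English model over a made-up sentence and prints its entities; the model must be downloaded separately with `python -m spacy download en_core_web_sm`.

```python
# Minimal illustrative spaCy NER example; the sample sentence is made up.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Acme Corp hired three data engineers in Delhi in March 2024.")

# Print each detected entity with its label
for ent in doc.ents:
    print(ent.text, ent.label_)
# Expected output along the lines of:
#   Acme Corp ORG
#   three CARDINAL
#   Delhi GPE
#   March 2024 DATE
```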
  • $30 hourly
    I am a developer with 9+ years of experience in data, software and DevOps engineering. I know many technologies that help with ETL operations. I have worked closely with the Microsoft and Databricks teams. (An illustrative Airflow orchestration sketch follows this profile's skill tags.)
    Apache Spark
    Terraform
    Apache Airflow
    Java
    Python
    Scala
    SQL Programming
    Excel Formula
    Jenkins
    Hive
    Databricks Platform
    PySpark
    Software
    DevOps
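This profile tags Apache Airflow, Databricks and PySpark among its skills. Purely as an illustrative sketch of that kind of ETL orchestration (not this freelancer's code), the snippet below defines a minimal Airflow 2.x DAG that runs a nightly placeholder task; the DAG id, schedule and task logic are hypothetical.

```python
# Minimal illustrative Airflow 2.x DAG: one nightly placeholder ETL task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_etl():
    # Placeholder for e.g. a PySpark or Databricks job submission
    print("running nightly ETL")


with DAG(
    dag_id="nightly_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 02:00 every night
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_etl)
```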
  • $3 hourly
    Jatin Jindal (Data Engineer at Times Internet). An enthusiastic and hard-working individual with strong data engineering skills and more than 4.5 years of experience in the field.
    Apache Spark
    Hive
    Python Script
    Apache Kafka
    Golang
    Apache Hadoop
    Python
    Elasticsearch

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Spark Engineer near Delhi, IN on Upwork?

You can hire an Apache Spark Engineer near Delhi, IN on Upwork in four simple steps:

  • Create a job post tailored to your Apache Spark Engineer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Spark Engineer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Spark Engineer profiles and interview.
  • Hire the right Apache Spark Engineer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Spark Engineer?

Rates charged by Apache Spark Engineers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Spark Engineer near Delhi, IN on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Spark Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Spark Engineer team you need to succeed.

Can I hire an Apache Spark Engineer near Delhi, IN within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Spark Engineer proposals within 24 hours of posting a job description.