Hire the best PySpark Developers in New Delhi, IN

Check out PySpark Developers in New Delhi, IN with the skills you need for your next job.
  • $60 hourly
    Professional Solution Architect (Java & J2EE) with 20+ years of development and software project management experience. Expertise in data science and machine learning (Python). I'm a dedicated, full-time freelancer.
    Technology stack:
    • Analytics: Weka 3.8.x, Pandas, PySpark, NLTK, PyTorch, NumPy, SciPy, scikit-learn
    • Development: Java / J2EE / Python
    • Java EE web APIs: Angular, JSF, JSP, Servlet
    • Java EE resource APIs: JDBC, JMS, JAX-WS, JTA
    • Messaging: ActiveMQ, RabbitMQ, AWS SQS
    • ORM: Hibernate
    • Frameworks: Spring, Spring Boot, Struts, Django
    • Application servers: Tomcat, JBoss, WebLogic, WAS
    • Databases: Oracle 11g, MySQL, MS SQL Server, Amazon DynamoDB, MongoDB
    • Testing tools: Apache JMeter, Sahi Pro, Selenium
    • Build tools: Jenkins, Maven
    • Version control: SVN, IBM ClearCase
    • Operating systems: Linux, Windows
    • Languages: Java, Python, C/C++, VB
    • IDEs: Eclipse, NetBeans, JDeveloper, Visual Studio, PyCharm, IntelliJ IDEA
    • Cloud services: AWS, including EC2, Lambda, S3, Glacier, RDS, DynamoDB, Redshift, Athena, EMR, Glue, IAM, Cognito, SQS, SNS, SES, STS, Route 53, API Gateway, SageMaker
    Specializations: machine learning, application cloud migration, serverless application architecture, AWS cost optimization.
    Hibernate
    PySpark
    Web Application
    Angular
    AWS Lambda
    Docker
    MySQL
    Spring Framework
    Amazon Web Services
    Spring Boot
    Python
    Weka
    Machine Learning
    Java
  • $35 hourly
    Data Engineering & Expert Technology Consultant
    ✔️ Data Engineering: Airflow, PySpark, Redis, Jenkins, SQL, Kafka
    ✔️ Machine Learning: data analysis (segmentation, regression), ensemble learning, NLP (spaCy, NLTK), clustering
    ✔️ Data Scraping: Selenium, BS4
    ✔️ Languages & Databases: Python, Scala, SQL, MongoDB
    ✔️ Microservices: Flask, Django
    ✔️ AWS: SNS, S3, Athena, CloudWatch, Glue
    I have a history of building scalable software systems, end-to-end machine learning solutions, and data analysis architectures, and of mentoring on a range of technologies. You are just a click away from a success story created for your business.
    Natural Language Processing
    Microsoft Excel
    Apache Kafka
    ETL Pipeline
    AWS Lambda
    Redis
    Apache NiFi
    Apache Airflow
    MongoDB
    SQL
    PySpark
    Software Development
    Web Scraper
    Python
  • $47 hourly
    Data Engineer with 4+ years of experience in interpreting and analyzing data to drive business solutions. Proficient in statistics, mathematics, and analytics, with an excellent understanding of business operations and of analytics tools for effective data analysis.
    Development skills:
    • Languages: Python, JavaScript
    • Databases: SQLite, Postgres, MongoDB
    • Web frameworks: Django, Flask, Bottle
    • Operating systems: Linux, Windows
    • Libraries: Pandas, NumPy, Plotly, Matplotlib, Selenium, BeautifulSoup
    • Cloud platforms: AWS, GCP, Heroku
    • Caching: Redis
    Projects:
    • User Access Management (Developer; Python, Django, Postgres): Built access control for a financial reporting tool, governing who can view, edit, or update each resource. All permissions were managed through this application.
    • Financial Report Builder (Developer; Python, Django, Postgres, Celery, Redis): Built a system that generates financial statements such as balance sheets, profit and loss statements, and cash flow statements. Users upload raw data in Excel or CSV format; the system analyzes the data and generates financial reports and dashboards for each client.
    Source code management: GitLab, GitHub, Bitbucket
    Project management: Asana, JIRA
    Development tools: PyCharm, Jupyter Notebook, VS Code
    Deployment: Nginx, Docker, Apache, WSGI, Gunicorn
    I love converting a client's vision into reality. I like brainstorming sessions because they help me understand requirements clearly and speed up development. Once the requirements are clear, the next steps are careful planning and solid execution. My goal is 100% client satisfaction throughout the project, and I'm looking for work that will challenge my skills and grow my knowledge.
    Machine Learning Model
    Data Science
    Docker
    PySpark
    Analytics
    Flask
    Django
    NumPy
    pandas
    Python
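As a rough illustration of the report-builder workflow described in the profile above (raw CSV in, aggregated financial statements out), here is a minimal stdlib-Python sketch. The column names and figures are invented for illustration, and the real project used Django, Celery, and Postgres rather than this simplified form:

```python
# Minimal sketch of the report-builder pattern: ingest raw CSV rows,
# then aggregate them into a simple profit-and-loss summary.
# Column names and amounts are hypothetical.
import csv
import io

RAW = """category,amount
revenue,1000
expense,400
revenue,250
expense,100
"""

def profit_and_loss(csv_text: str) -> dict:
    """Aggregate raw rows into revenue, expense, and net profit."""
    totals = {"revenue": 0.0, "expense": 0.0}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["category"]] += float(row["amount"])
    totals["net_profit"] = totals["revenue"] - totals["expense"]
    return totals

summary = profit_and_loss(RAW)
```

A production version would read uploaded Excel/CSV files and render dashboards, but the core transform is the same aggregation step.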
  • $25 hourly
    • Worked extensively (9+ years) on big data technologies, implementing Spark pipelines with Apache Spark, Structured Streaming, Kafka, HBase, Hadoop, and Hive using the Python and Java APIs
    • Good experience implementing code on the Java stack and in web services/API development: Core Java 1.7, Java 8, Collections, multithreading, Spring Framework (Core, Boot, MVC, Security, Spring Batch), Hibernate, JAX-RS/WS
    • Mostly worked with Agile methodology, following sprint-based releases and a TDD approach
    • Good hands-on experience with Oracle DB, SQL, and UNIX basics, along with industry best practices using Jenkins, Sonar, Git, and JIRA
    Apache HBase
    PySpark
    Apache NiFi
    Apache Airflow
    SQL
    Spring Boot
    Apache Hive
    RESTful API
    Apache Hadoop
    Apache Kafka
    Java
    Apache Spark
    Big Data
    Data Engineering
    Python
  • $15 hourly
    I am a highly skilled data engineer with over two years of professional experience and a background in computer science. I have a strong foundation in programming languages such as Python, which I have been using for the past six years. I have completed a number of successful projects using technologies like AWS, Azure, Airflow, and Databricks, and have experience working on ETL development and data pipeline management. In addition, I have completed internships in machine learning and hold a Microsoft certification as an Azure Data Scientist Associate. I am eager to continue my career as a freelance data engineer and am confident that my skills and experience will be an asset to any team. Please do not hesitate to contact me with any questions or assignments to test my abilities. I look forward to the opportunity to work with you.
    PySpark
    Data Visualization
    Web Scraping
    Data Analysis
    NumPy
    Microsoft Power BI
    Microsoft Excel
    Data Science
    pandas
    SQL
    Python
    Tableau
    Machine Learning
  • $15 hourly
    I am a Business Intelligence Developer with experience building data warehouses, ETL pipelines, and data visualizations for a banking client. I am experienced in Microsoft Azure, SQL, Python, Power BI, and Azure DevOps. I also have relevant research experience in machine learning.
    Machine Learning
    SQL Programming
    PySpark
    Microsoft Azure
    Microsoft Power BI
    DevOps
    Microsoft Power Automate
    Data Warehousing
    Azure DevOps
    Apache Spark
    SQL
    Python
    ETL Pipeline
    ETL
  • $30 hourly
    With over 4 years of IT experience, I bring a robust skill set to the table, with proficiency in Python, PySpark, and Spark SQL and a solid understanding of DevOps practices. I hold multiple certifications, including Databricks Data Engineer Professional and Microsoft DP-203, DP-900, AZ-900, and AI-900, validating my expertise as a data engineer with over 3 years of hands-on experience on the Microsoft Azure cloud platform. Furthermore, my certification as a SAFe 5 Agile Practitioner underscores my commitment to agile methodologies and practices.
    Code Review
    Jira
    Databricks Platform
    Data Cleaning
    PySpark
    PostgreSQL
    Microsoft Azure
    ETL
    Azure DevOps
    Python
    ETL Pipeline
    Apache Spark
  • $10 hourly
    Full-stack Data Engineer/Analyst: end-to-end data gathering and analysis. An experimenter and fast learner with the urge to master things with full dedication. Keen to build things that make processes easier while maintaining good quality. A team leader whose goal is to extract the best out of everyone.
    pandas
    Amazon S3
    AWS Lambda
    Analytics Dashboard
    React
    Docker
    Linux
    SQL
    Web API
    Metabase
    Amazon Redshift
    AWS Glue
    ETL
    PySpark
    Python Script
  • $30 hourly
    I am a Data Analyst by choice, with expertise in many kinds of analytics work. I can manage projects end to end, and I understand the Azure and AWS cloud environments.
    Project Management
    Project Delivery
    Flask
    Seaborn
    Matplotlib
    pandas
    NumPy
    Python
    PySpark
    Microsoft Power BI
    Data Scraping
  • $25 hourly
    Experienced Data Engineer and AWS Certified Solutions Architect with 6+ years of experience serving 5+ Fortune 500 clients. I am proficient in:
    1. SQL
    2. Data Analytics
    3. Python
    4. PySpark
    5. AWS
    6. Cloud Ops
    7. Scala
    8. Apache Airflow
    Apache Spark
    Apache Hadoop
    PySpark
    Azure
    Amazon Athena
    Excel VBA
    Data Visualization
    Microsoft Power BI
    Tableau
    SQL
    Data Analysis
    AWS Glue
    ETL Pipeline
    Python
    Machine Learning
  • $4 hourly
    I am a Data Engineer with 4 years of experience in data extraction, data transformation, and data migration.
    Big Data
    Hive
    SQL
    PySpark
    Databricks Platform
    Microsoft Azure
    Apache NiFi
    Cloudera
    Apache Hadoop
    Python
  • $5 hourly
    I'm an Azure Data Engineer with rich experience in building data platforms and providing end-to-end data engineering solutions. I am highly proficient in tools such as Azure Data Factory, Data Flow, Databricks, SSMS, and MySQL. I'm ready to embark on a collaborative journey, delivering bespoke data solutions that empower businesses and elevate data-driven decision-making.
    - Azure Data Engineer with a passion for transforming data into actionable insights.
    - Proficient in designing, implementing, and optimising data pipelines on the Azure platform.
    - Expertise in building scalable and high-performance data solutions using Azure services.
    - Skilled in data integration, ETL processes, and data warehousing.
    - Experienced in architecting data lakes, data warehouses, and real-time analytics solutions.
    - Adept at ensuring data security, quality, and compliance throughout the data lifecycle.
    - Strong problem-solving skills with a focus on delivering efficient and effective data solutions.
    - Collaborative team player with a drive to empower organisations through data-driven decisions.
    - Continuous learner, keeping up to date with the latest Azure technologies and trends.
    - Excited to contribute to projects that harness the power of Azure to unlock data's full potential.
    PySpark
    Microsoft SQL Server
    Data Warehousing & ETL Software
    Data Ingestion
    Data Extraction
    Data Curation
    Data Cleaning
    Data Analytics
    Databricks Platform
    Microsoft Azure
    Database Design
    ETL Pipeline
    Python
    Data Transformation
    Data Engineering
  • $16 hourly
    Big Data Engineer with a passion for transforming raw data into actionable insights. - Proficient in ETL, data pipelines, machine learning, SQL, Python, Scala, Hadoop, Spark, Kafka, Airflow, AWS, and Power BI. - Ex-DEI @Amazon | Ex-Computer Science Instructor. - Feel free to reach out, whether it's work-related or not :))
    Microsoft Power BI
    Apache Airflow
    Apache Kafka
    PySpark
    Apache Hadoop
    Amazon Web Services
    JavaScript
    CSS
    HTML
    C++
    C
    Java
    Scala
    Python
    SQL

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

Trusted by 5M+ businesses

How do I hire a PySpark Developer near New Delhi on Upwork?

You can hire a PySpark Developer near New Delhi on Upwork in four simple steps:

  • Create a job post tailored to your PySpark Developer project scope. We’ll walk you through the process step by step.
  • Browse top PySpark Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top PySpark Developer profiles and interview them.
  • Hire the right PySpark Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire a PySpark Developer?

Rates charged by PySpark Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire a PySpark Developer near New Delhi on Upwork?

As the world’s work marketplace, we connect highly skilled freelance PySpark Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream PySpark Developer team you need to succeed.

Can I hire a PySpark Developer near New Delhi within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive PySpark Developer proposals within 24 hours of posting a job description.