Hire the best PySpark Developers in Gurgaon, IN
Check out PySpark Developers in Gurgaon, IN with the skills you need for your next job.
- $25 hourly
- 4.9/5
- (7 jobs)
Hello, I’m Aditya Johar, a Data Scientist and Full Stack Developer with 9+ years of experience delivering innovative, tech-driven solutions. I focus on identifying areas where technology can reduce manual tasks, streamline workflows, and optimize resources. By implementing smart automation solutions tailored to your specific needs, I can help your business cut costs, improve efficiency, and free up valuable time for more strategic, growth-focused initiatives.
TOP SOLUTIONS DEVELOPED
✅ Custom Software using Python (Django, Flask, FastAPI), MERN/MEAN/MEVN Stacks
✅ Interactive Data Visualization Dashboards: Power BI, Tableau, ETL, etc.
✅ Intelligent Document Processing (IDP), RAG, LLMs, ChatGPT APIs
✅ NLP: Sentiment Analysis, Text Summarization, Chatbots, and Language Translation
✅ COMPUTER VISION: Image and Video Classification, Object Detection, Face Recognition, Medical Image Analysis
✅ RECOMMENDATION SYSTEMS: Product Recommendations (e.g., e-commerce), Content Recommendations (e.g., streaming services), Personalized Marketing
✅ PREDICTIVE ANALYTICS: Sales and Demand Forecasting, Customer Churn Prediction, Stock Price Prediction, Equipment Maintenance Prediction
✅ E-COMMERCE OPTIMIZATION: Dynamic Pricing, Inventory Management, Customer Lifetime Value Prediction
✅ TIME SERIES ANALYSIS: Financial Market Analysis, Energy Consumption Forecasting, Weather Forecasting
✅ SPEECH RECOGNITION: Virtual Call Center Agents, Voice Assistants (e.g., Siri, Alexa)
✅ AI IN FINANCE: Credit Scoring, Algorithmic Trading, Fraud Prevention
✅ AI IN HR: Candidate Screening, Employee Performance Analysis, Workforce Planning
✅ CONVERSATIONAL AI: Customer Support Chatbots, Virtual Shopping Assistants, Voice Interfaces
✅ AI IN EDUCATION: Personalized Learning Paths, Educational Chatbots, Plagiarism Detection
✅ AI IN MARKETING: Customer Segmentation, Content Personalization, A/B Testing
✅ SUPPLY CHAIN OPTIMIZATION: Demand Forecasting, Inventory Optimization, Route Planning
And many more use cases that we can discuss when we connect. Ready to turn these possibilities into realities? I’m just a click away! Simply click the “Invite to Job” or “Hire Now” button in the top right corner of your screen.
PySpark, Django, Apache Airflow, Apache Hadoop, Terraform, Apache Kafka, Flask, BigQuery, BERT, Apache Spark, Python Scikit-Learn, pandas, Python, TensorFlow, Data Science
- $35 hourly
- 5.0/5
- (14 jobs)
Is your data delivering the insights your business depends on? Curious about the untapped potential within your data? From identifying new revenue streams to optimizing operations, the right data insights can transform decision-making across your business. I deliver solutions that combine technical expertise with a distinctive design sensibility, transforming complex data into visually compelling, user-friendly dashboards that resonate with both technical and non-technical stakeholders. With 3 years of specialized analytics experience across Power BI, Microsoft Fabric, Synapse, PySpark, Databricks, SQL, and Azure cloud architecture, I build end-to-end solutions that reveal the true story of your data. I’ve helped clients with:
- Data visualization and storytelling
- Dashboard development in Power BI and Tableau
- ETL/ELT pipeline development
- Data warehouse design and development
PySpark, BigQuery, Databricks Platform, ETL Pipeline, Microsoft Azure, Data Engineering, SQL, Looker Studio, Business Intelligence, Dashboard, Data Analysis, Data Visualization, Microsoft Power BI
- $25 hourly
- 5.0/5
- (2 jobs)
A highly motivated person with strong technical, problem-solving, and time management skills, I strive to create an impact on the organization I am part of. I love to socialize and experience new things in life, and my hunger for new challenges makes me unique.
PySpark, MySQL, Scala, Microsoft Azure, Data Analytics, Snowflake, SQL Programming, Data Engineering, Data Warehousing & ETL Software, ETL Pipeline, Apache Spark, SQL, Databricks Platform, Python
- $25 hourly
- 5.0/5
- (7 jobs)
My name is Abhinav Gundapaneni and I work at Microsoft as a Software Engineer. I have been with Microsoft for the last 3 years. Over the past few years I have gained valuable skills, highlighted below:
1. Designing and developing ETL/ELT solutions for complex datasets from various clients using data engineering tools.
2. Building scalable, efficient data pipelines that handle large amounts of customer data.
3. Solid understanding of Azure cloud infrastructure.
4. 4+ years of working with web and software applications in Python.
5. Developing PySpark notebooks and applications that handle large datasets and complex requirements.
6. Hands-on experience managing hundreds of SQL databases in production.
7. Designing and developing web applications at scale with Django.
Apart from these skills, I’m a great team player and add value to the team’s growth.
PySpark, ETL, Web Application, Microsoft Azure, Django, SQL, Python
- $20 hourly
- 4.9/5
- (7 jobs)
I am a bioinformatics engineer with a strong background in programming languages such as Python, R, and Bash. I specialize in developing scalable workflows in Nextflow and have extensive experience working with a variety of omics data types, including bulk RNAseq, single cell RNAseq, whole genome sequencing, whole exome sequencing, and GWAS data. My expertise includes:
- Developing pipelines for quality control, alignment, and analysis of large-scale omics data
- Utilizing machine learning and statistical modeling to extract insights from complex datasets
- Implementing reproducible research practices for efficient data management and sharing
- Collaborating with biologists and clinicians to interpret and validate results
I am passionate about using my skills to drive innovation and discovery in the field of bioinformatics. I am dedicated to delivering high-quality work and ensuring client satisfaction. If you have any projects or opportunities that may benefit from my expertise, please don’t hesitate to contact me. I would be happy to discuss your needs in detail and provide a tailored solution.
PySpark, Data Analytics, Bioinformatics, Data Mining, Machine Learning Model, Kubernetes, AWS CodeDeploy, Data Annotation, AWS CodePipeline, Docker, Python, SQL
- $35 hourly
- 5.0/5
- (1 job)
Hello there! I’m a seasoned software developer with a passion for crafting innovative solutions. Here are some of my skills:
1. Seasoned software developer specializing in Python development
2. Proficient in AWS and Azure cloud platforms for building robust and scalable systems
3. Experienced in utilizing GraphDB technologies, particularly Neo4j, for efficient data modeling
4. Skilled in PySpark for high-performance data processing in data-intensive environments
5. Enthusiastic about integrating machine learning algorithms into software solutions
6. Knowledgeable in machine learning techniques and algorithms for building intelligent software solutions
Committed to turning ideas into impactful applications through collaborative efforts.
PySpark, API Development, Scala, Neo4j, AWS Lambda, Python, Amazon Web Services
- $25 hourly
- 4.8/5
- (8 jobs)
Experienced Senior Software Developer with over 13 years of expertise in designing, developing, and deploying scalable software solutions. Experienced Technical Architect | Driving Innovation in IT Solutions | Expert in System Design and Integration | Expert in Scalability and Performance Optimization.
• Backend Development: Expertise in Python with frameworks like Django and FastAPI to build scalable web applications.
• Big Data Processing: Proficient in PySpark for handling large-scale data processing and analytics.
• Search Technologies: Experience in integrating and optimizing search platforms like Elasticsearch and Apache Solr.
• Go Development: Strong skills in Go (Golang) for high-performance backend services.
• Real-time Data Streaming: Hands-on experience with Apache Kafka for distributed event streaming and data pipelines.
• Full-Stack Development: Skilled in combining PHP/Python with HTML, CSS, and React.js to deliver seamless, interactive user interfaces and efficient server-side logic.
• Efficient APIs: Skilled in designing and building RESTful APIs with a focus on performance and security.
• End-to-End Solutions: Capable of providing end-to-end development, from backend logic to search integration.
• Database Management: Strong experience with SQL and NoSQL databases, including PostgreSQL, MySQL, MongoDB, and Redis.
• Performance Optimization: Focus on delivering high-performing, scalable applications tailored to client needs.
• Expert in Linux Server: Proficient in configuring, maintaining, and optimizing Linux-based environments (Ubuntu, CentOS, Red Hat).
• AWS Cloud Services: Experience with EC2, S3, RDS, Lambda, VPC, IAM, and other AWS services for deployment, scaling, and security management.
• Microsoft Azure Services: Skilled in Azure Virtual Machines, Azure Storage, Virtual Networks, and resource management with Azure CLI & PowerShell.
• CI/CD Pipeline: Experience integrating Linux servers with Jenkins, GitLab CI, CodeDeploy, GitHub, and Docker for automated deployments on cloud platforms.
PySpark, Golang, FastAPI, Elasticsearch, CodeIgniter, Django, PHP, MySQL, Apache Solr, Apache Spark, Laravel, API, Apache Airflow, Python, Data Scraping
- $3 hourly
- 5.0/5
- (1 job)
I have been working as a data engineer for nearly 3 years. I have experience with Python, SQL, PySpark, and ETL, along with AWS services such as Glue, EMR, Redshift, S3, and RDS, and some Snowflake experience as well.
PySpark, Selenium WebDriver, Linux, Amazon EC2, Amazon Athena, AWS Lambda, Amazon S3, Amazon Redshift, AWS Glue, Apache Spark, SQL, Python
- $22 hourly
- 5.0/5
- (3 jobs)
Around 8 years of experience in ETL with Informatica PowerCenter/IICS. Completed an end-to-end implementation project and worked on enhancement and support projects using Informatica PC/IICS. Used Scrum Agile project management methodologies and automated manual tasks using Informatica PC, UNIX, and VBA Macros. Strong in designing relational databases and handling complex SQL queries in SQL Server, Oracle, and SQL. Experience in big data technologies such as Hadoop, Hive, Pig, and Sqoop. Experience in deep learning neural networks and natural language processing in Python and R. Worked in an AWS environment for development and deployment of custom ETL mappings.
PySpark, Data Scraping, Data Mining, EasyVista, Data Warehousing, Informatica, Data Analysis, Microsoft SQL Server, Machine Learning, Python
- $10 hourly
- 0.0/5
- (1 job)
With 3 years of experience in driving business insights and tackling complex data challenges, I specialize in delivering data-driven solutions to optimize business growth and mitigate risks. My hands-on experience includes working with top-tier companies like VISA, US Bank, and EXL in their risk and fraud analytics teams.
Key Skills:
- Data Analysis & Visualization: Excel, Tableau
- Programming & Databases: Python, SQL, Hive, Hadoop, Presto, PySpark, Pandas, NumPy
- Deep understanding of statistical methods and applications
Why Work With Me:
- Proven track record of delivering actionable insights that drive business growth.
- Experience in handling large datasets and complex analytical projects.
- Commitment to meeting deadlines and exceeding client expectations.
I am passionate about transforming data into strategic assets for businesses. Let’s discuss how I can help you achieve your data goals.
Additional note: affordable, high-quality work. Money earned from Upwork will be used to fund my travel adventures. My rates may be lower because purchasing power parity is lower in India; expect the same quality of work at a third of the price when it is outsourced from India.
PySpark, Hive, Apache Hadoop, Data Engineering, Data Extraction, Data Processing, Business Intelligence, Tableau, Advanced Analytics, Microsoft Excel, Python, Data Analysis Consultation, SQL, Data Analysis, Data Structures
- $10 hourly
- 0.0/5
- (0 jobs)
I’m a developer experienced in building REST APIs for small and medium-sized projects. Whether you need microservices or multilevel APIs, I can help.
1. Knows Python, Flask, MongoDB, MySQL
2. Implements complete API architecture from scratch.
3. Regular communication is important to me, so let’s keep in touch.
PySpark, Data Analytics, Microsoft Excel, Flask, MySQL, MongoDB, Python, API Development, Database, Atlassian Confluence
- $15 hourly
- 0.0/5
- (0 jobs)
A mixture of geek and economist runs through my personality. This results in an insatiable thirst to understand new technologies at the bit-and-byte level and to relate them to current market developments. I always prefer a discussion about the sociological consequences of artificial intelligence to an evening at a bar, which has not impeded the development of my outstanding interpersonal skills.
PySpark, Databricks MLflow, SQL, Big Data, Hive
- $50 hourly
- 0.0/5
- (0 jobs)
I am a Data Engineer working full-time with 2 years of experience, looking for freelancing opportunities to sharpen my skills in Power BI.
PySpark, Apache Hadoop, Hive, Snowflake, SQL, Python, Microsoft Power BI Data Visualization, Data Engineering, Data Analysis
- $29 hourly
- 4.5/5
- (101 jobs)
Come to me if others are not able to scrape it! Get data from ANY website despite HEAVY anti-bot protection. I am an expert in website crawling and scraping, as well as mobile app scraping, be it an Android app or an iOS app!
PySpark, Data Extraction, Big Data, Data Scraping, API Integration, Web Crawling, Apache Airflow, Data Visualization, Streamlit, Databricks Platform, Scrapy, Scala, pandas, Python, Machine Learning
- $10 hourly
- 0.0/5
- (0 jobs)
I am a Machine Learning Engineer with 3 years of experience and a Master’s degree from IIT Kanpur. I have deep expertise in Machine Learning, Deep Learning, fine-tuning LLMs, Vision-Language Models (VLMs), and building applications on top of them. I have also worked on data engineering projects, where I developed ETL pipelines and wrote Spark jobs and SQL queries. Currently, I am working as a Senior ML Engineer at one of the leading electronics manufacturing companies. My role involves designing architectures to handle big data, preprocessing it, developing ML models on wireless communication data, interpreting model performance, and deploying the models on physical devices. Previously, I spent 2 years at a startup, where I took ownership of major projects, independently developed multiple POCs, interacted with clients, and designed architectures to address their problems. In short, I’m your go-to person if you need help with:
- Optimizing workflows
- Developing, fine-tuning, or quantizing LLMs for specific use cases
- Explaining ML model performance
- Building ML applications for electronics device-generated data
- Solving big data challenges and transitioning from Python to Spark/Dask
PySpark, Graph Neural Network, YOLO, Data Extraction, Text Summarization, Python, SQL, Retrieval Augmented Generation, LangChain, Graph Database, Computer Vision, Data Visualization, Natural Language Processing, Data Analysis, Machine Learning
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.
How do I hire a PySpark Developer near Gurgaon on Upwork?
You can hire a PySpark Developer near Gurgaon on Upwork in four simple steps:
- Create a job post tailored to your PySpark Developer project scope. We’ll walk you through the process step by step.
- Browse top PySpark Developer talent on Upwork and invite them to your project.
- Once the proposals start flowing in, create a shortlist of top PySpark Developer profiles and interview them.
- Hire the right PySpark Developer for your project from Upwork, the world’s largest work marketplace.
At Upwork, we believe talent staffing should be easy.
How much does it cost to hire a PySpark Developer?
Rates charged by PySpark Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.
Why hire a PySpark Developer near Gurgaon on Upwork?
As the world’s work marketplace, we connect highly skilled freelance PySpark Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream PySpark Developer team you need to succeed.
Can I hire a PySpark Developer near Gurgaon within 24 hours on Upwork?
Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive PySpark Developer proposals within 24 hours of posting a job description.