Hire the best PySpark Developers in Noida, IN
Check out PySpark Developers in Noida, IN with the skills you need for your next job.
- $25 hourly
- 5.0/5
- (17 jobs)
Full Stack Developer | Python | GoLang | Web Scraping | React | Data Analytics | API Integration

Hello! I am a Full Stack Developer with over 7 years of experience in product development, specializing in Django, Flask, FastAPI, Go, Gin, React, Postgres, PySpark, and Pandas.

Technical specializations:
- Data analytics: Python, Pandas, NumPy, PySpark
- Full-stack web development: Django/Flask, REST/React, Postgres/MongoDB, Go/Gin
- Charts: D3.js, Highcharts.js, Chart.js, Google Charts; interactive React dashboards
- Product architecture design and microservice architecture: deep knowledge and implementation
- Third-party API integration: Google, Facebook, and other APIs
- Web scraping: expertise in extracting and processing data
- Application deployment: comprehensive experience with GCP, AWS, Heroku, Docker, and Kubernetes; designing scalable solutions

📌 Services I Offer 📌
⚙️ Custom Web Development: full-stack web applications using Django, Flask, React, GoLang, and Gin
⚙️ Data Analytics & Visualization: advanced analytics with Python, Pandas, and PySpark; interactive charts and dashboards
⚙️ API Development & Integration: creating and integrating RESTful APIs; working with third-party APIs such as Facebook, Instagram, Google, and government websites
⚙️ Web Scraping & Data Extraction: extracting data from various sources, handling dynamic websites
⚙️ Real-Time Applications: developing chatbots, chatrooms, and real-time dashboards
⚙️ Product Architecture Design: designing scalable solutions and microservice architectures
⚙️ Application Deployment: deploying applications on GCP, AWS, and Heroku; managing CI/CD processes with Docker and Kubernetes
⚙️ Big Data Processing: using Apache Spark, Kafka, and Hive for large-scale data processing

I have experience working with both startups and multinational companies. I thrive on challenging projects and continuous learning, and I am committed to honesty, transparency, and discipline. I adapt to change easily and always deliver work before the project deadline, ensuring complete client satisfaction.
Google Charts, D3.js, Web Development, Golang, Web Crawling, Websockets, ETL Pipeline, pandas, React, Data Scraping, PySpark, Django, Flask, Google Sheets, Python
- $30 hourly
- 4.7/5
- (176 jobs)
🏆 Awards winner ✅ 60K+ Upwork hours ✅ 300+ projects completed ✅ Top Rated Plus certified ✅ Proficient English communication ✅ NDA for each project

As a seasoned Senior Technology Consultant, I bring extensive expertise in guiding organizations through complex solutions with precision and proficiency. With a proven track record of success, I specialize in providing strategic guidance and technical leadership to ensure the delivery of high-quality solutions that align with business objectives.

I possess a diverse skill set and expertise that extend across the following areas:
- Building robust data pipelines, optimizing databases, and ensuring seamless data flow across systems; experienced in a variety of tools and technologies for processing, managing, and analyzing large datasets efficiently.
- As an AI/ML enthusiast, I develop intelligent solutions that drive innovation and automation.
- Experienced product engineer and developer with a proven track record of delivering innovative solutions from concept to launch.
- With a keen eye for uncovering insights in data, I specialize in transforming raw information into actionable intelligence using a combination of statistical analysis, data visualization, and machine learning techniques.

Tools & technologies:
- Python, PySpark, Airflow, NiFi, AWS, Azure
- NLP, computer vision, deep learning, machine learning, TensorFlow, LLMs, generative AI
- Power BI, Tableau, Looker, Qlik Sense
- React, Angular, Node, Django, JavaScript/TypeScript, MEAN, MERN
- RESTful APIs, GraphQL, FastAPI

Why we excel as the ideal engineering team for your project:
- Committed to cutting-edge technologies, tools, and development patterns to ensure the highest quality standards.
- Use of open-source tools to keep initial capex low.
- Agile methodology to enhance development efficiency.
- Professional task management tools such as Jira, Trello, GitHub, and Slack for a streamlined workflow.
- A certified PMP manager assigned to oversee your project.
- Daily updates, weekly builds, and comprehensive project progress reports.
- End-to-end assistance and post-launch support for ongoing maintenance and optimal performance.
- Dedicated to gathering user feedback and implementing modifications that enhance the functionality and usability of the project.

If you have specific project requirements, or seek dedicated resources or teams to enhance your organization's capabilities, please don't hesitate to reach out or send me an invitation to your job post. I will respond at my earliest convenience. Thank you!
Snowflake, Databricks Platform, Machine Learning Model, Google Cloud Platform, Tableau, Microsoft Power BI, Django Stack, MERN Stack, Natural Language Processing, Data Engineering, Amazon Redshift, AWS Glue, Apache Airflow, Python, PySpark
- $40 hourly
- 4.7/5
- (2 jobs)
Profile overview: Hi there! I'm Anuj, a highly skilled data engineer and architect with over 10 years of full-time experience designing and implementing SQL, ETL, cloud (AWS & Azure), big data, and data warehousing solutions on a variety of on-premise and cloud platforms. During my tenure, I have been associated full-time with renowned companies such as Deloitte, TCS, and Medifast, where I have worked with 10+ clients (pharma, healthcare, manufacturing, and life sciences) and built strong professional relationships. Now I am excited to transition into the world of freelancing.

⭐ "My Upwork profile doesn't yet have reviews," but that's how everyone starts. What sets me apart is my commitment to forging long-term relationships with clients. I am dedicated to helping you grow with your data through continuous, strategic support, rather than just delivering quick, one-off projects. Let's work together to achieve your data-driven goals and drive lasting success.

Key skills and expertise:
✔ SQL: PostgreSQL, Oracle, SQL Server, MySQL, Spark SQL, HQL
✔ Azure: Data Factory, Databricks, Data Lake
✔ Amazon: Redshift, Aurora, EMR, Glue
✔ ETL tools: Microsoft SSIS, Informatica
✔ Big data technologies: PySpark, Pandas, NumPy, Hive
✔ Design: data modelling, data warehousing, architecture, data marts, star and snowflake schemas
✔ Scripting: Python

Feel free to connect.
pandas, NumPy, SQL Programming, Databricks Platform, Amazon Athena, Amazon Redshift, Cloud Services, SQL Server Integration Services, Data Model, AWS Glue, Data Lake, Microsoft Azure, PySpark
- $50 hourly
- 0.0/5
- (0 jobs)
I am a Data Engineer specializing in delivering end-to-end solutions for data capture, transformation, and reporting, helping businesses boost productivity and make informed decisions. I have extensive experience with Azure data tools, including Azure Data Factory, Databricks, Azure Functions, and Microsoft Fabric. I also create interactive, insightful dashboards using Power BI to enhance data visualization and analysis. I take full ownership of projects from start to finish, ensuring clear and frequent communication with regular updates throughout the project lifecycle.
PySpark, C#, Microsoft Power BI, Microsoft Azure, Data Analysis, ETL Pipeline, ETL
- $35 hourly
- 0.0/5
- (0 jobs)
Experienced cloud data engineer with 5+ years of expertise in Apache Spark and Databricks for big data processing on AWS cloud platforms. Proven track record of building scalable data pipelines, optimizing Spark jobs, and delivering high-performance data solutions for real-time analytics and streaming applications. Skilled in Python, SQL, big data, Snowflake, and cloud-native services, with a strong focus on ETL processes and data warehousing. Expertise in AWS data engineering, PySpark, SQL, and Databricks.
Software QA, ETL Pipeline, ETL, Data Extraction, Apache Hadoop, Apache Spark, Amazon API Gateway, Amazon Athena, Amazon Redshift, AWS Lambda, Amazon EC2, AWS Glue, SQL, Databricks Platform, PySpark
- $15 hourly
- 0.0/5
- (0 jobs)
I am a software engineer with 4 years of experience in data engineering, delivering impactful solutions for international clients. Proficient in Python, SQL, and cloud technologies, leveraging tools such as Azure Databricks and Power BI to drive data analytics and streamline processes. Skilled in managing complex environments and releases, ensuring seamless operations and support. I have worked with the Pandas and PySpark libraries to perform ETL processes.
Amazon Web Services, Data Warehousing & ETL Software, Data Engineering, PySpark, Python, SQL, Microsoft Azure
- $50 hourly
- 0.0/5
- (0 jobs)
IT professional offering 15+ years of experience in big data, Spark, Snowflake, cloud migration, Scala, Informatica PowerCenter 9.5, Hive, Impala, Sqoop, Rundeck, SSIS, Oracle 9i/10g, PL/SQL, Teradata, shell scripting, UNIX, and Apache Hadoop.
Apache Hadoop, PySpark, Snowflake, Cloud Application, Data Analysis, Machine Learning, Analytical Presentation, Data Extraction, ETL
- $5 hourly
- 0.0/5
- (0 jobs)
Hello there! As a seasoned professional in data analytics, I am passionate about driving growth and enabling financial institutions to thrive in the digital era. With a strong background in analyzing market trends, identifying business opportunities, and developing innovative strategies, I bring a data-driven approach to optimizing lending processes and enhancing customer experiences. I excel at leading cross-functional teams, collaborating with stakeholders, and translating business requirements into actionable insights. My expertise lies in leveraging cutting-edge technologies and data analytics to drive transformative solutions that address key industry challenges. With an analytical mindset, I help organizations make informed decisions, optimize operations, and achieve sustainable business outcomes.
Data Analytics & Visualization Software, Exploratory Data Analysis, TensorFlow, Python Scikit-Learn, NumPy, pandas, PySpark, PostgreSQL Programming, Microsoft Power BI, Tableau, Alteryx, SQL, Analytical Presentation, Data Analysis, ETL Pipeline
- $20 hourly
- 0.0/5
- (0 jobs)
SUMMARY
Highly efficient architect, data scientist, and technical manager in data science and artificial intelligence with 16 years of experience in IT and 9 years in machine learning, Python, PySpark, deep learning, AI (ChatGPT, Llama 2, LLMs, generative AI, prompt engineering, RAG, MLOps, vector databases), data mining with large sets of structured and unstructured data, data acquisition, data validation, computer vision (OpenCV), statistical modeling, NLP, text mining, predictive modeling, data visualization, and Anaconda (Jupyter, Spyder, RStudio). 5 years of US experience with Dell (Tenet Healthcare), Infosys (Cummins and AT&T), and Deloitte (PayPal).
* Adept in statistical programming languages such as R, PySpark, and Python.
* Excellent hands-on experience in statistical procedures and machine learning algorithms such as ANOVA, clustering, and regression analysis to analyze data for model building.
* Experience in data science QA.
* Excellent hands-on experience in Anaconda notebooks (Jupyter, Spyder, RStudio); experienced in data mining, loading, and analyzing unstructured data (XML, JSON, flat file formats) in Hadoop.
* Expertise in Excel macros, pivot tables, VLOOKUPs, and other advanced functions; expert R user with knowledge of SAS.
* Excellent hands-on experience in the Hadoop ecosystem: HDFS, PySpark, and Hive.
* Hands-on experience in natural language processing (NLP), including sentiment analysis.
* Hands-on experience in random forests, decision trees, linear and logistic regression, SVM, and clustering.
* Hands-on experience in deep learning: LSTM, RNN, and CNN neural networks.
* Extensive experience in data visualization, producing tables, graphs, and listings with tools such as Tableau.
* Extensive experience in artificial intelligence: MLOps, LLMs, OpenAI, ChatGPT, Llama 2, RAG, vector databases, prompt engineering, and generative AI.
Microsoft Azure, NLP Tokenization, Deep Learning, ChatGPT, LLM Prompt Engineering, Tableau, PySpark, Python, Data Analysis, Analytical Presentation, Data Mining, Machine Learning, Machine Learning Model, Artificial Intelligence
- $10 hourly
- 0.0/5
- (0 jobs)
MDM Analyst | 4+ Years of Experience | Specializing in Databricks, SQL, Python & PySpark

Experienced Master Data Management (MDM) Analyst with over 4 years of expertise in designing, developing, and maintaining enterprise-level data management solutions. Proficient in implementing scalable MDM frameworks and driving data quality initiatives that align with organizational objectives. Skilled in leveraging modern data platforms such as Databricks, and advanced tools including SQL, Python, and PySpark, to process large datasets, enable data integration, and support analytics-driven decision-making. Known for a strong analytical mindset, a collaborative approach, and a deep understanding of data governance, metadata management, and master data architecture.
Informatica Data Quality, Master Data Management, Transact-SQL, Oracle, Databricks Platform, PostgreSQL, PySpark, Python
- $20 hourly
- 0.0/5
- (0 jobs)
SUMMARY
Data engineer with 5 years of experience specializing in SQL Server, PostgreSQL, Azure Data Factory (ADF), PySpark, Python, and Tableau. Expertise in optimizing database performance, query tuning, and developing efficient data pipelines. Proven track record in building scalable ETL solutions and automating data workflows. Skilled in creating high-performance, data-driven applications to support business analytics and decision-making.
ACHIEVEMENTS
Successfully migrated a large volume of data to a new database, minimizing downtime and ensuring data integrity. Designed and developed a high-performance database that increased query speed by 50%.
Oracle PLSQL, Databricks Platform, MySQL, PostgreSQL, Python, Database Development, Database, ETL Pipeline, Microsoft Azure SQL Database, PySpark
- $15 hourly
- 0.0/5
- (0 jobs)
Business-focused data and technology expert with 7 years of expertise in the Life Sciences and Pharmacovigilance (PV) domain, specializing in Data Warehousing, Business Intelligence, Data Marts, and Data Modeling. I have a strong background in Data Virtualization, Data Visualization, Orchestration, and Data Fabric/Mesh. As a PV Consultant, I excel in developing and validating standard and ad-hoc reports for PBRER, DSUR, Signal Detection, and Submissions, utilizing tools like Power BI, SAP BusinessObjects, IBM Cognos, and Logi Analytics.
Data Engineering, Dashboard, Analytics Dashboard, Amazon QuickSight, GitHub, Pharmacovigilance, Cognos, Microsoft Power BI, SQL, dbt, Microsoft Azure, Databricks Platform, PySpark, Snowflake, ETL Pipeline
- $25 hourly
- 0.0/5
- (0 jobs)
SUMMARY
Data analytics engineer with 3+ years of experience building data pipelines, analytics, and data warehousing in supply chain and healthcare. Skilled in SQL, Python, and dashboard creation to drive data-driven decisions. Strong communicator, adept at translating complex solutions for cross-functional teams to deliver strategic insights.
Microsoft Power BI, Tableau, Apache Airflow, PySpark, Snowflake, Microsoft Azure, Data Modeling, Data Visualization, Python, SQL, ETL Pipeline, Data Analysis, ETL
- $30 hourly
- 0.0/5
- (0 jobs)
I'm a seasoned data engineer with 7+ years of experience in cloud-based data solutions. Proficient in Python, SQL, and cloud platforms such as Azure and AWS, I specialize in designing intricate data pipelines and optimizing ETL processes to drive business insights. My skills also extend to ETL tools such as Dataiku and Apache Airflow, along with machine learning expertise.

Notable projects and achievements:
- Microsoft: engineered an event streaming solution using PySpark on Synapse, processing an Azure Event Hub source into a Delta Lake target.
- Microsoft: built a data quality solution with LangChain and OpenAI that generates PySpark code from English prompts.
- Freelancing: architected an Amazon S3 migration for an agriculture-sector client using Dagster, PySpark, and SQL.
- Genpact: led the migration of semi-structured JSON data to Snowflake using Python, SQL, and S3 buckets.
- Udemy: achieved 98% accuracy in breast cancer classification using an SVM model, with data normalization and grid search.

Skills: Python, SQL, PySpark, Microsoft Fabric, ETL, Apache Dagster, Apache Kafka, LangChain, large language models, Apache Airflow, machine learning, AWS, Dataiku, Azure DevOps, Apache Spark Streaming, Azure, CI/CD, Synapse, Event Hub
Azure Cosmos DB, Microsoft Azure SQL Database, Azure DevOps, Microsoft Azure, PySpark, Unix, SQL, Python, Classification, Machine Learning, Linear Regression, Logistic Regression, Databricks Platform
- $20 hourly
- 0.0/5
- (1 job)
With over 7 years of comprehensive IT experience, I possess a versatile skill set that encompasses both backend development, leveraging technologies such as PHP, Laravel, Java, and Python, and frontend and mobile development, including Android app development, iOS app development, and cross-platform solutions. My collaborative approach extends to a dedicated team of experts with diverse technology stacks, including big data, AI, ML, and more. Driven by self-motivation, I consistently embrace opportunities for learning and growth, demonstrating an unwavering commitment to staying at the forefront of emerging technologies.
Mobile App, API, Desktop Application, Big Data, Web Design, PySpark, HTML, Web Application, Laravel, Android App Development, PHP, Java, Flutter, iOS Development, Python
- $25 hourly
- 0.0/5
- (0 jobs)
Enthusiastic data engineer with 6 years of experience. Here to learn, enhance my skills, and build greater stability in my career.
Oracle Integration Cloud Service, PySpark, Cloud Services, Tableau, Microsoft Excel, Python, SQL
- $12 hourly
- 0.0/5
- (0 jobs)
I’m a data engineer experienced in building data platforms and data ingestion frameworks for small- to large-scale clients. Skills: Microsoft Azure, Databricks, PySpark, SQL, OOP programming, Python.
Data Modeling, Fivetran, Snowflake, Microsoft Azure, Data Cleaning, Data Engineering, SQL, ETL Pipeline, Databricks MLflow, Databricks Platform, PySpark
- $30 hourly
- 0.0/5
- (0 jobs)
I have over 20 years of industry experience, with demonstrated expertise in designing and implementing innovative end-to-end solutions that leverage cutting-edge technologies to drive business success. I have an in-depth understanding of big data technologies, including Hadoop, Spark, NoSQL databases, and data warehousing; extensive experience in designing and implementing scalable, high-performance systems for data processing, analytics, and predictive modeling; and proficiency in cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).
PySpark, Apache HBase, Hive, Kubernetes, Docker, Microsoft Azure, Google Cloud Platform, AWS Development, Apache Spark, Apache NiFi, MySQL, Java, Scala, Akka, Big Data
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.
How do I hire a PySpark Developer near Noida, IN on Upwork?
You can hire a PySpark Developer near Noida, IN on Upwork in four simple steps:
- Create a job post tailored to your PySpark Developer project scope. We’ll walk you through the process step by step.
- Browse top PySpark Developer talent on Upwork and invite them to your project.
- Once the proposals start flowing in, create a shortlist of top PySpark Developer profiles and interview your favorites.
- Hire the right PySpark Developer for your project from Upwork, the world’s largest work marketplace.
At Upwork, we believe talent staffing should be easy.
How much does it cost to hire a PySpark Developer?
Rates charged by PySpark Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.
Why hire a PySpark Developer near Noida, IN on Upwork?
As the world’s work marketplace, we connect highly skilled freelance PySpark Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream PySpark Developer team you need to succeed.
Can I hire a PySpark Developer near Noida, IN within 24 hours on Upwork?
Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive PySpark Developer proposals within 24 hours of posting a job description.