Hire the best Hadoop Developers & Programmers in Chennai, IN
Check out Hadoop Developers & Programmers in Chennai, IN with the skills you need for your next job.
- $50 hourly
- 5.0/5
- (19 jobs)
Hi, I'm Rajesh, a Senior SaaS Developer & Data Engineer with expertise in Python, Java, Scala, and cloud technologies (GCP, AWS, Azure AI). I've built and scaled AI-powered applications, developed RAG-based chatbots, and designed large-scale data pipelines. As a Founding Engineer at Labrador AI, I led backend architecture, payment integrations, and DevOps. I'm passionate about solving complex problems, mentoring, and scaling businesses with AI-driven solutions.
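Several profiles on this page list MapReduce among their skills. The programming model itself can be sketched in a few lines of plain Python, with no Hadoop cluster involved. This is an illustrative word count (the canonical MapReduce example), not code from any freelancer's portfolio:

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in a line of text.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle phase: group values by key, as Hadoop does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Reduce phase: sum the counts for one word.
    return key, sum(values)

def word_count(lines):
    mapped = chain.from_iterable(mapper(line) for line in lines)
    return dict(reducer(k, v) for k, v in shuffle(mapped).items())

print(word_count(["big data big pipelines", "data"]))
# {'big': 2, 'data': 2, 'pipelines': 1}
```

In real Hadoop, the same three phases run distributed across a cluster, with the framework handling the shuffle over the network.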
Skills: ETL Pipeline, Data Science, Database Architecture, Kubernetes, MySQL, Apache Kafka, Django, Akka HTTP, Angular, Scala, Apache Hadoop, Python, MapReduce, Java
- $35 hourly
- 5.0/5
- (32 jobs)
Seasoned data engineer with over 11 years of experience building sophisticated and reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years. 🏆 The main areas of expertise are:
📍 Big Data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, Trino, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR
📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL
📍 Google Cloud Services - GCP Dataproc
📍 Search Engine - Apache Solr
📍 NoSQL - HBase, Cassandra, MongoDB
📍 Platform - Data Warehousing, Data Lake
📍 Visualization - Power BI
📍 Distributions - Cloudera
📍 DevOps - Jenkins
📍 Accelerators - Data Quality, Data Curation, Data Catalog
Skills: SQL, AWS Glue, PySpark, Apache Cassandra, ETL Pipeline, Apache Hive, Apache NiFi, Apache Kafka, Big Data, Apache Hadoop, Scala, Apache Spark
- $100 hourly
- 4.8/5
- (2 jobs)
AI and Cloud Data Engineer with over 15 years of practical experience in the banking and networking domains. Well-versed in defining requirements, designing solutions, and building them at enterprise grade. A passionate programmer and quick troubleshooter. Strong grasp of Java, Python, Big Data technologies, data engineering/analysis, and cloud computing.
Skills: Apache Beam, Apache Flink, Apache Spark, Data Science, Microsoft Power BI, Data Mining, Apache Hadoop, ETL, Python, Data Extraction
- $50 hourly
- 0.0/5
- (0 jobs)
* Extensive experience in professional software development for challenging, large-scale, multi-tiered, distributed software applications in diverse domains. * Followed Agile Scrum methodology, Test-Driven Development (TDD), and Continuous Integration (CI). * Exceptional problem-solving skills; self-motivated, a good team player, and a quick learner. * Clear understanding of business-driven software development processes. Flexible, with a proven ability to work well under pressure to meet aggressive deadlines. * Results-driven, highly motivated professional with exceptional success in managing small to major projects from conceptualization through implementation. * Expertise in creating SOA-based web services using technologies like JAXB, SOAP, REST, etc.
Skills: Database, Problem Solving, Java Persistence API, Spring Integration, Spring Framework, Apache Hadoop, SOAP, REST API, EJB, Java
- $16 hourly
- 5.0/5
- (4 jobs)
If you are looking for an end-to-end data developer, from extracting and cleaning the data through to visualization, I'm the guy. Please feel free to reach out to me.
TECHNICAL SKILLS
Big Data platforms: AWS EMR, Huawei FusionInsight, Cloudera, Azure Databricks
Big Data technologies: HDFS, Hive, Spark Core, Spark SQL, Oozie, Spark Framework
Streaming (real-time): Spark Streaming (DStream & Structured Streaming) & Kafka
Languages: Scala, Python (PySpark), SQL, UNIX shell scripting
Development tools: Scala IDE, IntelliJ
Cloud: AWS Data Services, Azure Data Lake, Azure Data Factory, Azure Databricks
Databases: Apache Hudi, Snowflake, Azure SQL DB, MS SQL Server, Oracle 11g
ETL: Talend Data Fabric (DI and Big Data), TAC, Talend Cloud, TMC, Informatica PowerCenter, Informatica Cloud (IICS & ICAI), SSIS
CI/CD: Jenkins, Nexus, Maven, Git
Visualization: Tableau
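The extract-clean-visualize flow this profile describes can be sketched at toy scale in plain Python. The CSV sample and field names below are invented for illustration; a production pipeline would run the same steps in Spark or Talend rather than in-memory Python:

```python
import csv
import io
from statistics import mean

# Hypothetical raw extract with messy rows, as a pipeline might pull from a source system.
raw = """region,amount
south, 120
south,
north,80
north,100
"""

def extract(text):
    # Extract: parse the delimited feed into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def clean(rows):
    # Clean: drop rows with missing amounts and strip stray whitespace.
    return [
        {"region": r["region"].strip(), "amount": float(r["amount"])}
        for r in rows
        if r["amount"] and r["amount"].strip()
    ]

def summarize(rows):
    # Aggregate: average amount per region, ready to feed a dashboard.
    regions = {r["region"] for r in rows}
    return {
        reg: mean(r["amount"] for r in rows if r["region"] == reg)
        for reg in regions
    }

print(summarize(clean(extract(raw))))
```

Each stage takes the previous stage's output, which is the same shape a real ETL job has regardless of scale.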
Skills: Apache Hive, Microsoft SQL Server Programming, Talend Open Studio, Apache Spark, Scala, Microsoft Azure, Snowflake, Talend Data Integration, SQL, Informatica Cloud, Oracle Accounting, HDFS, Microsoft Azure SQL Database, Informatica, Apache Hadoop, Databricks Platform, Apache Kafka
- $25 hourly
- 0.0/5
- (0 jobs)
Hi, I am a professional developer with working experience in PHP/MySQL, WordPress, the Laravel framework, and Java web applications with Oracle and SQL databases as the backend.
Skills: Apache Hadoop, C, End User Technical Support, Medical Billing, QA Testing, Software Testing, Performance Testing, Regression Testing, Functional Testing, Web Testing, Manual Testing, MySQL Programming, Tableau, Laravel, SQL, WordPress, PHP
- $20 hourly
- 5.0/5
- (3 jobs)
Data Integration & Big Data Specialist | Talend, Spark, Airflow, AWS Expert
I am a seasoned Data Integration and Big Data Specialist with over 9 years of experience at Sopra Steria, delivering cutting-edge solutions using Hadoop, Talend, Spark, Airflow, and cloud technologies. My expertise lies in designing, implementing, and optimizing robust data pipelines and systems to handle complex business requirements.
Key Strengths & Skills
- Data Engineering Expertise: Proficient in processing large datasets, both structured and semi-structured, with advanced Spark performance tuning to maximize efficiency.
- Cloud Specialist: Extensive experience in AWS services, including Redshift, S3, EMR, EC2, and Lambda, ensuring scalable and secure data processing.
- ETL Mastery: Designed and developed highly efficient Talend jobs for ETL processes, with a focus on performance optimization and scheduling in EC2.
- Orchestration & Automation: Expert in Apache Airflow for orchestration, designing complex DAGs, and automating data validations using shell scripting.
- CI/CD Implementation: Skilled in implementing CI/CD pipelines using Jenkins, streamlining deployment processes for enhanced efficiency.
- Cross-System Data Sync: Expertise in synchronizing data between AWS Redshift and Netezza.
Notable Achievements
- Spark Optimization: Successfully tuned Spark jobs for large-scale data processing on Amazon EMR clusters, achieving significant performance improvements.
- Automation Initiatives: Designed and automated complex data validation frameworks, reducing manual intervention and increasing accuracy.
- Client Collaboration: Participated in sprint planning meetings, design discussions, and user story clarifications, ensuring smooth development and testing phases.
- Production Deployment: Led the deployment of Talend and Spark jobs in production clusters with meticulous configuration and monitoring.
Education & Certifications
B.Tech - Information Technology
With a passion for solving data challenges and a proven track record of delivering impactful solutions, I am ready to contribute to your project with efficiency and precision. Let's collaborate to turn your vision into reality!
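The DAG design work mentioned in this profile can be illustrated without Airflow itself: a pipeline's task dependencies reduce to a topological ordering, which Python's standard-library graphlib computes. The task names (extract_s3, transform_spark, etc.) are hypothetical, chosen only to mirror a typical extract-transform-load chain; this is a minimal sketch of the scheduling concept, not Airflow code:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG wires extract -> validate -> transform -> load.
dag = {
    "extract_s3": set(),
    "validate": {"extract_s3"},
    "transform_spark": {"validate"},
    "load_redshift": {"transform_spark"},
    "notify": {"load_redshift"},
}

# static_order() yields tasks so that every task runs after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
# ['extract_s3', 'validate', 'transform_spark', 'load_redshift', 'notify']
```

Airflow does the same resolution per scheduling interval, additionally running independent branches in parallel and retrying failed tasks.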
Skills: Apache Spark, Apache Hadoop, Data Warehousing & ETL Software, ETL Pipeline, Apache Airflow, Amazon EC2, Amazon Web Services, Talend Open Studio, Amazon Redshift, Talend Data Integration
- $15 hourly
- 0.0/5
- (0 jobs)
Skilled Data Engineer with over 2 years 5 months of experience. Expertise in building and optimizing data pipelines, ETL processes, and scalable data solutions. Strong problem-solving abilities with a focus on delivering actionable insights through big data technologies.
Skills: Data Warehousing, Snowflake, dbt, Apache Hadoop, Hive, Apache Kafka, Microsoft Power BI, Tableau, Microsoft Azure, PySpark, Python, SQL, MySQL, Microsoft SQL Server, SQL Server Integration Services
- $15 hourly
- 0.0/5
- (0 jobs)
Dedicated Specialist Programmer at Infosys with around 4 years of experience as a Big Data developer. Driven by work focusing on automation and technology integrations, adept at understanding stakeholder needs, and committed to delivering impactful and robust solutions. I approach solutions as a starting point for continuous improvement.
Skills: Unix, SQL, Java, Python, Apache Hadoop, PySpark, ETL
- $12 hourly
- 0.0/5
- (0 jobs)
DATA SCIENTIST | AI & MACHINE LEARNING EXPERT | DEEP LEARNING ENTHUSIAST
WHAT'S SO SPECIAL ABOUT ME? Data Scientist with expertise in Deep Learning, Machine Learning, and AI. Proficient in Python, R, and statistical analysis, with hands-on experience in neural networks, NLP, and computer vision. Earned a Master's in Data Science (2024) from Women's Christian College and completed an internship at Inofii Technologies.
2nd Place Hackathon Winner | AI & Deep Learning Enthusiast | Quick Learner
Passionate about solving complex problems through data-driven insights. Ready to drive innovation and impact with AI-powered solutions.
Skills: Python, R, Apache Hadoop, MongoDB, SQL, ETL, Data Extraction, Artificial Intelligence, Machine Learning Model, Data Mining
- $15 hourly
- 5.0/5
- (2 jobs)
Overall 6 years of experience in the IT industry, with 4 years of relevant experience as a Big Data Engineer, handling and transforming heterogeneous data into key information using the Hadoop ecosystem.
- Expertise with the tools in the Hadoop ecosystem: HDFS, Hive, Sqoop, Spark, Kafka, NiFi.
- Experience working with Elasticsearch and Kibana, and good knowledge of Oozie, HBase, and Phoenix.
- Good understanding of distributed systems, HDFS architecture, and the internal workings of the MapReduce, YARN, and Spark processing frameworks.
- More than two years of hands-on experience using the Spark framework with Scala.
- Expertise in inbound and outbound (importing/exporting) data from/to traditional RDBMSs using Apache Sqoop.
- Extensively worked on HiveQL and join operations, writing custom UDFs, with good experience optimizing Hive queries.
- Experience in data processing (collecting, aggregating, and moving data from various sources) using Apache NiFi and Kafka.
- Worked with various file formats: delimited text files, JSON files, and XML files.
- Basic knowledge of Amazon Web Services.
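The delimited/JSON/XML ingestion this profile mentions can be sketched with Python's standard library alone. The record and field names here are invented for illustration, and a real pipeline would do this in Spark or NiFi at scale; the point is only that all three formats normalize to one schema before loading downstream:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# The same hypothetical record in three formats a pipeline might ingest.
delimited = "id|name\n1|alice\n"
json_doc = '{"id": 1, "name": "alice"}'
xml_doc = "<user><id>1</id><name>alice</name></user>"

# Delimited text: pipe-separated, parsed with csv.DictReader.
row = next(csv.DictReader(io.StringIO(delimited), delimiter="|"))
from_delimited = {"id": int(row["id"]), "name": row["name"]}

# JSON: already structured, just parsed.
from_json = json.loads(json_doc)

# XML: element tree walked into the same dict shape.
root = ET.fromstring(xml_doc)
from_xml = {"id": int(root.findtext("id")), "name": root.findtext("name")}

# All three sources now share one schema, ready for a common load step.
assert from_delimited == from_json == from_xml
print(from_json)
```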
Skills: Data Lake, AWS CloudFormation, AWS Glue, Elasticsearch, Kibana, Sqoop, Apache NiFi, PySpark, Scala, SQL, Apache Hadoop, Apache Kafka, Apache Hive, Apache Spark
- $25 hourly
- 0.0/5
- (1 job)
I am a highly skilled data scientist with extensive experience in Natural Language Processing (NLP), Deep Learning, and Big Data technologies. With a proven track record of delivering intelligent solutions for various business applications, I specialize in:
NLP: Expertise in text analysis, sentiment analysis, chatbots, language modeling, and other NLP tasks using cutting-edge tools such as spaCy, Hugging Face, and TensorFlow. I can help extract meaningful insights from unstructured data and enhance customer interactions.
Deep Learning: Proficient in building and optimizing neural networks, including CNNs, RNNs, and transformers. I work with frameworks like TensorFlow and PyTorch to create state-of-the-art models for image recognition, natural language understanding, and predictive analytics.
Big Data: Skilled in handling large datasets using Hadoop, Spark, and cloud-based solutions (AWS, GCP). I can help design, build, and scale big data architectures that drive business decisions and innovation.
I offer end-to-end support, from data preprocessing to model deployment, with a focus on delivering high-performance, scalable solutions. Let's work together to turn your data into valuable assets!
Skills: Hive, R Hadoop, Apache Hadoop, MongoDB, Tableau, SAS, Python, Machine Learning
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.
How do I hire a Hadoop Developer & Programmer near Chennai, IN on Upwork?
You can hire a Hadoop Developer & Programmer near Chennai, IN on Upwork in four simple steps:
- Create a job post tailored to your Hadoop Developer & Programmer project scope. We’ll walk you through the process step by step.
- Browse top Hadoop Developer & Programmer talent on Upwork and invite them to your project.
- Once the proposals start flowing in, create a shortlist of top Hadoop Developer & Programmer profiles and interview.
- Hire the right Hadoop Developer & Programmer for your project from Upwork, the world’s largest work marketplace.
At Upwork, we believe talent staffing should be easy.
How much does it cost to hire a Hadoop Developer & Programmer?
Rates charged by Hadoop Developers & Programmers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.
Why hire a Hadoop Developer & Programmer near Chennai, IN on Upwork?
As the world's work marketplace, we connect highly skilled freelance Hadoop Developers & Programmers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Hadoop Developer & Programmer team you need to succeed.
Can I hire a Hadoop Developer & Programmer near Chennai, IN within 24 hours on Upwork?
Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Hadoop Developer & Programmer proposals within 24 hours of posting a job description.