Hire the best Apache Hive Developers in Chennai, IN

Check out Apache Hive Developers in Chennai, IN with the skills you need for your next job.
  • $80 hourly
    I have around 20 years of software development experience using Java and Python. Throughout my career I have worked at startups at various stages of growth.
    Languages: Java, C, Python. Tools/frameworks: Apache Hadoop, Hive, Spark, Spring Boot, Apache Tomcat, Apache Airflow, Apache Falcon, Apache Oozie, Flask, React JS, Python pandas, Kubernetes. Cloud technologies: AWS Elastic Beanstalk, AWS Lambda, Athena, AWS S3, Amazon Redshift, AWS Managed Airflow, EKS, MSK, Snowflake. Operating systems: Unix, Linux (Red Hat, Ubuntu, CentOS, Fedora). IDEs: Eclipse, IntelliJ.
    Experience building applications in the NMS/EMS, online video advertising, big data, and recommendation systems domains. Built DSPs, RTB bidders, ad networks, and SSP & ad exchange integrations for video advertising using OpenRTB, delivering ads through VAST inline and wrapper responses: frequency capping, pacing, budgeting, forecasting, dayparting, user/cookie sync, and targeting (geo, content, site/app, publisher, time, segment).
    Programming experience in backend API and big data application development. Good knowledge of AWS cloud solutions such as EC2, S3, Elastic Beanstalk, Lambda, and Redshift. Worked closely with application development and data science teams to integrate machine learning models into production systems. Experienced in leading a highly collaborative engineering team. Built data platforms and data pipelines for processing large volumes of data using big data technologies, including systems that handle billions of requests per day.
    Apache Hive
    Data Management
    Big Data
    Core Java
    Web Crawling
    Spring Boot
    ETL Pipeline
    API Development
    Apache Airflow
    pandas
    Apache Spark
    Python
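The profile above mentions ad-serving controls such as frequency capping and pacing. As a rough illustration only (not this freelancer's actual code; the class, cap, and window values are hypothetical), a minimal in-memory frequency cap could be sketched like this:

```python
from collections import defaultdict

class FrequencyCap:
    """Allow at most `cap` ad impressions per user within a sliding
    window of `window` seconds. In-memory sketch only; a production
    bidder would typically use a shared store such as Redis."""

    def __init__(self, cap, window):
        self.cap = cap
        self.window = window
        self.impressions = defaultdict(list)  # user_id -> timestamps

    def allow(self, user_id, now):
        # Drop impressions that have aged out of the sliding window.
        recent = [t for t in self.impressions[user_id] if now - t < self.window]
        self.impressions[user_id] = recent
        if len(recent) >= self.cap:
            return False
        recent.append(now)
        return True

fc = FrequencyCap(cap=2, window=60)
print(fc.allow("u1", now=0))    # True
print(fc.allow("u1", now=10))   # True
print(fc.allow("u1", now=20))   # False: cap reached within the window
print(fc.allow("u1", now=70))   # True: first impression aged out
```

Pacing and budgeting follow the same shape, with spend counters in place of impression timestamps.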
  • $35 hourly
    Seasoned, solution-oriented engineer with 10 years of experience designing and implementing robust systems. Highly experienced in near-real-time streaming analytics, distributed microservices architecture, and reactive systems. Has worked across multiple areas of development, from design and coding to performance tuning, customer issues, and cost-saving automation.
    Apache Hive
    Apache Spark
    Cloudera
    MySQL
    RESTful Architecture
    Java
    Kubernetes
    Python
    Terraform
    MongoDB
    Cloud Architecture
    Analytics
    NGINX
    Google Cloud Platform
    Apache Kafka
    Apache Airflow
    Spring Boot
  • $30 hourly
    Seasoned data engineer with over 11 years of experience building sophisticated, reliable ETL applications on Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years. 🏆 The main areas of expertise are:
    📍 Big data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
    📍 AWS cloud services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR
    📍 Azure cloud services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL
    📍 Google cloud services - GCP Dataproc
    📍 Search engine - Apache Solr
    📍 NoSQL - HBase, Cassandra, MongoDB
    📍 Platform - Data warehousing, data lakes
    📍 Visualization - Power BI
    📍 Distributions - Cloudera
    📍 DevOps - Jenkins
    📍 Accelerators - Data quality, data curation, data catalog
    Apache Hive
    SQL
    AWS Glue
    PySpark
    Apache Cassandra
    ETL Pipeline
    Apache NiFi
    Apache Kafka
    Big Data
    Apache Hadoop
    Scala
    Apache Spark
  • $40 hourly
    I am a professional with around 13 years of experience in software engineering, with substantial expertise in Big Data technologies.
    * Good experience in Spark, NoSQL, Kafka, Spark Streaming, Hive, and MySQL.
    * Good experience in Python, with basic knowledge of Core Java and Go.
    * Big Data development, data modeling, end-to-end data orchestration frameworks, and business intelligence reporting.
    * Directed the full development cycle of solution design, development, testing, deployment, and documentation for complex data-analytics projects and products; built and delivered a comprehensive data strategy road map; ensured final deliverables were of the highest quality.
    * Proficient with batch data processing, real-time streaming, data cleansing, and data quality checks.
    Apache Hive
    Apache Hadoop
    Java
    MySQL
    Apache Cassandra
    Apache Kafka
    Golang
    Python
    Apache Spark
    Problem Solving
    Big Data
    Architectural Design
    Data Engineering
    Solution Architecture
  • $60 hourly
    Certified AWS, GCP, and Azure data engineer with 9.5 years of experience in data engineering, spanning both traditional and big data analytics. I have worked in the AWS big data engineering domain for 5 years across the following services: OpenSearch, EMR, Glue, EC2, S3, VPC, SNS, SQS, Kinesis, Redshift, Lambda, etc. I have 6+ years of experience in the big data domain, exclusively in Spark and Hive. I have worked across multiple traditional databases, such as Oracle, SQL Server, and Netezza, and am well versed in SQL queries.
    Apache Hive
    Amazon Athena
    Azure DevOps
    Amazon S3
    AWS Lambda
    Amazon Redshift
    Snowflake
    Databricks Platform
    Apache Spark
    Apache Kafka
    Apache NiFi
    SQL
    AWS Glue
    Scala
    Apache Airflow
    Python
  • $15 hourly
    Overall 6 years of experience in the IT industry, with 4 years of relevant experience as a Big Data engineer, handling and transforming heterogeneous data into key information using the Hadoop ecosystem.
    - Expertise with Hadoop ecosystem tools: HDFS, Hive, Sqoop, Spark, Kafka, NiFi.
    - Experience working with Elasticsearch and Kibana, and good knowledge of Oozie, HBase, and Phoenix.
    - Good understanding of distributed systems, HDFS architecture, and the internal workings of the MapReduce, YARN, and Spark processing frameworks.
    - More than two years of hands-on experience using the Spark framework with Scala.
    - Expertise in inbound and outbound (importing/exporting) data from/to traditional RDBMSs using Apache Sqoop.
    - Extensively worked with HiveQL and join operations, writing custom UDFs, with good experience optimizing Hive queries.
    - Experience in data processing, such as collecting, aggregating, and moving data from various sources using Apache NiFi and Kafka.
    - Worked with various file formats: delimited text files, JSON files, XML files.
    - Basic knowledge of Amazon Web Services.
    Apache Hive
    Elasticsearch
    Kibana
    Sqoop
    Apache NiFi
    PySpark
    Scala
    SQL
    Apache Hadoop
    Apache Kafka
    Apache Spark
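The profile above mentions writing custom Hive UDFs and optimizing Hive queries. Besides Java UDFs, Hive can also stream rows through an external script via `SELECT TRANSFORM ... USING`. A minimal, hypothetical Python sketch of such a script (the table and column names are invented for illustration) might look like:

```python
import sys

# Hive TRANSFORM scripts receive rows on stdin as tab-separated fields
# and emit tab-separated rows on stdout. Invoked from HiveQL roughly as:
#   SELECT TRANSFORM (name, amount) USING 'python3 normalize.py'
#          AS (name STRING, amount_cents INT) FROM sales;
# (Table, columns, and filename here are hypothetical.)

def transform(line):
    name, amount = line.rstrip("\n").split("\t")
    # Normalize: trim/lowercase the name, convert a decimal amount to cents.
    return "{}\t{}".format(name.strip().lower(), int(round(float(amount) * 100)))

# In production the script simply loops over stdin:
#   for line in sys.stdin:
#       print(transform(line))
print(transform("Alice\t12.50"))  # alice	1250
```

Streaming scripts like this trade some performance against Java UDFs for ease of iteration, since no jar build or `ADD JAR` step is needed.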
  • $16 hourly
    Experienced data engineer with a strong background in designing and optimizing data solutions. Skilled in creating scalable data architectures, implementing advanced analytics, and ensuring data integrity for strategic decision-making. Committed to driving efficiency and innovation through collaborative problem-solving and the use of cutting-edge technology.
    TECHNICAL SKILLS
    Big data platforms: AWS EMR, Huawei FusionInsight, Cloudera, Azure Databricks
    Big data technologies: HDFS, Hive, Spark Core, Spark SQL, Oozie, Spark framework
    Streaming (real-time): Spark Streaming (DStream & Structured Streaming) & Kafka
    Languages: Scala, Python (PySpark), SQL, UNIX shell scripting
    Development tools: Scala IDE, IntelliJ
    Cloud: AWS data services, Azure Data Lake, Azure Data Factory, Azure Databricks
    Databases: Apache Hudi, Snowflake, Azure SQL DB, MS SQL Server, Oracle 11g
    ETL: Talend Data Fabric (DI and Big Data), TAC, Talend Cloud, TMC, Informatica PowerCenter, Informatica Cloud (IICS & ICAI), SSIS
    CI/CD: Jenkins, Nexus, Maven, Git
    Apache Hive
    Microsoft SQL Server Programming
    Talend Open Studio
    Apache Spark
    Scala
    Microsoft Azure
    Snowflake
    Talend Data Integration
    SQL
    Informatica Cloud
    Oracle Accounting
    HDFS
    Microsoft Azure SQL Database
    Informatica
    Apache Hadoop
    Databricks Platform
    Apache Kafka
  • $50 hourly
    Freelancer and technical lead with 12+ years of overall experience: vast experience in SQL, SSRS, VB, and VBA (9+ years) and moderate experience in Spark and Hive (3+ years). Experienced in application development, report design, and big data handling.
    Apache Hive
    SQL Server Reporting Services
    SAP Crystal Reports
    SQL
    Apache Spark
    Visual Basic for Applications
    Visual Basic
  • $25 hourly
    Presently a data engineer with more than 10 years of hands-on experience building batch, streaming, and replication ingestion pipelines. I recently earned the Google Professional Data Engineer certification and the AWS Certified Data Analytics professional certification. I'm an expert in implementing advanced algorithms and integrating them within project architecture, as well as developing applications against various NoSQL databases. I also redesigned a critical ingestion pipeline, which increased the volume of processed data by 50%. This is why I am certain I make a perfect candidate for a senior data engineer position.
    Describe your experience with data engineering within ingestion: I have 8+ years of work experience in data engineering, creating batch/streaming and replication ingestion pipelines. I have utilized many technologies, such as Spark/Scala/Python, Kafka, Flink, Spark Structured Streaming, and Airflow. I have worked on more than 10 end-to-end ingestion pipelines, applying data catalog, data quality, and test data management. I have a solid 3+ years of experience in the cloud (AWS, GCP & Azure) creating ingestion pipelines, and have connected and migrated many sources, such as Teradata, Netezza, Salesforce, Snowflake, RDBMS, and cloud. For the last two years I have also worked as a solution architect, providing end-to-end solutions. I am also good at DevOps, and am certified in AWS, Azure, and GCP.
    Apache Hive
    Unix Shell
    Sqoop
    Apache HBase
    Databricks Platform
    Apache Spark
    Java
    Python
    Scala
    Apache Kafka
    SQL
    Apache Hadoop
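The profile above mentions applying data-quality checks within ingestion pipelines. As a rough, hypothetical sketch of that pattern (the field names and rules are invented for illustration, not taken from any freelancer's work), an ingestion-time quality gate that routes bad records to a dead-letter list could look like:

```python
# Hypothetical ingestion-time data-quality gate: valid records continue
# down the pipeline, invalid ones are captured in a dead-letter list for
# inspection -- a common pattern in batch and streaming ingestion.

REQUIRED = ("id", "event_time", "amount")

def quality_gate(records):
    good, dead_letter = [], []
    for rec in records:
        missing = [f for f in REQUIRED if rec.get(f) in (None, "")]
        if missing or not isinstance(rec.get("amount"), (int, float)):
            dead_letter.append({"record": rec, "errors": missing or ["bad amount"]})
        else:
            good.append(rec)
    return good, dead_letter

batch = [
    {"id": 1, "event_time": "2024-01-01T00:00:00Z", "amount": 9.99},
    {"id": 2, "event_time": "", "amount": 5.0},  # missing event_time
]
good, bad = quality_gate(batch)
print(len(good), len(bad))  # 1 1
```

In a real pipeline the same gate would typically run as a Spark or Flink transformation, with the dead-letter records written to a quarantine table or topic rather than an in-memory list.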

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Hive Developer near Chennai on Upwork?

You can hire an Apache Hive Developer near Chennai on Upwork in four simple steps:

  • Create a job post tailored to your Apache Hive Developer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Hive Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Hive Developer profiles and interview.
  • Hire the right Apache Hive Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Hive Developer?

Rates charged by Apache Hive Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Hive Developer near Chennai on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Hive Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Hive Developer team you need to succeed.

Can I hire an Apache Hive Developer near Chennai within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Hive Developer proposals within 24 hours of posting a job description.