Hire the best Amazon Redshift Developers in Pune, IN

Check out Amazon Redshift Developers in Pune, IN with the skills you need for your next job.
  • $40 hourly
    Experienced AWS-certified Data Engineer with around 4 years of experience in big data and related tools. AWS | GCP | Hadoop | HDFS | Hive | Sqoop | Apache Airflow | Apache Spark | Apache Kafka | Apache NiFi | Apache Iceberg | Python | Bash | SQL | PySpark | Scala | Delta Lake | DataStage | Git | Jenkins | SnapLogic | Snowflake.
    Amazon Redshift
    Amazon API Gateway
    Apache Spark
    Google Cloud Platform
    Apache Kafka
    Apache Airflow
    Big Data
    Data Migration
    Apache NiFi
    Amazon Web Services
    PySpark
    AWS Lambda
    AWS Glue
    ETL
    Python
    SQL
  • $20 hourly
    - Senior Software Engineer with 8+ years of experience building data-intensive applications and tackling challenging architecture and scalability problems.
    - Skilled at delivering analytical and technical solutions in line with customer requirements.
    - Hands-on experience in data engineering functions including, but not limited to, data extraction, transformation, loading, and integration in support of enterprise data infrastructures such as data warehouses, operational data stores, and master data management.
    - Takes ownership of delivery: coordinating with relevant stakeholders, providing daily status updates to the client, and working with the testing team and other teams to fix bugs.
    - 3+ years of relevant experience in big data analytics and big data handling using Hadoop ecosystem tools such as Hive, HDFS, Spark, Sqoop, and YARN.
    - Extensive experience on AWS, with hands-on use of S3, Glue, Lambda and Step Functions, RDS (Aurora), and Redshift; applied knowledge of EMR, EC2, SNS, SQS, and CloudWatch.
    - More than 3 years of experience handling and processing unstructured and structured data using Python, PySpark, and SQL.
    - Comprehensive knowledge of query building and expertise with RDBMS systems: MS SQL Server, MySQL, PostgreSQL, and Oracle (PL/SQL).
    - Strong experience creating ETL pipelines using tools such as Talend Studio DI and big data platforms.
    - Proficient in analyzing requirements and architecture specifications to create detailed designs, and in providing technical advice, training, and mentoring to other associates in a lead capacity.
    - Good understanding of machine learning and statistical analysis.
    - Sound knowledge of the retail domain; beginner-level and growing knowledge of the insurance domain.
    - Participates in deployment releases and release-readiness reviews, and maintains the release repository.
    - Interacts with clients to gather requirements, participates in PI planning, and ensures timely completion of projects; analyzes and designs project requirements, the various modules, and their functionality.
    - Hands-on experience handling on-site applications and taking part in client review meetings and brainstorming sessions with the technical team, team lead, and product delivery manager.
    - Strong analytical skills that often surface requirement gaps early, which helps ensure timely delivery and avoid production issues.
    Amazon Redshift
    Data Extraction
    Data Scraping
    MongoDB
    Generative AI
    LLM Prompt Engineering
    PostgreSQL Programming
    Data Warehousing & ETL Software
    Big Data
    Amazon Web Services
    PySpark
    Databricks Platform
    Machine Learning
    Python
    Apache Hadoop
    SQL
    Talend Open Studio
    Data Migration
  • $32 hourly
    - 1.5 years of working experience in the IT industry.
    - Worked on tasks such as ETL, data migration, data extraction, data cleaning, data processing, data pipeline creation, and data analysis using Python, pandas, and SQL queries.
    - Experienced in creating data pipelines using Python, pandas, and PySpark.
    - Experienced with various AWS and GCP services: AWS S3, RDS, AWS Glue, Kinesis, Athena, SNS, Redshift, Lambda, API, DynamoDB, AWS Step Functions, GCP Cloud Storage, GCP BigQuery, and Airflow.
    - Experienced in creating, scheduling, and monitoring workflows using Glue Workflows, and in deploying jobs.
    - Good knowledge of data analysis and workflow design.
    - Experienced with large-scale data processing frameworks such as PySpark.
    - Experienced with various Python integrated development environments, including PyCharm, Jupyter Notebook, and Anaconda.
    - Team player who can also work independently as an individual contributor.
    - Quick learner of new technologies and applications.
    Amazon Redshift
    BigQuery
    Amazon S3
    AWS Glue
    AWS Lambda
    API
    PySpark
    MongoDB
    Amazon EC2
    Amazon Athena
    Amazon DynamoDB
    Apache Airflow
    ETL Pipeline
    SQL
    Python
  • $80 hourly
    SUMMARY: I am an experienced engineering leader with 25+ years of success architecting and delivering software products and solutions across diverse domains, including marketing, customer experience, finance, logistics, telecom, and retail. I have helped organizations by leading teams that successfully deliver software products that enhance business results.
    Amazon Redshift
    Software Architecture
    Cloud Architecture
    Jenkins
    IBM Unica Campaign
    Software
    SaaS
    Database
    Kubernetes
    Docker
    Apache Kafka
    AWS Application
    Java
  • $80 hourly
    I am a professional data engineer with close to 3 years of experience in the IT industry and expertise working with top clients in the pharma domain. On the technical side, I have broad experience with big data engineering tools such as PySpark and Hadoop. I also have experience working with DBMSs and have developed many big data pipeline projects from scratch using the latest technologies.
    Amazon Redshift
    PySpark
    ETL
    EMR Data Entry
    Amazon S3
    Microsoft Excel
    SQL
    Python
  • $25 hourly
    10+ years of experience in full-stack and full-service data engineering, data visualization, data science, and AI/ML with enterprise clients such as Walmart, Procter & Gamble, Amazon, Johnson & Johnson, etc., as well as SMEs like TheCreditPros, StructuredWeb, NorthCoastMedical, PoshPeanuts, EffectiveSpend, etc.
    Domain experience:
    - Retail and e-commerce
    - Banking and financial services
    - Telecom
    - Sports & gaming
    - Operations / ERP / CRM analytics
    Tools expertise:
    Data engineering / ETL / data pipelines / data warehousing:
    - Talend Studio, Stitchdata, Denodo, Fivetran, CloverDX
    - AWS (Glue, RDS, Redshift, Step Functions, Lambda)
    - Azure (ADF, Data Lake, ADLS Gen2)
    - GCP (Cloud Composer, Cloud Functions, BigQuery)
    - SQL, MongoDB, dbt
    - ETL through REST and SOAP APIs for Salesforce, NetSuite, Fulfil, Pardot, Facebook, LinkedIn, Twitter, Instagram, Google AdWords, Yahoo Gemini, Bing Ads, Google Analytics, Zendesk, Mailchimp, Zoho, Five9, etc.
    - Data streaming (Apache Spark, Flink, Flume, Kafka)
    - API/webhook design with Python FastAPI + Uvicorn
    - Twilio, Asterisk
    Data visualization / business intelligence:
    - Power BI (Pro, Premium, Embedded, Report Server; DAX / Power Query M)
    - Tableau (Prep, Cloud, Server, AI, Pulse; functions / LOD expressions)
    - Looker (Studio, Pro, LookML)
    - Qlik Sense
    - DOMO
    Voicebots and chatbots:
    - NLP / LLMs (natural language processing / large language models)
    - ChatGPT, Deepgram, LangChain, Llama 2, Falcon, DeBERTa, T5, BERT
    Speech engineering tools and techniques:
    - Kaldi / SpeechBrain / Whisper / NVIDIA Riva / ESPnet / Bark
    - AWS, GCP, and Azure ASR & TTS; Amazon Polly, Transcribe, Translate
    - Automated speech recognition, speaker diarization, wake-word detection, speech biometrics, intent recognition, speaker separation
    Amazon Redshift
    Talend Open Studio
    Snowflake
    BigQuery
    Qlik Sense
    dbt
    AWS Glue
    Microsoft Azure
    QlikView
    Automatic Speech Recognition
    SQL
    Tableau
    Apache Spark
    Microsoft Power BI
    Databricks Platform
  • $50 hourly
    I'm Akshay Aghade, a highly skilled and dedicated Data Engineer with over 3.1 years of experience in the IT industry. I'm passionate about leveraging my expertise in PySpark and cloud services, specifically Microsoft Azure and Amazon Web Services, to enhance data processing and analysis. My experience includes:
    - Leading key projects, collaborating with cross-functional teams, and reporting directly to co-founders, demonstrating my level of responsibility and accountability.
    - Proficiency in Azure, specializing in migrating data from HANA databases to Azure Data Lake Storage.
    - Handling structured and unstructured data from diverse sources, using PySpark to write data-extraction logic.
    - A versatile background encompassing development, implementation, and support roles in various domains.
    - Extensive knowledge of Microsoft Azure, covering tools such as Databricks, Data Factory, and Data Lake Storage, as well as familiarity with Amazon Web Services.
    Amazon Redshift
    Scripting
    Amazon Web Services
    Database
    ETL Pipeline
    Data Lake
    Microsoft Azure
    AWS Lambda
    AWS Glue
    PySpark
    Adobe Spark
    SQL
    Python
  • $15 hourly
    With over a decade of hands-on experience in the field of big data, I am a seasoned professional dedicated to leveraging cutting-edge technologies to solve complex business challenges. Throughout my career, I have cultivated a deep understanding of various big data technologies and frameworks, including Hadoop, Spark, HBase, and Kafka.
    Amazon Redshift
    PySpark
    Akka
    Apache HBase
    REST API
    AWS Lambda
    Apache Airflow
    AWS Glue
    Scala
    Apache Spark
  • $10 hourly
    Detail-oriented Senior Data Engineer with 5+ years of experience designing, developing, and implementing data solutions for large-scale businesses. Skilled in SQL, Python, PySpark, Apache Spark, and Hadoop. Proven track record of improving data accuracy and reducing processing time through innovative data warehousing solutions.
    Amazon Redshift
    Jira
    Jenkins
    Bitbucket
    Apache Airflow
    Snowflake
    Amazon S3
    Databricks Platform
    AWS Glue
    SQL
    Python
    PySpark
  • $20 hourly
    CLOUD ENGINEER: AWS Cloud Engineer with 4 years of experience in the installation, configuration, and administration of clusters. Proficient with components such as EC2, VPC, IAM, CloudWatch, AWS Glue, Redshift, Athena, Lake Formation, S3, and SNS. Proven ability to monitor, load-balance, auto-scale, troubleshoot, and optimize clusters for better performance and reliability as part of a team. Works with AWS as a cloud environment along with CDP.
    Expertise:
    - AWS (EC2, IAM, S3, VPC, RDS, SNS, SQS, Glue, Athena, Redshift, etc.)
    - CDH, CDP (upgrades, migrations)
    - AWS | GCP | Azure
    - Hadoop components (HDFS, YARN, Hive, Spark, Kafka, MapReduce, EMR)
    - Linux (Ubuntu, CentOS, Red Hat, etc.)
    - Agile methodology
    - Kerberos, Bash scripting, Python scripting
    - JIRA, SQL, Terraform
    Languages: English, Hindi
    Amazon Redshift
    Docker
    Kubernetes
    Jira
    Jenkins
    CI/CD
    DevOps
    Architectural Design
    MySQL
    Amazon CloudWatch
    Amazon S3
    Amazon Athena
    Problem Solving
    AWS Glue
    Amazon Web Services
  • $40 hourly
    I have over 10 years of application development experience focused on data analytics, data warehousing, database administration, business analysis, dashboard preparation, back-end database development, and data integration. I have been responsible for the design and development of BI solutions using technologies like the Microsoft BI stack. During my tenure, I have worked in various domains, including US health insurance, retail, education, and pharma. Most of that time I spent on US healthcare projects, and I am very comfortable with healthcare terminology. Main areas of expertise are:
    - Database design and development (MS SQL, MySQL, Snowflake)
    - Building BI solutions (SSAS Tabular Model, Kibana)
    - ETL process development (SSIS, Python, Snowflake)
    - Data warehouse design and development
    - OLAP cube development
    - Report development (SQL Server Reporting Services, Power BI, Qlik Sense)
    - Python development
    Amazon Redshift
    Microsoft Excel
    Data Analysis Expressions
    Python
    Microsoft SQL Server Reporting Services
    SQL Server Integration Services
    Qlik Sense
    Microsoft Power BI
    Data Analytics
    Microsoft Azure
    Microsoft Azure SQL Database
    Snowflake
    Azure DevOps
    SQL
    Database
  • $15 hourly
    Experienced Business Intelligence Developer with a strong background in Python and SQL. Successfully developed reports and BI solutions for various e-commerce platforms, including Shopify, WooCommerce, Magento, Amazon, and eBay. Expertise includes working with Amazon Redshift for efficient data warehousing, query optimization, and data analysis. Proven track record of translating complex business requirements into actionable insights, driving innovation, and fostering a data-driven culture.
    Amazon Redshift
    Analytics
    Python Script
    Python Scikit-Learn
    Data Analytics
    pandas
    SQL Programming
    SQL
    Python
    Business Intelligence
  • $28 hourly
    I’m a developer experienced in building data pipelines, visualisations, and analytics for small and medium-sized businesses. Whether you’re trying to win work, analyse your work, find issues with your data, or build an end-to-end ETL solution with visualisation, I can help. I know OLAP and OLTP, ETL, ELT, big data, analytics, and visualisation, and offer full data management from start to finish. Regular communication is important to me, so let’s keep in touch.
    Amazon Redshift
    RabbitMQ
    Python
    Online Transaction Processing
    Online Analytical Processing
    Time Series Analysis
    BigQuery
    ETL Pipeline
    Apache Airflow
    Logstash
    Redis
    Apache Beam
    Tableau
    Metabase
    Data Analytics & Visualization Software
  • $15 hourly
    I am an experienced software developer with nearly 10 years of experience in the IT industry, working with technologies such as the Informatica PowerCenter ETL tool; SQL database programming on Oracle, IBM DB2, MySQL, AWS Redshift, PostgreSQL, and Snowflake; and the Python programming language. I am officially Oracle certified as well as Snowflake certified. I have also worked with scheduling tools like Autosys, Automic, Control-M, and Maestro, and I have basic experience in Unix scripting. I have worked in various domains, including banking and finance, insurance, healthcare, airlines, and retail store chains. I am happy to work with clients in different domains and help them solve their real-world programming problems. I can also help train people in the computer languages they are interested in. Regular communication is important to me, so let’s keep in touch.
    Amazon Redshift
    Teradata
    Microsoft SQL Server
    PostgreSQL Programming
    IBM Db2
    Oracle
    Oracle PLSQL
    ETL
    Python Script
    SQL Programming
    SQL
    ETL Pipeline
    Python
    Snowflake
    Informatica

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Amazon Redshift Developer near Pune on Upwork?

You can hire an Amazon Redshift Developer near Pune on Upwork in four simple steps:

  • Create a job post tailored to your Amazon Redshift Developer project scope. We’ll walk you through the process step by step.
  • Browse top Amazon Redshift Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Amazon Redshift Developer profiles and interview.
  • Hire the right Amazon Redshift Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Amazon Redshift Developer?

Rates charged by Amazon Redshift Developers on Upwork vary based on a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Amazon Redshift Developer near Pune on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Amazon Redshift Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Amazon Redshift Developer team you need to succeed.

Can I hire an Amazon Redshift Developer near Pune within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Amazon Redshift Developer proposals within 24 hours of posting a job description.