Hire the best Hadoop Developers & Programmers in Brazil

Check out Hadoop Developers & Programmers in Brazil with the skills you need for your next job.
  • $55 hourly
    I focus on data engineering, software engineering, ETL/ELT, SQL reporting, high-volume data flows, and the development of robust APIs using Java and Scala. I prioritize three key elements: reliability, efficiency, and simplicity.
    I hold a Bachelor's degree in Information Systems from Pontifícia Universidade Católica do Rio Grande do Sul, as well as graduate degrees in Software Engineering from Infnet/FGV and Data Science (Big Data) from IGTI. In addition to my academic qualifications, I have earned the following certifications:
    - Databricks Certified Data Engineer Professional
    - AWS Certified Solutions Architect – Associate
    - Databricks Certified Associate Developer for Apache Spark 3.0
    - AWS Certified Cloud Practitioner
    - Databricks Certified Data Engineer Associate
    - Academy Accreditation - Databricks Lakehouse Fundamentals
    - Microsoft Certified: Azure Data Engineer Associate
    - Microsoft Certified: DP-200 Implementing an Azure Data Solution
    - Microsoft Certified: DP-201 Designing an Azure Data Solution
    - Microsoft Certified: Azure Data Fundamentals
    - Microsoft Certified: Azure Fundamentals
    - Cloudera CCA Spark and Hadoop Developer
    - Oracle Certified Professional, Java SE 6 Programmer
    My professional journey has been marked by deep involvement in Big Data solutions. I have fine-tuned my skills with Apache Spark, Apache Flink, Hadoop, and a range of associated technologies such as HBase, Cassandra, MongoDB, Ignite, MapReduce, Apache Pig, Apache Crunch, and RHadoop. I initially worked extensively in on-premise environments, but over the past five years my focus has shifted predominantly to cloud-based platforms: I have dedicated over two years to mastering Azure and am currently immersed in AWS. I have extensive experience with Linux environments, along with strong knowledge of Scala (8+ years) and Java (15+ years). Earlier in my career I worked on Java web and Java EE applications, primarily leveraging the WebLogic application server and databases such as SQL Server, MySQL, and Oracle.
    Featured Skill Hadoop
    Scala
    Apache Solr
    Apache Kafka
    Apache Spark
    Bash Programming
    Elasticsearch
    Java
    Progress Chef
    Apache Flink
    Apache HBase
    Apache Hadoop
    MapReduce
    MongoDB
    Docker
  • $45 hourly
    Hello! My name is Thales, I'm 23 years old, and I currently work as a Data Engineer on an infrastructure and engineering team, mainly developing automated solutions for business problems. A formative experience was a year-long exchange in Leavenworth, WA, United States, where I lived with three different families and met people from many different countries and cultures. I work in areas involving the extraction, transformation, and loading (ETL) of data, applying process automation techniques and data-driven analysis to support decision making. Contact: thalesaknunes22@gmail.com +55 11 98115-9558
    Featured Skill Hadoop
    Linux
    pandas
    Office 365
    Machine Learning
    Alteryx, Inc.
    Looker Studio
    Tableau
    Python
    Apache Hadoop
    SQL
    Data Engineering
  • $12 hourly
    Software Developer with a bachelor's degree in Computer Science and graduate studies in Big Data and Data Science. Of my 12+ years of experience in software development, I have spent the last 8 specializing in processing large volumes of data with technologies such as Hadoop, Spark, and Scala.
    Featured Skill Hadoop
    Progress Chef
    Python
    Snowflake
    Databricks Platform
    Scala
    DevOps
    Big Data
    Apache Hadoop
    Apache Spark
    Java
    SQL
  • $30 hourly
    Data Engineer with 4+ years of experience in automation, data collection, data transformation, dashboard creation, and simple webpage creation. My expertise: Data Cleaning | Database Management | Automation: I have extensive experience automating data pipelines and processes using Python, SQL, and Hadoop, and I can help you streamline your data workflows to save time and resources. Data Collection: I collect data from a variety of sources, including web APIs, databases, and files, and can clean and prepare your data for analysis. Data Transformation: I transform your data into the format you need for analysis and visualization, and can build custom data pipelines to meet your specific needs. Dashboard Creation: I create interactive dashboards to visualize your data and communicate your findings to stakeholders, using tools such as Tableau, Power BI, Google Data Studio, and Plotly Dash. Webpage Creation: I build responsive, user-friendly webpages with HTML, CSS, and Bootstrap to display your data and findings. I also convert tables from PDF to CSV or a database, and PDF to Word, Excel, Google Sheets, or images. My skills: Databases: MongoDB, DynamoDB, PostgreSQL, RDS, MySQL, Redshift, Oracle, SQL Server. Programming Languages: Python, SQL, R. Big Data Tools: Hadoop, Spark, Hive, Pig. Cloud Platforms: AWS, Azure, GCP. Web Development: HTML, CSS, Bootstrap, JavaScript. Data Visualization: Tableau, Power BI, Google Data Studio.
    Featured Skill Hadoop
    Web Scraping
    Microsoft Azure
    Google Cloud Platform
    Amazon Web Services
    ChatGPT API
    Microsoft Power BI
    Dashboard
    SQL
    Automation Anywhere
    Data Extraction
    pandas
    Django
    Apache Hadoop
    Snowflake
    Python
  • $40 hourly
    Professional with extensive experience in Big Data, Data Engineering, Data Science, Data Modeling, databases, SQL, BI, and Project Management. Big Data Analysis MBA from FIA - USP, an institution internationally recognized in the Financial Times ranking. Over 23 years of experience in information technology, spanning several programming languages. Holder of the PMP (Project Management Professional) certification, a globally recognized credential from PMI (Project Management Institute), and of two postgraduate degrees in Project Management from PUC - RJ.
    Featured Skill Hadoop
    Apache Spark
    GitHub
    Big Data
    Apache Hive
    Scrum
    Jupyter Notebook
    Apache Hadoop
    Apache Airflow
    BigQuery
    Google Cloud Platform
    Microsoft Azure
    Databricks Platform
    PySpark
    Python
    SQL
  • $25 hourly
    I am a Data & Analytics professional with experience in data engineering and data analysis, having worked at major companies such as Itaú Unibanco and Azul Linhas Aéreas. If you need solutions for data processing, pipeline construction (ETL/ELT), cloud migration, or the creation of dashboards and strategic analyses, I can help. My skills include: development and optimization of data pipelines using tools such as Apache Spark, PySpark, AWS Glue, and AWS Athena; migration of legacy processes to the cloud (AWS), with a focus on efficiency and scalability; data analysis with Python, SQL, and Power BI, generating valuable insights for decision making; web scraping and automated data collection to enrich analyses; and code documentation and optimization, ensuring clarity and maintainability. I have worked on projects such as: building and maintaining data pipelines on AWS using S3, Glue, and Athena; developing interactive Power BI dashboards for monitoring metrics and results; and migrating fraud-prevention processes to the cloud, with a focus on Big Data and Data Mesh. I believe clear, frequent communication is essential to the success of any project, so I commit to keeping you updated at every stage, from planning to final delivery. If you are looking for a professional who combines technology, innovation, and results, let's talk! I am ready to help your company turn data into strategic decisions.
    Featured Skill Hadoop
    SQL
    Python
    Apache Impala
    Apache Hadoop
    AWS Glue
    Apache Spark
    Data Analysis
    Data Extraction
    ETL
    ETL Pipeline
  • $15 hourly
    I'm Jean Pierre, I'm 20 years old, and I currently work at F1rst Technology, where I have developed skills in Python, PySpark, SQL, Hadoop, and other data engineering technologies. I study Information Security at Universidade Zumbi dos Palmares and intend to pursue a career in technology as a software and data engineer, constantly learning and evolving. I am working on a collaborative project alongside my classmates; together, we form a multidisciplinary team with the following skills: Programming languages: MySQL, Python, PySpark, HTML, CSS, C#, Java, JavaScript, and Shell Script. Technologies: Databricks, Azure, Power BI, Git, GitHub, SQL, Pentaho, Excel, PowerPoint, and Linux. To learn more about each team member, visit their profiles: Gabriel Enare: [profile link] Enzo Coutinho: upwork.com/freelancers/~010534c53862be04fd?mp_source=share Marcos Vitor: upwork.com/freelancers/~012328ba13481defa4 We are available to answer any questions and collaborate on challenging projects. Best regards, Jean Pierre
    Featured Skill Hadoop
    PySpark
    C#
    Linux
    API Development
    Web Development
    Microsoft Power BI
    Python
    R Hadoop
    Big Data
    SQL Programming
    SQL
    Apache Hadoop
    Database

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.