Hire the best Hadoop Developers & Programmers in Brazil
Check out Hadoop Developers & Programmers in Brazil with the skills you need for your next job.
- $55 hourly
- 5.0/5
- (43 jobs)
I focus on data engineering, software engineering, ETL/ELT, SQL reporting, high-volume data flows, and the development of robust APIs in Java and Scala. I prioritize three key elements: reliability, efficiency, and simplicity. I hold a Bachelor's degree in Information Systems from Pontifícia Universidade Católica do Rio Grande do Sul, as well as graduate degrees in Software Engineering from Infnet/FGV and in Data Science (Big Data) from IGTI. In addition to my academic qualifications, I hold the following certifications:
- Databricks Certified Data Engineer Professional
- AWS Certified Solutions Architect – Associate
- Databricks Certified Associate Developer for Apache Spark 3.0
- AWS Certified Cloud Practitioner
- Databricks Certified Data Engineer Associate
- Academy Accreditation - Databricks Lakehouse Fundamentals
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: DP-200 Implementing an Azure Data Solution
- Microsoft Certified: DP-201 Designing an Azure Data Solution
- Microsoft Certified: Azure Data Fundamentals
- Microsoft Certified: Azure Fundamentals
- Cloudera CCA Spark and Hadoop Developer
- Oracle Certified Professional, Java SE 6 Programmer
My professional journey has been marked by deep involvement in Big Data solutions. I have fine-tuned my skills with Apache Spark, Apache Flink, Hadoop, and a range of associated technologies such as HBase, Cassandra, MongoDB, Ignite, MapReduce, Apache Pig, Apache Crunch, and RHadoop. Initially I worked extensively in on-premise environments, but over the past five years my focus has shifted predominantly to cloud-based platforms: I have dedicated over two years to mastering Azure and am currently immersed in AWS. I have extensive experience with Linux environments, along with strong knowledge of programming languages such as Scala (8+ years) and Java (15+ years).
Earlier in my career, I worked with Java web and Java EE applications, primarily on the WebLogic application server with databases such as SQL Server, MySQL, and Oracle.
Skills: Hadoop, Scala, Apache Solr, Apache Kafka, Apache Spark, Bash Programming, Elasticsearch, Java, Progress Chef, Apache Flink, Apache HBase, Apache Hadoop, MapReduce, MongoDB, Docker
- $45 hourly
- 5.0/5
- (3 jobs)
Hello! My name is Thales, I'm 23 years old, and I currently work as a Data Engineer on an infrastructure and engineering team, mainly developing automated solutions for business problems. An important moment in the formation of my personality was a year-long exchange spent living with three different families in Leavenworth, WA, United States, where I met countless people from the most diverse countries and cultures. I work in areas involving the Extraction, Transformation, and Loading (ETL) of data, among others, using process automation techniques and data-based studies to aid decision making. Contact: thalesaknunes22@gmail.com / +55 11 98115-9558
Skills: Hadoop, Linux, pandas, Office 365, Machine Learning, Alteryx, Looker Studio, Tableau, Python, Apache Hadoop, SQL, Data Engineering
- $12 hourly
- 5.0/5
- (3 jobs)
Software Developer with a bachelor's degree in Computer Science and graduate studies in Big Data and Data Science. With over 12 years of experience in software development, I have specialized in processing large volumes of data for the last 8 years, using technologies such as Hadoop, Spark, and Scala.
Skills: Hadoop, Progress Chef, Python, Snowflake, Databricks Platform, Scala, DevOps, Big Data, Apache Hadoop, Apache Spark, Java, SQL
- $30 hourly
- 0.0/5
- (0 jobs)
Data Engineer with 4+ years of experience in automation, data collection, data transformation, dashboard creation, and simple webpage creation.
My Expertise: Data Cleaning | Database Management | Automation
Automation: I have extensive experience in automating data pipelines and processes using Python, SQL, and Hadoop. I can help you streamline your data workflows and save time and resources.
Data Collection: I can collect data from a variety of sources, including web APIs, databases, and files, and help you clean and prepare your data for analysis.
Data Transformation: I can transform your data into the format you need for analysis and visualization, and create custom data pipelines to meet your specific needs.
Dashboard Creation: I can create interactive dashboards to visualize your data and communicate your findings to stakeholders, using tools including Tableau, Power BI, Google Data Studio, and Plotly Dash.
Webpage Creation: I can create responsive, user-friendly webpages to display your data and findings using HTML, CSS, and Bootstrap.
Document Conversion: tables from PDF to CSV or a database; PDF to Word, Excel, Google Sheets, or image.
My Skills:
Databases: MongoDB, DynamoDB, PostgreSQL, RDS, MySQL, Redshift, Oracle, SQL Server
Programming Languages: Python, SQL, R
Big Data Tools: Hadoop, Spark, Hive, Pig
Cloud Platforms: AWS, Azure, GCP
Web Development: HTML, CSS, Bootstrap, JavaScript
Data Visualization: Tableau, Power BI, Google Data Studio
Skills: Hadoop, Web Scraping, Microsoft Azure, Google Cloud Platform, Amazon Web Services, ChatGPT API, Microsoft Power BI, Dashboard, SQL, Automation Anywhere, Data Extraction, pandas, Django, Apache Hadoop, Snowflake, Python
- $40 hourly
- 0.0/5
- (0 jobs)
Professional with extensive experience in Big Data, Data Engineering, Data Science, Data Modeling, Databases, SQL, BI, and Project Management. Big Data Analysis MBA from FIA - USP, an institution internationally recognized in the Financial Times ranking. Over 23 years of experience in information technology and several programming languages. Holder of the PMP (Project Management Professional) credential, a globally recognized project management certification from PMI (Project Management Institute), as well as two post-graduate degrees in Project Management from PUC - RJ.
Skills: Hadoop, Apache Spark, GitHub, Big Data, Apache Hive, Scrum, Jupyter Notebook, Apache Hadoop, Apache Airflow, BigQuery, Google Cloud Platform, Microsoft Azure, Databricks Platform, PySpark, Python, SQL
- $25 hourly
- 0.0/5
- (0 jobs)
I am a Data & Analytics professional with experience in data engineering and data analysis, having worked at large companies such as Itaú Unibanco and Azul Linhas Aéreas. If you need solutions for data processing, pipeline construction (ETL/ELT), cloud migration, or the creation of dashboards and strategic analyses, I can help.
My skills include:
- Development and optimization of data pipelines using tools such as Apache Spark, PySpark, AWS Glue, and AWS Athena.
- Migration of legacy processes to the cloud (AWS), with a focus on efficiency and scalability.
- Data analysis with Python, SQL, and Power BI, generating valuable insights for decision making.
- Web scraping and automated data collection to enrich analyses.
- Code documentation and optimization, ensuring clarity and maintainability.
I have worked on projects such as:
- Building and maintaining data pipelines on AWS using S3, Glue, and Athena.
- Developing interactive Power BI dashboards for monitoring metrics and results.
- Migrating fraud-prevention processes to the cloud, with a focus on Big Data and Data Mesh.
I believe clear, frequent communication is essential to the success of any project, so I commit to keeping you updated at every stage, from planning to final delivery. If you are looking for a professional who combines technology, innovation, and results, let's talk! I am ready to help your company turn data into strategic decisions.
Skills: Hadoop, SQL, Python, Apache Impala, Apache Hadoop, AWS Glue, Apache Spark, Data Analysis, Data Extraction, ETL, ETL Pipeline
- $15 hourly
- 0.0/5
- (0 jobs)
I'm Jean Pierre, I'm 20 years old, and I currently work at F1rst Technology, where I have developed skills in programming with Python, PySpark, SQL, Hadoop, and other data engineering technologies. I study Information Security at Universidade Zumbi dos Palmares and intend to pursue a career in technology as a software or data engineer, constantly seeking to learn and evolve. We are working on a collaborative project, and I will be working alongside my classmates. Together, we form a multidisciplinary team with the following skills:
Programming languages: MySQL, Python, PySpark, HTML, CSS, C#, Java, JavaScript, and Shell Script.
Technologies: Databricks, Azure, Power BI, Git, GitHub, SQL, Pentaho, Excel, PowerPoint, and Linux.
To learn more about each team member, visit their profiles:
Gabriel Enare: [profile link]
Enzo Coutinho: upwork.com/freelancers/~010534c53862be04fd?mp_source=share
Marcos Vitor: upwork.com/freelancers/~012328ba13481defa4
We are available to answer any questions and collaborate on challenging projects.
Best regards, Jean Pierre
Skills: Hadoop, PySpark, C#, Linux, API Development, Web Development, Microsoft Power BI, Python, R Hadoop, Big Data, SQL Programming, SQL, Apache Hadoop, Database
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.