Hire the best Big Data Engineers

Check out Big Data Engineers with the skills you need for your next job.
Clients rate Big Data Engineers 4.8/5, based on 1,697 client reviews.
  • US$60 hourly
    ✅ AWS Certified Solutions Architect
    ✅ Google Cloud Certified Professional Data Engineer
    ✅ SnowPro Core Certified Individual
    ✅ Upwork Top Rated Plus professional
    ✅ Author of a Python package for the Currency.com cryptocurrency market (python-currencycom)
    I specialize in Business Intelligence development, ETL development, and API development with Python, Apache Spark, SQL, Airflow, Snowflake, Amazon Redshift, GCP, and AWS. I have completed many projects, both complex and straightforward, including:
    ✪ Highly scalable distributed applications for real-time analytics
    ✪ Data warehouse design and ETL pipeline development for multiple mobile apps
    ✪ Cost optimization for existing cloud infrastructure
    Most importantly, I take responsibility for the final result. A minimal sketch of this kind of ETL pipeline follows the skill list below.
    Big Data
    Data Scraping
    Snowflake
    ETL
    BigQuery
    Amazon Redshift
    Data Engineering
    Cloud Architecture
    Google Cloud Platform
    ETL Pipeline
    Python
    Amazon Web Services
    Apache Airflow
    SQL
    Apache Spark
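For illustration only: a minimal Apache Airflow DAG sketching the kind of daily ETL pipeline this profile describes. The DAG id, schedule, and task bodies are hypothetical placeholders, it assumes Airflow 2.x, and a real pipeline would load into Snowflake or Redshift rather than printing.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def extract(**context):
    # Placeholder: pull raw records from a source API or object store
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]


def transform(ti, **context):
    # Placeholder: clean the records pulled from the extract task via XCom
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] > 0]


def load(ti, **context):
    # Placeholder: a real DAG would write to Snowflake/Redshift here
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="daily_sales_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```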
  • US$20 hourly
    Learning is also a kind of earning. I learned web analytics and completed 200+ projects on Fiverr and Upwork, then fell in love with Data Science and studied it thoroughly, alongside a master's degree in Data Science. The future belongs to those who keep learning new skills, so I believe that when we stop learning, we start dying. My existing skills, with short descriptions, are given below.
    Web analytics: Certified and experienced in web analytics using Matomo and Google Analytics.
    Web development: PHP frameworks are my home turf, and I have completed many projects in OpenCart, CodeIgniter, Laravel, and WordPress. Please have a look at my portfolio to see the projects.
    Web scraping & crawling: For a search engine development project I crawled and scraped 1 million web documents, and since then I regularly take on scraping, crawling, and automation projects. For automation I use Selenium, Beautiful Soup, and Python.
    API development: Before moving into data science I worked on API development and web scraping using Python.
    Big data: I have rich experience in the Hadoop ecosystem using Apache Hadoop, Apache HBase, Apache Nutch, Apache Hive, Apache Solr, etc. Details are given below.
    ** Apache HBase: storing and retrieving data from HBase and using it with Hadoop and Apache Nutch.
    ** Apache Nutch: how search engines work; developing your own search engine using Apache Nutch as a crawler; customizing Nutch at all phases; tuning Nutch for effective crawls; and more.
    ** Apache Solr: using Solr for indexing, running it in distributed mode, and analyzing indexed data with different Solr clients.
    Matomo analytics: At one time I was the only Matomo-focused developer on Fiverr and Upwork. I have very rich experience in the following tasks (a hedged example of pulling data from the Matomo API follows the skill list below).
    ** Matomo installation, configuration, and integration with your website.
    ** Matomo customization, such as custom dashboard development, i.e., changing the look & feel so you can present it as your own product.
    ** Custom event tracking with or without Matomo Tag Manager.
    ** Custom variables/dimensions, goals, and conversion tracking.
    ** Conversion rate, custom reporting, and GeoIP-based accurate location tracking.
    ** Fetching tracking data through the Matomo API to build a custom dashboard or display insights on another website.
    Big Data
    RESTful API
    Machine Learning
    Data Analysis
    Google Analytics
    Database Design
    Web Analytics
    MySQL
    DApps
    Piwik PRO
    ETL Pipeline
    Python
    Microsoft Excel
    Data Scraping
    Data Mining
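For illustration only: a minimal sketch of pulling visit metrics from the Matomo Reporting API with Python. The instance URL, site ID, and token are placeholders; in practice the token should be kept out of source control (newer Matomo versions recommend sending it via POST rather than in the query string).

```python
import requests

MATOMO_URL = "https://analytics.example.com/index.php"  # placeholder Matomo instance
params = {
    "module": "API",
    "method": "VisitsSummary.get",   # daily visit summary report
    "idSite": 1,                     # placeholder site ID
    "period": "day",
    "date": "last7",                 # last seven days
    "format": "JSON",
    "token_auth": "YOUR_API_TOKEN",  # placeholder; never hard-code real tokens
}

resp = requests.get(MATOMO_URL, params=params, timeout=30)
resp.raise_for_status()

# With period=day and a multi-day date range, Matomo returns a dict keyed by date
for day, metrics in resp.json().items():
    if isinstance(metrics, dict):    # days with no visits may come back as an empty list
        print(day, metrics.get("nb_visits"), metrics.get("nb_actions"))
```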
  • US$38 hourly
    💡 If you want to turn data into actionable insights, plan to put the 5 V's of big data to work, or want to turn your idea into a complete web product, I can help.
    👋 Hi. My name is Prashant and I'm a Computer Engineer.
    💡 My true passion is creating robust, scalable, and cost-effective solutions, mainly using Java and open-source technologies.
    💡 During the last 11 years I have worked with:
    💽 Big Data: Apache Spark, Hadoop, HBase, Hive, Impala, Flume, Sqoop
    🔍 Searching: ElasticSearch, Logstash, Kibana, Lucene, Apache Solr, Filebeat, Winlogbeat
    ☁️ Cloud services: AWS EMR, AWS S3, AWS EC2, AWS RDS, AWS ElasticSearch, AWS Lambda, AWS Redshift
    5-step approach 👣: Requirements Discussion + Prototyping + Visual Design + Backend Development + Support = Success! Usually we customize that process depending on the project's needs and final goals.
    How to start? 🏁 Every product requires a clear roadmap and meaningful discussion to keep everything in check. But first, we need to understand your needs. Let's talk!
    💯 Working with me, you will receive a modern, good-looking application that meets all guidelines with easy navigation, and of course you will have unlimited revisions until you are 100% satisfied with the result.
    Keywords you can use to find me: Java Developer, ElasticSearch Developer, Big Data Developer, Team Lead for Big Data applications, Corporate, IT, Tech, Technology.
    Big Data
    ETL
    Data Visualization
    Amazon Web Services
    SQL
    Amazon EC2
    ETL Pipeline
    Data Integration
    Data Migration
    Logstash
    Apache Kafka
    Elasticsearch
    Apache Hadoop
    Apache Spark
    Core Java
  • US$30 hourly
    My primary interest is working with data and building solutions to complex data problems. I can help with data wrangling and modeling, and with custom ETL jobs and data pipelines built with PySpark, AWS Glue, and similar tools (a minimal PySpark ETL sketch follows the skill list below). With my rich data visualization and dashboarding experience, I can develop insightful dashboards with robust KPI metrics critical to business decision-making, using R Shiny, Tableau, and Power BI. In addition, I can help with related statistical tasks using Excel and R, and with building complex optimization models that address various logistics and scheduling problems.
    Big Data
    Data Modeling
    Dashboard
    A/B Testing
    Data Analysis
    Microsoft Power BI
    RStudio
    Data Visualization
    Microsoft Excel
    SQL
    Tableau
    Mathematics
    Linear Regression
    Python
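For illustration only: a minimal PySpark batch ETL sketch of the kind of job described in this profile. The S3 paths and column names are hypothetical placeholders; an AWS Glue job would follow the same pattern via a GlueContext.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

# Extract: read raw CSV files (placeholder path)
raw = spark.read.csv("s3://my-bucket/raw/sales/", header=True, inferSchema=True)

# Transform: drop incomplete rows and derive a calendar date from the timestamp
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Aggregate daily revenue and order counts for dashboarding
daily = clean.groupBy("order_date").agg(
    F.sum("amount").alias("revenue"),
    F.countDistinct("order_id").alias("orders"),
)

# Load: write partitioned Parquet for downstream BI tools (placeholder path)
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://my-bucket/curated/daily_sales/"
)

spark.stop()
```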
  • US$40 hourly
    Briefly describing myself: I'm a self-motivated, dedicated, and enthusiastic Python programmer with 2 years of production experience creating ETL processes, designing software architecture and functionality, deploying apps on GCP, and more. My determination and eagerness to improve my hard and soft skills help me get familiar with new technologies and work through difficulties. I want to take part in real projects and be as useful there as I can.
    Big Data
    Data Migration
    MongoDB
    G-Cloud
    ETL Pipeline
    Python
    BigQuery
    SQL
    PostgreSQL
    ETL
  • US$60 hourly
    I have completed 59+ jobs, earned $200K, and hold a stellar 5/5 rating on Upwork. My experience represents countless hours spent mastering skills and solving complex problems, ensuring you don't have to navigate these challenges yourself.
    Hire me if you:
    ✅ Want a SWE with strong technical skills
    ✅ Need a Python, Go, or Rust developer
    ✅ Seek to leverage AI for predictive analytics, enhancing data-driven decision-making
    ✅ Require AI-based optimization of existing software for efficiency and scalability
    ✅ Wish to integrate AI and machine learning models to automate tasks and processes
    ✅ Need expert guidance in selecting and implementing the right AI technologies for your project
    ✅ Want a detail-oriented person who asks questions and figures things out on his own
    ✅ Have a requirement in mind but can't yet craft it into a technical format
    ✅ Want advice on which tools or tech to use in your next big project
    ✅ Are stuck on a data modeling problem and need a solution architect
    ✅ Want to optimize a data pipeline
    Don't hire me if you:
    ❌ Have a huge project that needs to be done overnight
    ❌ Have academic work to be done
    About me:
    ⭐️ A data engineer with proven experience in designing and implementing big data solutions
    ⭐️ Skilled in integrating AI technologies to solve complex problems, improve efficiency, and innovate within projects
    ⭐️ A Go developer specializing in microservices
    ⭐️ A certified Data Engineer on AWS technologies
    ⭐️ Will optimize your code in every single commit without mentioning it or charging extra hours
    ⭐️ Diverse experience with start-ups and enterprises has taught me how to work under pressure while staying professional
    Big Data
    Web Scraping
    Microservice
    ETL Pipeline
    Apache Spark
    AI Bot
    OpenAI API
    Artificial Intelligence
    Generative AI
    Large Language Model
    Golang
    Python
  • US$35 hourly
    I am a Big Data engineer / Scala developer with 16 years of experience across the full software development lifecycle. My primary languages are Scala and Java, with experience in development frameworks such as Play, Akka, and Slick. Database competencies include RDBMS (PostgreSQL, MySQL) and Big Data (Spark, Hadoop, HBase, Sqoop, Pig, Hive, Vertica, Kafka). I have expertise working throughout a project's full lifecycle and have worked on every phase, with experience in Agile delivery using practices from Scrum, Kanban, etc. I am keen to learn new technologies and business domains, accept challenges, and can work independently without close assistance. I am open to contract roles at negotiable rates, can assure stability with the company, and am available immediately.
    Buzzwords: Scala, Hadoop, HBase, Hive, ZooKeeper, Vertica, Redis, hand-made DB with off-heap storage, Play Framework, Slick, Akka, Java SE, Java EE, Spring, EJB, Hibernate, MyBatis, Vaadin, Struts, JSTL, JSP, PostgreSQL, PL/SQL, MySQL, JavaScript, Prototype, Dojo, ExtJS, JBoss, Tomcat, IntelliJ IDEA, Eclipse
    Big Data
    Apache HBase
    Vertica
    Akka
    Java
    Scala
    Apache Spark
    Apache Hive
    PostgreSQL
    Apache Hadoop
  • US$50 hourly
    Big Data Consultant, ETL developer, and Data Engineer providing complete Big Data solutions, including data acquisition, storage, transformation, and analysis. I design, implement, and deploy ETL pipelines. I have a degree in Computer Systems Engineering and 3 years of experience in data-driven tasks, specifically Big Data solutions, with business exposure to the banking sector and e-commerce platforms.
    Data Engineering Skills
    Expertise: Hadoop, HDFS, Unix and Linux, Data Warehousing, Data Integration, Data Reconciliation, Data Consolidation, Dimensional Modeling, Shell Scripting, Web Scraping
    Tools & Libraries: Advanced SQL, Spark, Scala, Python, Hadoop, Hive, SSIS, Sqoop, Impala, AWS, Advanced Excel, Redshift
    Databases: Oracle, MS SQL Server, SQLite
    Big Data
    Automation
    Database Programming
    SQL Server Integration Services
    MySQL
    Amazon Web Services
    ETL
    Amazon Redshift
    Data Analysis
    Amazon S3
    SQL
    Python
    ETL Pipeline
    Data Integration
    Apache Spark
  • US$40 hourly
    Greetings, With over 7 years of experience in both technical and commercial domains, I have gained valuable expertise in diverse areas such as web development, software architecture, product design, DevOps engineering, and infrastructure management. My journey has led me to master numerous modern frameworks and libraries. I am deeply inclined towards fostering creativity and embracing continuous improvement in my development endeavors. Throughout my career, I've had the privilege of being part of some remarkable projects that have contributed to my growth and reputation. These experiences have not only added to my skill set but also instilled a sense of accomplishment. I am eager to share my advanced knowledge and insights with you, with the aim of assisting you in achieving your goals and driving your success. I'm eagerly looking forward to the opportunity of connecting with you soon. Regards, Viknesh Subramanian
    Big Data
    Data Warehousing & ETL Software
    Data Warehousing
    Data Visualization
    ETL Pipeline
    Business Intelligence
    RESTful API
    GraphQL
    Serverless Computing
    Cloud Computing
    TypeScript
    Infrastructure as Code
    Amazon Web Services
    Python
    Node.js
  • US$25 hourly
    Hello there! I'm an experienced Cloud Data Engineer and Data Architect with 8 years of industry expertise. I specialize in leveraging cloud technologies to design robust data solutions that drive efficiency and unlock valuable insights. Let's collaborate to transform your data into a strategic asset for your business.
    Big Data
    Python Script
    Data Ingestion
    Cloud Computing
    Data Extraction
    Data Warehousing
    Data Lake
    Microsoft Azure
    PySpark
    Data Migration
    Apache Spark
    Databricks Platform
    Data Engineering
    SQL
    ETL Pipeline
  • US$44 hourly
    If you're considering a development project, talk to our team. We'll walk you through a brainstorming session where we can scope your project and explain how ...
    _Entrepreneurs: Have a big idea but don't have an engineering team in place already? Let us be your one-stop shop. We'll work with you to evaluate your needs, develop a game plan, and create a product you can be proud of.
    _Startups: Need to augment your team with highly specialized members who can integrate seamlessly with you? We can provide an extremely talented, focused team that will work alongside you for incredible results.
    _Enterprises: Can't find the internal resources for a new project or request? Let us help. We understand the scalability and security concerns that come with an enterprise system.
    -Experience in coding and development of front-end as well as back-end technologies.
    -Expertise in designing and developing applications using MVC, ASP.NET, C#, VB.NET, SQL Server 2005/2008/2012, HTML 4/5, JavaScript, WinForms, Entity Framework, and AngularJS. Good experience in XML and XSLT transformation.
    -Expertise in using Language Integrated Query (LINQ) for data manipulation. Good experience in developing reports using Crystal Reports, SSRS, and Data Reports.
    -Experience in the complete project development life cycle (SDLC), including system analysis, design, development, testing, and deployment.
    -Proficient in Model-View-Controller (MVC) architectures using the Rails framework.
    -Extensive experience implementing agile development methodology and Scrum on projects.
    -Developed well-tested, readable, reusable UI interfaces using Ruby, JavaScript, HTML, and CSS on both Windows and Linux systems.
    -Experience in implementing Rails migrations.
    Big Data
    iOS Development
    Android
    AWS IoT Core
    Swift
    Node.js
    Ruby on Rails
    Machine Learning
    React
    OpenCV
    GPT-3
    ArcGIS
    AngularJS
    React Native
    C++
    Java
  • US$55 hourly
    I have more than seven years of hands-on experience in data engineering. My specialities are building data platforms and data pipelines from different sources, and I'm keen to work on end-to-end data pipeline builds on AWS or GCP. I can fix your time- and resource-killing data pipeline issues. Share your gig. Feel the difference.
    I also have expertise in:
    - Full-stack web application development / database design / API development & integration
    - DevOps / Linux server administration / deployment / migrations / hosting
    - Web automation / scraping / crawlers / bots
    Big Data
    PySpark
    API
    AWS Lambda
    Amazon Web Services
    ETL Pipeline
    Apache Spark
    Python
    Scrapy
    Amazon S3
    Data Mining
    AWS Glue
    Apache Airflow
    DevOps
    Docker
    Data Migration
  • US$89 hourly
    I have bandwidth for further development, training, consultancy, and mentoring work. Kieran has passed 20+ Microsoft data-focused exams, many of which are Azure focused. This includes current Azure certification in Fabric Analytics, Data Analysis, Data Engineering, Enterprise Data Analysis, Database Administration, and Administration. Kieran is one of the most endorsed and recommended Microsoft Data Platform consultants in the UK. He is an established trainer, mentor, and public speaker on Power BI and the Microsoft Data Platform stack. Kieran continually invests time and resources in researching how to further optimise the way our customers can increase their business insights using their own data, while striving to spend each day delivering pragmatic solutions that cater for the customer's immediate needs. Kieran is also an expert in creating data analytics solutions that provide advanced analytics over multi-billion-row record sets.
    Big Data
    Microsoft Excel PowerPivot
    SQL Server Integration Services
    Power Query
    Data Modeling
    Microsoft Certified Professional
    Microsoft Power BI Data Visualization
    Microsoft Azure SQL Database
    Data Analysis
    Microsoft Power BI
    Business Intelligence
    Microsoft SQL Server Reporting Services
    Microsoft Power BI Development
    Databricks Platform
    Azure DevOps
    Microsoft Azure
    Transact-SQL
    SQL
  • US$25 hourly
    I am a Top Rated Plus freelancer with over 5 years of experience in Data Science and Data Engineering. I specialize in Python development, ETL processes, API implementation, data pipelines, data warehouses/lakes, and database optimization. I have successfully delivered projects for various industries, including telecommunications, real estate, fintech, and healthcare.
    Key skills:
    Programming: Python, Node.js
    Databases: SQL (MySQL, PostgreSQL), NoSQL (MongoDB)
    ETL Tools: Apache Airflow, Pentaho Data Integration
    Cloud Platforms: AWS (Athena, Lambda, Glue, S3), GCP (BigQuery, Cloud SQL, Compute Engine, Cloud Run)
    Visualization: Power BI, Amazon QuickSight, Databox, Grafana
    Why choose me?
    Proven expertise: Over 5 years of data science and engineering experience.
    Comprehensive skill set: Proficient in a wide range of data engineering tools and technologies.
    Client satisfaction: Top-rated for delivering high-quality work on time.
    Professional approach: Strong analytical skills and a team player.
    Notable projects:
    Telecommunications Data Warehouse (Telecommunications): Built and optimized data warehouses for efficient data storage and retrieval.
    Real Estate Data Integration (Real Estate): Developed pipelines for seamless data integration and reporting.
    Fintech Data Management (Fintech): Managed transactional data, ensuring accuracy and compliance.
    Healthcare Data Management (Healthcare): Handled large healthcare datasets and generated detailed SQL-based reports.
    Automated Facebook Ads Creation (Digital Marketing): Automated the creation of buyer catalogues and product sets using the Facebook Marketing API and Pentaho Data Integration.
    Big Data
    Amazon DynamoDB
    Marketing Analytics
    Business Intelligence
    Data Scraping
    Data Analytics
    API
    Google Cloud Platform
    Amazon Web Services
    Data Extraction
    Analytics
    SQL
    Data Integration
    Python
    Data Engineering
  • US$40 hourly
    **Summary**
    - Top Rated Plus on Upwork with a 100% Job Success Rate
    - Successfully completed 28+ projects, earning $50K+ in revenue
    - 18+ years of experience in full-stack web development, mobile application development, data engineering, and business intelligence
    - Featured certifications from Microsoft, IBM, and Oracle
    **Services / Focus**
    - **Full-Stack Development:** MERN stack (ReactJS, Node.js, Express), Python, and custom CRM systems
    - **Mobile Application Development:** React Native for cross-platform mobile apps (iOS/Android)
    - **Data Engineering:** Google BigQuery, Snowflake, AWS Redshift, Azure Synapse, and ETL pipelines
    - **Business Intelligence:** MS Power BI, Looker Studio, Apache Superset, and OBIEE
    - **DevOps / Cloud Infrastructure:** AWS, Google Cloud, Heroku, and Terraform
    - **Database Design and Management:** PostgreSQL, MySQL, SQL Server, and Oracle
    I specialize in building scalable, data-intensive applications that drive business growth, along with robust and user-friendly mobile applications. My approach is client-centric, ensuring clear communication, well-defined milestones, and solutions tailored to your unique needs. I am passionate about staying at the forefront of technology and applying my expertise to solve complex challenges.
    **Key Competencies:**
    - **Full-Stack Development:** Expertise in ReactJS, Node.js, ExpressJS, and Python for both web and mobile applications.
    - **Mobile Application Development:** Proficient in developing cross-platform mobile apps using React Native, delivering high-performance and visually appealing mobile solutions.
    - **Data Engineering:** Skilled in designing and implementing data pipelines and warehouses, transforming raw data into actionable insights.
    - **API Design and Integration:** Experienced in creating and managing RESTful APIs with a focus on security, scalability, and maintainability.
    - **Agile Development & Project Management:** Adept at managing projects with agile methodologies, ensuring timely delivery aligned with client goals.
    **Professional Philosophy:**
    I am committed to leveraging technology to solve complex problems and deliver user-centric solutions. I approach each project with a focus on understanding the unique challenges and providing solutions that meet both immediate and long-term needs.
    **Recent Niche Projects:**
    - **Data-Driven CRM Development:** Integrated QuickBooks and Fishbowl sales data into a custom CRM tailored to specific sales needs.
    - **Interactive Rates Dashboard:** Developed a ReactJS-based data dashboard with functionalities for PDF and Excel downloads.
    - **Mobile App Development:** Created cross-platform mobile applications using React Native, enhancing user engagement and expanding business reach.
    - **BigQuery Integration:** Designed and implemented ETL pipelines for real-time data integration between Firebase and Google BigQuery.
    **Call to Action:**
    Let's connect to discuss how I can help transform your ideas into reality or optimize your current solutions for technological excellence and operational efficiency.
    **Certifications:**
    - Microsoft Azure Databricks for Data Engineering
    - IBM's Front-End Development with React
    - Oracle Business Intelligence Foundation Suite 11g Certified Implementation Specialist
    **Contact Me Today**
    Big Data
    MERN Stack
    MongoDB
    Full-Stack Development
    HTML
    JavaScript
    CSS
    Analytics
    Node.js
    ExpressJS
    ActionScript
    Redux
    React Native
    React
    Python
  • US$90 hourly
    With 19 years of experience, I have honed my skills in developing robust Excel and Access programs using VBA. Throughout my career, I have successfully delivered tailored solutions to a diverse range of clients, including individuals and Fortune 500 companies. My expertise lies in creating sophisticated, user-friendly, and intuitive Excel and Access programs. I have a proven track record of automating tedious and complex processes, empowering businesses and individuals to enhance their productivity. Whether it involves building programs from scratch or expanding upon existing foundations, I am passionate about leveraging automation to streamline workflows and maximize efficiency. I am actively seeking opportunities to apply my expertise and create customized solutions that address specific needs. Let's collaborate to transform your Excel and Access processes into powerful and user-friendly automated systems, enabling you and your business to achieve higher levels of productivity.
    Big Data
    Microsoft Access
    Microsoft Access Programming
    Spreadsheet Software
    Visual Basic
    Microsoft Excel
    Macro Programming
    Business Process Automation
  • US$40 hourly
    I am an experienced and proficient web scraper using Python to obtain very large amounts of data from a variety of online sources. I do fixed-price work and have successfully pulled data from hundreds of sites, with examples including business locations, directories, public information, IMDb movie info, sports-reference stats, music charts, Forbes company rankings/info, ESPN player pages, Google search results, and hundreds of other queries across all genres. You can see some of my results via the data sets used on my big data quiz site, hugequiz.com. I have been able to retrieve data from articles, tables, and lists, recursively via search results, from sites with AJAX/JavaScript, and even when authentication is required (a minimal scraping sketch follows the skill list below). For any project, I can discuss and preview the site(s) that need to be scraped in order to provide the output you are looking for in CSV or another format.
    Big Data
    Data Scraping
    Python
    Microsoft Excel
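For illustration only: a minimal Python sketch of pulling an HTML table into CSV, the simplest case of the scraping work described above. The URL and table layout are hypothetical placeholders; JavaScript-heavy or authenticated sites would need Selenium or session handling instead.

```python
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/rankings"  # placeholder target page
HEADERS = {"User-Agent": "Mozilla/5.0 (data-collection script)"}

resp = requests.get(URL, headers=HEADERS, timeout=30)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

rows = []
table = soup.find("table")                    # first HTML table on the page
if table:
    for tr in table.find_all("tr")[1:]:       # skip the header row
        cells = [td.get_text(strip=True) for td in tr.find_all(["td", "th"])]
        if cells:
            rows.append(cells)

# Write the extracted rows to CSV for delivery
with open("output.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(rows)

print(f"Wrote {len(rows)} rows to output.csv")
```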
  • US$55 hourly
    I have been an entrepreneur and consultant for over 18 years, with expertise in the following platforms:
    Analytics: GA4, Mixpanel, Amplitude, Heap Analytics
    CDP / Pipelines: Segment, RudderStack, mParticle
    Visualization: Looker Studio, Looker Enterprise
    I help digital products set up comprehensive analytics systems. I work closely with teams and map their KPIs onto these tools to make them most effective. I have created a complete analytics suite that can easily scale to over 10 billion events per month.
    If you have goals like:
    - Analytics implementation
    - GA4 migration/setup
    - Conversion analysis
    - Retention analysis
    - Attribution analysis
    - Feature adoption and friction analysis
    - Goal events for users, like attaching a card
    - New feature release impact
    - A/B testing
    I can help with:
    - Analytics integration
    - KPI definitions and data tracking
    - CDP implementation with Segment and RudderStack
    - Visualization with tools like Looker, Looker Data Studio, Klipfolio, etc.
    - Connectors like Stitch and Fivetran
    - Setting up reports in Amplitude, Mixpanel, Fibotalk, and GA4
    - Implementation architecture
    Big Data
    Marketing Analytics
    Amplitude
    Product Analytics
    Data Extraction
    Data Segmentation
    Google Analytics API
    ETL
    Mixpanel
    Google Analytics
  • US$35 hourly
    "I shall either find a way or make one."
    I'm an engineer in Operational Research with the skills and expertise to deliver projects and missions successfully. I have advanced knowledge of:
    VBA Excel
    Optimization
    Queuing theory
    Linear programming
    Economic modeling
    Data mining: factor analysis, principal component analysis, regression (simple, multiple, logistic, hierarchical, Poisson), ANOVA, clustering, hierarchical clustering...
    I am proficient with Lingo, AMPL, SPSS, Stata, R, and EViews, and I'll provide polished reporting using MS Word. I also have extensive experience creating software for scraping/extracting data from websites with VB.NET.
    Big Data
    Marketing Analytics
    Machine Learning
    API Integration
    Django
    Stata
    R Shiny
    Statistics
    Statistical Programming
    Data Scraping
    Python
    R
    pandas
    Data Mining
  • US$50 hourly
    Hi! 👋🏻 I'm a passionate AI engineer who is eager to discover new technologies.
    - I've used Keras and TensorFlow extensively in many projects, including GANs, CNNs, RNNs, QNNs, LSTMs, and autoencoders.
    - I've worked with NLP techniques such as word embeddings, BOW, POS tagging, sentiment analysis, regex, and NER, with libraries including GloVe, Word2Vec, spaCy, and Beautiful Soup.
    - I've also worked with computer vision techniques such as image segmentation, classification, and enhancement with the OpenCV and Pillow libraries.
    Finally, I enjoy working in creative environments with realistic pressure conditions.
    Big Data
    Cloud Computing
    Image Processing
    Machine Learning
    Deep Learning Modeling
    Model Optimization
    Computer Vision
    MATLAB
    Natural Language Processing
    C++
    TensorFlow
    Keras
    Deep Learning
    Python
  • US$28 hourly
    I am an experienced Elasticsearch engineer / ELK admin with 5 years of experience working with the ELK Stack (Elasticsearch, Kibana, Logstash) plus Beats and Enterprise Search on Linux-based systems. My expertise includes designing, building, and maintaining scalable and high-performing Elasticsearch clusters for a variety of use cases, including search, analytics, and data visualization. (A minimal query example follows the skill list below.)
    Elasticsearch:
    - Capacity planning (master nodes, data nodes, ...)
    - Scaling Elasticsearch horizontally and vertically
    - Performance optimization: memory, CPU, disk, and network
    - High availability and fault tolerance
    - Backup and restore (shared file system, AWS S3)
    - Security: authentication, authorization, and encryption
    - Monitoring and logging
    - Cluster management
    - Rolling upgrades and updates
    - ILM lifecycle (hot, warm, cold, delete phases)
    - Scaling down resources
    - Searching and querying in Elasticsearch
    - Aggregations: grouping, sorting, and analyzing data
    - Geo and spatial search
    - Big data analytics
    - Real-time analytics
    - Configuring OpenSearch
    - Graylog
    - Kafka/RabbitMQ
    Logstash:
    - Transferring JSON files
    - Transferring CSV files
    - Transferring data from MySQL, MongoDB, SQL Server
    - Configuring data pipelines from Kafka, RabbitMQ
    - Grok patterns
    Kibana:
    - Real-time reports
    - Kibana Lens
    Big Data
    Linux System Administration
    Logstash
    Visualization
    CentOS
    Amazon S3
    ELK Stack
    Article Writing
    Linux
    Elasticsearch
    Red Hat Enterprise Linux
    Technical Writing
    Ubuntu
    Kibana
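For illustration only: a minimal aggregation query against Elasticsearch's REST search API, the kind of analytics work listed above. The cluster address, index pattern, and field names are hypothetical placeholders, and a secured cluster would also require authentication.

```python
import requests

ES_URL = "http://localhost:9200"   # placeholder cluster address

# Count documents from the last 24 hours, grouped by status code
query = {
    "size": 0,  # aggregation results only, no hits
    "query": {"range": {"@timestamp": {"gte": "now-24h"}}},
    "aggs": {
        "by_status": {"terms": {"field": "status.keyword", "size": 10}}
    },
}

resp = requests.post(f"{ES_URL}/logs-*/_search", json=query, timeout=30)
resp.raise_for_status()

for bucket in resp.json()["aggregations"]["by_status"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```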
  • US$40 hourly
    🔍🚀 Welcome to a world of data-driven excellence! 🌐📊 Greetings, fellow professionals! I am thrilled to introduce myself as a dedicated Data Consultant / Engineer, leveraging years of honed expertise across a diverse spectrum of data stacks 🌍. My journey has been enriched by a wealth of experience, empowering me with a comprehensive skill set that spans Warehousing📦, ETL⚙, Analytics📈, and Cloud Services☁. Having earned the esteemed title of GCP Certified Professional Data Engineer 🛠, I am your partner in navigating the complex data landscape. My mission is to unearth actionable insights from raw data, shaping it into a strategic asset that fuels growth and innovation. With a deep-rooted passion for transforming data into valuable solutions, I am committed to crafting intelligent strategies that empower businesses to flourish. Let's embark on a collaborative journey to unlock the full potential of your data. Whether it's architecting robust data pipelines ⛓, optimizing storage solutions 🗃, or designing analytics frameworks 📊, I am dedicated to delivering excellence that transcends expectations. Reach out to me, and together, let's sculpt a future where data powers success. Thanks!
    Big Data
    PySpark
    Machine Learning
    Natural Language Processing
    Informatica
    Data Science
    Data Warehousing
    Snowflake
    Data Analysis
    BigQuery
    ETL
    Apache Airflow
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Python
    Apache Hive
  • US$60 hourly
    Experienced in Golang, Kubernetes, distributed system design, AWS, GCP, and cloud platforms. My specialisation: designing, developing, and implementing high-performance scalable software, and applying cloud (AWS, GCP) skills for deployment. I'm a Software Engineer with over 5 years of hands-on expertise in developing software from scratch and advancing legacy systems using Golang. During that time I've participated in several projects of various sizes and complexity and worked with a lot of cloud-native tooling, which gave me experience in different areas of software and infrastructure architecture and development. My duties have included requirements analysis and technical consultancy, architecture and performance improvement, detecting and solving issues, developing new features and improving existing ones, and client support. I have experience and a strong understanding of OOP concepts, OO design patterns, and microservices architecture.
    My skills: Golang, Docker, Kubernetes, Helm, InfluxDB, Elasticsearch, ScyllaDB, PostgreSQL, MongoDB, AWS cloud services, Kafka, EMQ, RabbitMQ, Serverless, HTTP, gRPC, Grafana, CI/CD, Chef, Terraform, FaaS.
    Skills at a glance
    ⦁ Language -- Go / JavaScript / Node
    ⦁ Dev Framework -- Gin / Echo / Beego / Iris
    ⦁ Test Framework -- Gotest / Mocha / Jest
    ⦁ Database -- Cassandra / ScyllaDB / InfluxDB / Elasticsearch / MongoDB / Redis / PostgreSQL / MySQL
    ⦁ Coverage Tool -- Testify / GoCover
    ⦁ Visualization Tools -- Kibana / Grafana / Chronograf
    ⦁ Message Queues -- Kafka / EMQX / RabbitMQ
    ⦁ Logging Tools -- Logstash / Fluentd / StatsD / Sentry
    ⦁ Orchestrators -- Docker / Kubernetes / Helm / Docker Swarm
    ⦁ Continuous Integration -- GitLab CI / Travis CI / Circle CI / Drone CI
    ⦁ Monitoring Tools -- Prometheus / Grafana / ELK Stack
    ⦁ IaaS/PaaS -- AWS / GCP / Google Pub/Sub
    ⦁ Favorite Text Editor -- VS Code ♥
    ⦁ Emerging Tech -- Smart contracts, Rust
    Big Data
    Database Architecture
    Azure DevOps
    Amazon Web Services
    Database Administration
    Software Architecture & Design
    Google Cloud Platform
    DevOps
    React
    Data Analytics
    Kubernetes
    Docker
    Node.js
    Blockchain
    Golang
    Smart Contract
  • US$100 hourly
    - 5+ years of DevOps/Cloud experience
    - Over 40 satisfied clients with a 97% job success rate and Top Rated Plus on Upwork
    - Certified Kubernetes Application Developer and AWS Certified Developer
    - Certified Google Cloud Architect
    - Lead DevOps/Cloud Engineer
    - 7+ years of experience in IT
    - Conduct public webinars and workshops
    An enthusiastic DevOps and Cloud Engineer with extensive experience in containerization, continuous integration, continuous deployment, infrastructure as code, and monitoring solutions. I am a Certified Kubernetes Application Developer and a Certified AWS Developer Associate with experience implementing automation in different phases of the software development life cycle. I also have experience with cloud infrastructure and deploying applications to on-premise and cloud servers, and I have been conducting public webinars and workshops.
    I work with the following technologies:
    Clouds: AWS, GCP, Azure, and DigitalOcean
    Containerization: Docker, Kubernetes, docker-compose, Docker Swarm, GKE, AKS, EKS, Helm, OpenShift
    CI/CD: GitHub Actions, GitLab CI, Bitbucket Pipelines, Tekton, ArgoCD, CircleCI, Jenkins
    Infrastructure as Code: Terraform, CloudFormation
    Programming/Scripting Languages: Golang, Python, Bash, Node.js
    Monitoring & Logging: Prometheus, Grafana, ELK Stack, cloud logging, CloudWatch logging, Kibana
    I also possess key soft skills such as communication, self-learning, self-management, and leadership, which are crucial for any role. I am fluent in English and easy to get along with. For more details, you are always welcome to contact me.
    Big Data
    Cloud Computing
    DevOps
    Google Cloud Platform
    Kubernetes
    Jenkins
    Node.js
    CI/CD
    Docker
    HTML5
    React
    CSS 3
  • US$25 hourly
    With 6+ years of experience in AWS cloud architecture, full-stack development, data engineering, and automation scripting, I specialize in designing, deploying, and optimizing high-performance, scalable solutions. My deep technical skills span the AWS ecosystem, Python backend frameworks, JavaScript, and advanced automation strategies tailored to complex business requirements.
    AWS Cloud Architecture & Engineering:
    • AWS Certified Developer - Associate
    • Extensive experience in architecting solutions with AWS services: AWS Lambda, API Gateway, Amazon ECS, AWS Glue, Elastic Beanstalk, Amazon EMR, AWS CloudFormation, AWS Step Functions, Amazon S3, SQS, SNS, Kinesis, and Amazon Athena
    • Design and implementation of Data Lake architectures and Data Warehousing solutions using AWS Lake Formation, Redshift, and Glue Catalog
    • Advanced ETL orchestration and pipeline automation, optimizing data flow and performance at scale
    • AWS consultancy with a focus on cost optimization, security best practices, and infrastructure automation
    Backend Engineering:
    • Expertise in Python frameworks: Django, Flask, FastAPI, and Girder for microservices and REST API development
    • Proficient with AWS SDKs, particularly Boto3, and libraries such as NumPy, PyTorch, Scikit-Learn, and Pandas for data processing and ML integration
    • Implementation of asynchronous programming, API versioning, and middleware design for robust and secure backend systems
    DevOps & Infrastructure as Code:
    • Proficient in CI/CD pipelines with AWS CodePipeline, Jenkins, ArgoCD, and GitHub Actions
    • Expertise in Infrastructure as Code (IaC) with Terraform and AWS CloudFormation, automating environment provisioning and management
    • Configuration management and automated deployments ensuring high availability and minimal downtime
    Frontend Development:
    • Advanced skills in JavaScript and frameworks like React, utilizing MUI, CSS, Bootstrap, and HTML for building responsive, component-driven UI architectures
    • Proficient in modern JavaScript (ES6+) for client-side logic, DOM manipulation, and asynchronous data handling with Promises and async/await
    • Experience in state management with Redux and Context API, and integrating RESTful APIs for seamless frontend-backend communication
    Automation & Web Scraping:
    • Advanced automation skills using Scrapy, Selenium, Puppeteer, and Splash for web data extraction
    • Expertise in headless browsing, JavaScript rendering, CAPTCHA bypass techniques, and dynamic proxy rotation for reliable scraping solutions
    • Development of robust, maintainable scripts for automated data collection and integration into ETL pipelines
    I am passionate about solving complex technical challenges and leveraging cloud-native technologies to deliver impactful solutions. Let's collaborate to bring your projects to life with precision and technical excellence!
    Big Data
    Data Scraping
    ETL Pipeline
    PostgreSQL
    Django Stack
    DevOps
    FastAPI
    Django
    Serverless Stack
    Data Engineering
    React
    CI/CD
    Amazon Web Services
    Python
  • US$40 hourly
    I am an enthusiastic data engineer with a can-do approach to all data-related challenges, so I am always learning new things related to data technologies and developments. As a data engineer I have created many advanced Python scripts, written many advanced SQL queries, and built both batch and streaming data pipelines (a minimal streaming-pipeline sketch follows the skill list below).
    Tech stack:
    Languages: Python, SQL
    Big Data Tools/Frameworks: Apache Spark, Apache Hadoop, Apache Kafka
    Databases: MySQL, PostgreSQL
    Data Warehouses: Snowflake, BigQuery, Redshift
    Cloud Services: AWS, GCP
    Data Visualization: Looker, Metabase
    Data Orchestration: Apache Airflow
    Containerization: Docker, Kubernetes
    Data Transformation: dbt
    NoSQL: Cassandra, MongoDB
    Data Search: Elastic Stack
    CI/CD: GitHub, GitLab, Jenkins
    You may look at the projects in my Portfolio section for details, and at my profiles: dogukanulu.dev, github.com/dogukannulu, medium.com/@dogukannulu
    Big Data
    PostgreSQL
    Apache Cassandra
    Google Cloud Platform
    Looker
    Docker
    Elasticsearch
    pandas
    Apache Kafka
    Apache Airflow
    dbt
    Snowflake
    Apache Spark
    Amazon Web Services
    SQL
    Python
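For illustration only: a minimal Spark Structured Streaming sketch of the kind of streaming pipeline mentioned above, reading JSON events from Kafka and aggregating them in event-time windows. The broker address, topic, and schema are hypothetical placeholders, running it requires the spark-sql-kafka connector package, and a real pipeline would write to a warehouse or data lake rather than the console.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Hypothetical JSON schema of the Kafka messages
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a stream of JSON events from Kafka (broker and topic are placeholders)
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")
         .option("subscribe", "orders")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Windowed aggregation: revenue per 5-minute event-time window, with late-data watermark
revenue = (
    events.withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "5 minutes"))
          .agg(F.sum("amount").alias("revenue"))
)

# Write to the console for demonstration purposes only
query = revenue.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```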
  • US$40 hourly
    I am a Sr. Big Data Engineer with 15 years of experience designing, developing, and implementing data analytics applications. For the last 10 years I have been working on Big Data platforms using Spark with Scala/Java, Python, Hive, Pig, Sqoop, Oozie, etc., in AWS and Cloudera environments. I have expert-level proficiency in creating data pipelines using various ETL methodologies, very good experience in writing Unix shell scripts and Python scripts, and strong skills in designing database and data warehouse models and in SQL/PL-SQL programming.
    Big Data
    Apache Kafka
    Cloud Computing
    ETL Pipeline
    Apache Spark
    Oracle Applications
    RESTful API
    SQL Programming
    Java
    Unix Shell
    Python
    SQL
    Scala
  • Want to browse more freelancers?
    Sign up

How it works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire a Big Data Engineer on Upwork?

You can hire a Big Data Engineer on Upwork in four simple steps:

  • Create a job post tailored to your Big Data Engineer project scope. We’ll walk you through the process step by step.
  • Browse top Big Data Engineer talent on Upwork and invite them to your project.
  • Once the proposals start coming in, create a shortlist of top Big Data Engineer profiles and start to interview.
  • Hire the right Big Data Engineer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire a Big Data Engineer?

Rates charged by Big Data Engineers on Upwork can vary with a number of factors including experience, location and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire a Big Data Engineer on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Big Data Engineers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Big Data Engineer team you need to succeed.

Can I hire a Big Data Engineer within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Big Data Engineer proposals within 24 hours of posting a job description.

Schedule a call