Hire the best Big Data Engineers in Vietnam
Check out Big Data Engineers in Vietnam with the skills you need for your next job.
- $30 hourly
- 5.0/5
- (43 jobs)
✅ TOP Rated DevOps Engineer | ✅ 5-Star Reviews | ✅ 7+ Years of Experience
I'm a passionate AWS DevOps & AI-powered Data Engineer with hands-on experience architecting, automating, and optimizing mission-critical cloud solutions. I specialize in serverless AI integration using AWS Bedrock and designing intelligent workflows that scale.
I have experience in the following areas, tools, and technologies:
🧠 AI & Generative AI with AWS Bedrock
► Integrated LLMs (Claude, Mistral, Titan) into serverless applications using AWS Bedrock + Lambda + Step Functions
► Designed prompt orchestration pipelines for data enrichment, customer support, and automated insights
► Built secure, scalable AI workflows for multi-tenant environments
☁️ CLOUD INFRASTRUCTURE
► AWS: EC2, S3, RDS, ECS, EKS, ECR, Lambda, IAM, VPC, Route53, CloudTrail, KMS, CloudWatch, Redshift, DynamoDB, Glue, Athena, Step Functions, Bedrock
► Azure: Data Factory, Blob Storage, Cosmos DB, Azure DevOps
🗄️ DATABASES
► SQL | NoSQL | MySQL | PostgreSQL | SQL Server | MongoDB | Redis | DynamoDB | ElastiCache
🚀 CI/CD & AUTOMATION
► Jenkins | AWS CodePipeline, CodeBuild, CodeDeploy | GitHub Actions | Terraform | CloudFormation
🔐 IDENTITY & SECURITY
► AWS IAM Identity Center (SSO) | AWS Cognito | Okta | AWS WAF | Network Firewall
🧩 BIG DATA & DATA ENGINEERING
► Apache Spark | Hadoop | EMR | Kinesis | Data Lakes | Glue ETL | Airflow | Athena
📊 BI & MONITORING
► Power BI | Tableau | Superset | Grafana | SSMS
🛠️ TOOLS & LANGUAGES
► Python | Docker | Kubernetes | Bash | Ansible | REST APIs | JSON | YAML
🧪 Some of my major projects include:
► Built AI-powered data enrichment workflows using AWS Bedrock + Lambda
► Architected fully automated CI/CD pipelines for Dockerized apps on ECS
► Administered infrastructure for enterprise customers with $30k+ AWS spend
► Designed scalable Data Lake solutions with EMR + Glue + Athena
► Implemented centralized identity access via AWS SSO (IAM Identity Center)
► Automated AWS Route53, DNS, and Glue deployment for 50+ microservices
📜 CERTIFIED
► AWS Certified Solutions Architect (2018)
► AWS Certified Data Analytics – Specialty (2021)
► AWS Certified Security – Specialty (2023)
► Microsoft Certified: Azure Administrator Associate (2023)
► Cisco Certified Network Associate (2013)
💬 I bring not just technical skills but a commitment to precision, responsiveness, and clarity. Whether you're looking to launch a new AI feature, optimize your DevOps pipelines, or need a reliable Linux system engineer to manage your infrastructure, I'm ready to help.
Let's make your infrastructure smarter, faster, and future-ready. Available for long-term or short-term contracts!
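For context on the Bedrock + Lambda pattern this profile describes, here is a minimal, hypothetical sketch (not the freelancer's actual code): a Lambda handler that calls a Claude model on Amazon Bedrock to enrich a record. The model ID, event shape, and prompt are assumptions.

```python
# Minimal sketch of the AWS Bedrock + Lambda pattern described above.
# Model ID, event shape, and prompt wording are illustrative assumptions.
import json
import boto3

bedrock = boto3.client("bedrock-runtime")  # Bedrock runtime client

def handler(event, context):
    """Lambda entry point: enrich a record with an LLM-generated summary."""
    record = event.get("record", "")
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": f"Summarize this record:\n{record}"}
        ],
    })
    # invoke_model sends the prompt to the chosen foundation model
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        body=body,
    )
    payload = json.loads(response["body"].read())
    summary = payload["content"][0]["text"]
    return {"record": record, "summary": summary}
```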
Big Data · AWS CodePipeline · Amazon Elastic Beanstalk · DevOps Engineering · CI/CD · ETL · Cloud Architecture · Linux · Cisco Certified Network Associate · Cloud Management · Data Warehousing · System Administration · DevOps · Amazon Web Services · Python
- $20 hourly
- 5.0/5
- (6 jobs)
Problem solver, team player, language agnostic. Fueled by curiosity to explore, improve, and optimize things.
Big Data · Optimization Modeling · Artificial Intelligence · Machine Learning
- $35 hourly
- 5.0/5
- (5 jobs)
With over 8 years of expertise in both backend and frontend development, I'm enthusiastic about joining and learning from new and challenging projects. Below are previous jobs and projects in which I played a key contributing role.
Big Data · Docker · TypeScript · JavaScript · PHP · NodeJS Framework · Vue.js · React · Golang · Architectural Design · Full-Stack Development
- $40 hourly
- 5.0/5
- (80 jobs)
I have 10 years of working experience and I'm confident in my skills in these roles:
- Big data engineer
- Web developer
- Backend engineer
- DevOps
Most of my freelance projects are related to the Piwik framework, now the Matomo Open Analytics Platform.
As a big data engineer:
- I built many analytics systems able to serve millions of pageviews a month with Hadoop, Kafka, and Spark.
- I combined Redash and BigQuery to build reporting dashboards that enable the BI team to make important decisions for the business.
- I customized Piwik to go beyond web analytics: we can use it to track everything from user behavior to real-time monitoring and system performance.
- My tasks include:
  + Installing Piwik in the cloud.
  + Optimizing existing Piwik installations.
  + Debugging tracking problems.
  + Writing new plugins and widgets that display many kinds of charts and data.
  + Integrating the Piwik API with other websites to provide custom reports & dashboards.
As a backend engineer:
- I worked with Django REST framework running on Google App Engine.
- I'm able to use Datastore and ndb for financial transactions.
- I use Python and Go to build microservices.
- I gained deep knowledge of Python, Django, and Google App Engine.
As a web developer:
- I'm very excited to work with open source. I can develop systems based on large open-source projects like Piwik, Yii, Laravel, and Zend. It takes time to understand a platform, but after that I can do everything with it, including extending it, writing plugins, customizing the core, or even rewriting it my own way.
- My favorite PHP framework is Yii because it's fast and clean. I also worked with couponic and pinnect - uniprogy framework (based on Yii).
As a data engineer:
- Doing ETL for many projects.
- Handling heavy traffic: 40 million pageviews / 60 GB a day.
- Detecting fake clicks with a fuzzy-logic algorithm based on the Dempster-Shafer theory of evidence.
- Parsing logs into MySQL, PostgreSQL, Hadoop, and BigQuery.
- Web analytics for websites, e-commerce, online newspapers, and advertising networks.
- Batch & real-time data processing to produce report dashboards.
- Calculating credit scores.
As a DevOps engineer:
- I can manage Linux systems. Ubuntu is my favorite operating system. I can set up and maintain servers, whether they run on Debian, CentOS, Docker, Amazon EC2, or Google Cloud Platform.
- I use CircleCI/Jenkins to manage and run tests and perform continuous integration.
- I can write Linux shell scripts or Python scripts for daily tasks or background job processing on Linux servers.
- I Dockerize server-based services into microservices that can run on Kubernetes.
If you're interested, I can help with your projects and contribute to your success.
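To illustrate the Kafka + Spark pageview analytics this profile describes, here is a minimal PySpark Structured Streaming sketch that counts pageviews from a Kafka topic. The broker address, topic name, and event schema are assumptions, not details from the profile.

```python
# Sketch of a Kafka -> Spark pageview aggregation, as described above.
# Broker address, topic name, and JSON schema are illustrative assumptions.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("pageview-counts").getOrCreate()

schema = StructType([
    StructField("url", StringType()),
    StructField("visitor_id", StringType()),
    StructField("ts", TimestampType()),
])

# Read raw tracking events from Kafka
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "pageviews")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Count pageviews per URL in 1-minute tumbling windows
counts = (
    events.withWatermark("ts", "5 minutes")
    .groupBy(F.window("ts", "1 minute"), "url")
    .count()
)

# Write the running aggregation to the console (a real job would target a warehouse)
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```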
Big Data · Python Script · Piwik PRO · CI/CD · Linux System Administration · Google App Engine · Analytics · Google Cloud Platform · RESTful API · Python · PHP · Amazon Web Services · Django
- $25 hourly
- 5.0/5
- (10 jobs)
I would like to contribute my skills and knowledge to the project and grow into a leader or technical expert on my career path. With over 5 years of experience in web development across the front end, back end, and infrastructure, I'm proficient with frameworks such as Ruby on Rails, NodeJS, and ReactJS. Having broad knowledge of these technologies allows me to help clients decide on the most suitable solution for their ideas. Not only can I develop websites based on specs, but I also work closely with clients to help them define those specs. I can participate in all stages of web development, from clarifying requirements to deploying to end users.
Big Data · Vuex · Redux · React · Vue.js · Golang · MySQL · Ruby on Rails · JavaScript · PostgreSQL
- $15 hourly
- 5.0/5
- (1 job)
Skills:
- Backend: Python (Django, Flask, FastAPI), Ruby (Ruby on Rails), Go (Gin)
- Data: ETL (Airflow, Google Data Fusion, Google Dataflow), Pandas, Hadoop, Spark
- Frontend: ReactJS, VueJS
- Cloud: AWS, GCP, Azure
- Mobile: React Native
- Message Queue: Kafka, RabbitMQ, Redis
- Basics: HTML, CSS, JavaScript
As an adept software engineer, I'm able to complete tasks in the shortest time, with responsibility and patience.
Big Data · Ruby on Rails · API Development · NoSQL Database · Relational Database · Git · Web Development · React · Vue.js · Django · Python
- $7 hourly
- 5.0/5
- (5 jobs)
With over a decade of experience in finance management, I am a professional specializing in cost analysis, big data management, pricing, and financial modeling. My expertise includes:
1. Financial Analysis
+ Conducting deep revenue analysis (segmentation, retention rate, customer lifetime value, ...) to prioritize the metrics with the most effect on growth
+ Cost-driver analysis: analyzing each cost line to identify the drivers for cost optimization and financial modeling
+ Financial statements: a strong understanding of the correlations between the balance sheet, income statement, and cash-flow report, applied to forecasting and evaluating a company's financial health
2. Microsoft Excel
+ Building complex financial models for different purposes (fundraising, pricing, cost management, ...)
+ Working with big data to generate reports, dashboards, and pivot tables in Excel (accounting data, labor cost, sales funnel, ...)
3. Big Data Management (BigQuery, Looker/Data Studio)
+ Managing big data (10-20 million records per month) for analysis (forecasting, real-time dashboards, cost optimization, ...)
+ Consolidating data from different sources (Excel, Google Sheets, PDF, BigQuery, ...) into one data warehouse and visualizing it for different end users (management team, investors, customers, ...)
I would welcome the opportunity to discuss how my experience and skills can contribute to your work.
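As a rough sketch of the kind of BigQuery consolidation described above (the project, dataset, table, and file names are hypothetical, not taken from the profile), a monthly export can be loaded and queried with the google-cloud-bigquery client:

```python
# Hypothetical sketch: consolidate a spreadsheet export into BigQuery and query it.
# Project, dataset, table, and file names are placeholders.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()  # uses default credentials

# Load a monthly cost export (e.g. from Excel/Google Sheets) into a warehouse table
costs = pd.read_csv("labor_costs_2024_06.csv")
client.load_table_from_dataframe(
    costs, "my-project.finance.labor_costs"
).result()  # wait for the load job to finish

# Ad hoc analysis: cost per department for the month
sql = """
    SELECT department, SUM(amount) AS total_cost
    FROM `my-project.finance.labor_costs`
    GROUP BY department
    ORDER BY total_cost DESC
"""
for row in client.query(sql).result():
    print(row.department, row.total_cost)
```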
Big Data · Accounting · Data Entry · Microsoft Excel · Pricing · Finance · Cost Analysis
- $15 hourly
- 0.0/5
- (2 jobs)
I am a full-stack developer eager to become a Solution Architect, and a talented leader. I possess good communication, teamwork, and situational-suggestion skills. I also have experience in web development, data mining, and databases. I consider myself someone who adapts quickly to new working environments and new technologies. But most importantly, I'm a hilarious person. For more information, DM me.
Big Data · Nuxt.js · Amazon Web Services · English · Data Management · Data Mining · Apache Storm · Translation · Django · Kubernetes · Apache Kafka · Apache Airflow · Python
- $10 hourly
- 0.0/5
- (0 jobs)
SUMMARY
I'm strong in analyzing and interpreting big data, with visualization skills that turn it into helpful, insightful information for finding and solving problems. An analytical, careful, and disciplined person with more than 4 years of experience working in quality assurance / quality analysis / data analysis.
Big Data · Operational Planning · Manufacturing · Quality Assurance · Test Results & Analysis · QA Testing · Software QA · Testing · Business Management · Management Skills
- $30 hourly
- 0.0/5
- (0 jobs)
I'm a developer experienced in building solutions for small and medium-sized businesses. Whether you're trying to win work, list your services, or create a new online store, I can help.
- Languages: Python, Java/JavaScript, TypeScript, C#, C/C++
- Technologies & frameworks: Spring/Spring Boot, Java EE, Jakarta EE, React, Angular, Apache Kafka/Kafka Connect, Apache Flink, Confluent Platform, Cloudera Platform, OpenCV, TensorFlow, Keras, PyTorch, LLMs, ...
Big Data · JavaScript · TypeScript · Mobile App · Automation · MySQL · Oracle · Machine Learning · Web Application · Web Development · C# · Python · Java
- $15 hourly
- 0.0/5
- (1 job)
PROFILE
I am a skilled Data Engineer with experience managing large datasets in GCP's BigQuery. I excel in conducting ETL/ELT processes, automating workflows with Python, and supporting teams with insightful ad hoc queries. I've also designed complex data pipelines using tools like Spark, Kafka, and Airflow, collaborating effectively across global teams to drive data-driven decisions.
PERSONAL PROJECTS
* Designed a near-real-time ETL data pipeline analyzing logs from a recruitment platform. Raw log data from the platform is loaded into Cassandra. Spark then extracts data from Cassandra and MySQL, processes the logs, and pushes the final data to a MySQL data warehouse through Kafka. Finally, Grafana is connected to MySQL to produce real-time dashboards. The whole pipeline is orchestrated using Airflow. GitHub: click here
* Big data processing: customer behavior. This project was created to discover customers' search behavior in the first two weeks of June and ...
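The personal project above is orchestrated with Airflow; a minimal DAG sketch under assumed task names and job scripts (not the actual repository code) might look like this:

```python
# Minimal Airflow DAG sketch for the pipeline described above
# (task names, schedule, and spark-submit commands are assumptions).
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="recruitment_log_etl",
    start_date=datetime(2024, 1, 1),
    schedule="*/10 * * * *",   # near-real-time: run every 10 minutes
    catchup=False,
) as dag:
    # Spark job: read raw logs from Cassandra and join with MySQL reference data
    process_logs = BashOperator(
        task_id="process_logs",
        bash_command="spark-submit jobs/process_logs.py",
    )
    # Spark job: publish processed records to Kafka for the warehouse loader
    publish_to_kafka = BashOperator(
        task_id="publish_to_kafka",
        bash_command="spark-submit jobs/publish_to_kafka.py",
    )
    # Loader: consume from Kafka and upsert into the MySQL data warehouse
    load_warehouse = BashOperator(
        task_id="load_warehouse",
        bash_command="python jobs/load_warehouse.py",
    )

    process_logs >> publish_to_kafka >> load_warehouse
```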
Big Data · Looker Studio · BigQuery · OpenAI API · Google Cloud Platform · Python · SQL · Data Modeling · Data Engineering · Data Analytics · Data Extraction · Data Mining · ETL Pipeline · ETL
- $13 hourly
- 0.0/5
- (0 jobs)
System Programmer | C/C++ Expert | Cross-Platform & Embedded Developer
Hi! I'm a system programmer with a strong focus on performance, portability, and clean design. I specialize in ANSI C and modern C++ (C89 to C++20), building software that runs reliably across Windows, Linux, macOS, Android, and embedded systems, without any external dependencies.
I'm the creator of SimpleLog-Challenge/SimpleLog-Topic, an open-source, ultra-fast, multi-threaded and multi-process logging library that:
- Processes over 1 million logs/second
- Outperforms spdlog by 2x–4x in real-world benchmarks
- Supports log levels, rotation, topics, and nanosecond precision
- Works across desktops, servers, and embedded platforms
- Is fully compatible from ANSI C89 to modern C++20
You can explore my work on GitHub.
🛠️ Core Skills:
- System-level programming (POSIX, Win32, termios)
- Cross-platform C/C++ (C89 – C++20)
- Multithreading & multiprocessing
- High-performance logging & diagnostics
- Serial port & protocol communication
- Embedded & bare-metal software
- Clean API design & Unix-style development
🛠️ With Yocto: two of my libraries were accepted into the Yocto Project's meta-openembedded layer:
a. simplelog-topic
b. libserialmodule
Big Data · Mathematics · Bash Programming · Bash · C++ · Computing & Networking · Algorithm Development · Wireshark · TCP · System Programming · POSIX · Linux · C · Architectural Design · Computer Network
- $25 hourly
- 0.0/5
- (0 jobs)
I am a dedicated and detail-oriented Data Engineer with over 1 year of professional experience working on real-world data systems. I specialize in Databases, Data Warehousing, and Big Data processing, particularly with Apache Spark. My expertise includes:
✔️ Designing and managing relational databases
✔️ Developing ETL pipelines to transform raw data into usable insights
✔️ Implementing star schema models with fact and dimension tables
✔️ Processing large-scale data using PySpark
✔️ Writing optimized SQL queries and managing database performance
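A brief PySpark sketch of the star-schema modeling mentioned above, with hypothetical table paths, column names, and surrogate keys (an illustration of the technique, not this freelancer's code):

```python
# Sketch of building a star-schema fact table with PySpark.
# Paths, column names, and the surrogate-key layout are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-etl").getOrCreate()

orders = spark.read.parquet("s3://lake/raw/orders/")            # raw source data
dim_customer = spark.read.parquet("s3://lake/dw/dim_customer/")  # customer dimension
dim_date = spark.read.parquet("s3://lake/dw/dim_date/")          # date dimension

# Fact table: one row per order line, keyed by dimension surrogate keys
fact_sales = (
    orders
    .join(dim_customer, orders.customer_id == dim_customer.customer_nk, "left")
    .join(dim_date, F.to_date(orders.order_ts) == dim_date.calendar_date, "left")
    .select(
        dim_customer.customer_sk,
        dim_date.date_sk,
        orders.product_id,
        orders.quantity,
        (orders.quantity * orders.unit_price).alias("revenue"),
    )
)

fact_sales.write.mode("overwrite").parquet("s3://lake/dw/fact_sales/")
```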
Big Data · Database Management · Data Modeling · ETL Pipeline · PySpark · Neo4j · SQL · Database · Data Engineering
- $7 hourly
- 0.0/5
- (0 jobs)
I want to apply my knowledge and experience in data analysis to develop data products, while continuously updating my knowledge and accumulating more experience in data analysis to improve my expertise and contribute to the business.
Big Data · ChatGPT · PostgreSQL · PostGIS · MongoDB · English · Remote Sensing · ArcGIS · QGIS · GIS · Microsoft Power BI · SQL · Python · Problem Solving · Data Analysis
- $5 hourly
- 0.0/5
- (0 jobs)
As a hard worker, I always want to develop myself. I'm new to the Upwork platform, but I won't let you down with my skills and passion. I aim to work for a client who offers a promising career by making use of my potential abilities to an optimum level in a professional environment. I wish to play a constructive role, not only individually but also as a team member, to achieve total client satisfaction. I'm trying to shift my focus entirely to freelancing, and I'm looking for opportunities to use my experience in this field and my knowledge of the internet and computing in general. Almost any kind of data mining & entry is welcome, and I will do my best to finish the job accurately, on time, and as cost-effectively as possible for you as a potential employer. -Thanks-
Big Data · Problem Resolution · Database · Transaction Data Entry · Data Collection · Data Extraction · Data Mining · Problem Solving · Online Research · Data Cleaning · List Building · Microsoft Excel
- $12 hourly
- 0.0/5
- (0 jobs)
I'm a Data Engineer.
- Programming & Querying: Python, SQL (PostgreSQL, MySQL)
- Big Data & Streaming: Apache Spark, Apache Kafka
- Workflow Orchestration: Apache Airflow
- Cloud & DevOps: Azure, Docker
✅ What I Bring:
- End-to-end data pipeline development
- Real-time and batch processing
Big Data · Data Lake · Data Warehousing · Python · Data Cleaning · Orchestration · Database · PostgreSQL · MySQL · Apache Airflow · Apache Kafka · ETL · ETL Pipeline · Data Extraction · PySpark
Want to browse more freelancers?
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.