Hire the best Hadoop Developers & Programmers in Australia

Check out Hadoop Developers & Programmers in Australia with the skills you need for your next job.
  • $30 hourly
I’m a developer with experience in building websites for small and medium-sized businesses. Whether you’re trying to win work, list your services or even create a whole online store – I can help! I’m experienced in HTML, CSS 3, PHP, jQuery, WordPress and SEO. I’ll fully project manage your brief from start to finish. Regular communication is really important to me, so let’s keep in touch!
    Featured Skill Hadoop
    Project Management
    HTML5
    Linux
    Apache Hadoop
    CSS 3
    MySQL
    IT Support
    JavaScript
    Python
    Office 365
  • $150 hourly
Experienced data engineer specializing in high-performance data processing with Apache Spark, Python, and Databricks. I help businesses optimize their data pipelines, reduce processing costs, and accelerate insights delivery. My approach: I don’t just write code; I solve business problems. Every optimization I deliver focuses on reducing costs, improving reliability, and accelerating time-to-insights. I provide clear documentation and knowledge transfer so your team can maintain the solutions.
    Featured Skill Hadoop
    Data Extraction
    Git
    CI/CD
    Schema Diagram
    Data Migration
    Data Modeling
    Unix Shell
    Apache Hadoop
    Data Warehousing
    ETL Pipeline
    Databricks Platform
    Python
    SQL
    PySpark
    Apache Spark
  • $30 hourly
    I’m a Data Engineer with a decade of experience in the field. I have extensive expertise in database management, particularly with Oracle, MySQL, SQL Server, and DB2. My skill set includes designing and optimizing data warehouses, executing complex ETL processes, and performing data migrations with precision. I’m highly proficient in working with big data platforms, including HDFS, Spark, Hive, and Impala. I have a solid understanding of Oozie for workflow scheduling and Python for scripting and automating tasks. My focus is always on building scalable, efficient data pipelines that ensure data integrity and support robust analytics. I’m passionate about leveraging technology to solve complex data challenges and continuously exploring new tools and methodologies to stay ahead in the ever-evolving landscape of data engineering.
    Featured Skill Hadoop
    Unix Shell
    Big Data
    Data Modeling
    Apache Hadoop
    PySpark
    Data Engineering
    Data Migration
    Data Warehousing & ETL Software
    Python
    Data Analysis
    Data Extraction
    ETL

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.