Hire the best Hadoop Developers & Programmers in Texas

Check out Hadoop Developers & Programmers in Texas with the skills you need for your next job.
  • $175 hourly
Mr. Joshua B. Seagroves is a seasoned professional who has served as an Enterprise Architect/Senior Data Engineer for multiple Fortune 100 companies. With a successful track record as a startup founder and CTO, Mr. Seagroves brings a wealth of experience to his role, specializing in the strategic design, development, and implementation of advanced technology systems. Throughout his career, he has demonstrated expertise in architecting and delivering cutting-edge solutions, particularly in data engineering and data science, and has successfully spearheaded the implementation of such systems and applications for a diverse range of clients. As part of his current responsibilities, he actively contributes to prototyping and research efforts in data engineering and data science, specifically the development of operational systems for critical mission systems. Leveraging his extensive background in architecture and software modeling methodologies, he has consistently led and collaborated with multidisciplinary teams, successfully integrating distributed computing technologies including Hadoop, NiFi, HBase, Accumulo, and MongoDB. Mr. Seagroves' professional achievements, comprehensive knowledge, and hands-on expertise in advanced technology systems and big data make him a valuable asset to any organization.
    YARN
    Apache Hadoop
    Big Data
    Apache Zookeeper
    TensorFlow
    Apache Spark
    Apache NiFi
    Apache Kafka
    Artificial Neural Network
    Artificial Intelligence
  • $125 hourly
🏆 Achieved Top-Rated Freelancer status (Top 10%) with a proven track record of success. Past experience: Twitter, Spotify, & PwC. I am a certified data engineer & software developer with 5+ years of experience, familiar with most major tech stacks in data science/engineering and app development. If you need support on your projects, please get in touch.
    Programming Languages: Python | Java | Scala | C++ | Rust | SQL | Bash
    Big Data: Airflow | Hadoop | MapReduce | Hive | Spark | Iceberg | Presto | Trino | Scio | Databricks
    Cloud: GCP | AWS | Azure | Cloudera
    Backend: Spring Boot | FastAPI | Flask
    AI/ML: PyTorch | ChatGPT | Kubeflow | ONNX | spaCy | Vertex AI
    Streaming: Apache Beam | Apache Flink | Apache Kafka | Spark Streaming
    SQL Databases: MSSQL | Postgres | MySQL | BigQuery | Snowflake | Redshift | Teradata
    NoSQL Databases: Bigtable | Cassandra | HBase | MongoDB | Elasticsearch
    DevOps: Terraform | Docker | Git | Kubernetes | Linux | GitHub Actions | Jenkins | GitLab
    Java
    Apache Hadoop
    Amazon Web Services
    Snowflake
    Microsoft Azure
    Google Cloud Platform
    Database Management
    Linux
    Apache Spark
    ETL
    API Integration
    Scala
    SQL
    Python
  • $99 hourly
With over 20 years of leadership in data storage, processing, and streaming technologies at multinational corporations such as Microsoft, IBM, Bloomberg, and Amazon, I am recognized as a Subject Matter Expert in these domains. My portfolio includes the successful design and deployment of large-scale, multi-tier projects utilizing a variety of programming languages (C, C++, C#, Python, Java, Ruby) and both SQL and NoSQL databases, often enhanced with caching solutions. My expertise extends to data streaming products such as Kafka (including Confluent and Apache Kafka), Kinesis, and RabbitMQ, tailored to specific project requirements and customer environments. My technical proficiency encompasses a wide range of databases and data processing technologies, including MS SQL, MySQL, Postgres, Comdb2, Cassandra, MongoDB, Hadoop, HDFS, Hive, Spark, and Snowflake. I am equally adept in Unix and Windows environments, skilled in both PowerShell and Bash scripting. As an AWS and Azure Solutions Architect, I have empowered numerous clients with comprehensive cloud solutions based on each client's needs. My notable projects on Upwork include:
    1. Migrating Arcbest's dispatch solution from mainframe to Linux servers with Confluent Kafka, improving processing times and reducing latencies.
    2. Conducting petabyte-scale big data analysis for Punchh using Snowflake, Kafka, Python, Ruby, AWS S3, and Redshift.
    3. Analyzing and comparing various Kafka-like solutions for an investment firm, focusing on adoption and maintenance costs.
    4. Implementing ETL solutions with CDC for continuous updates from IBM Maximo to Snowflake via Kafka, and from Oracle to Snowflake, integrating Power BI and Tableau for analytics.
    5. Deploying an IoT solution for a logistics firm using Particle and Pulsar devices, MQTT, Kinesis, Lambda, API Gateway, S3, Redshift, MySQL Aurora, and Power BI to monitor real-time delivery metrics, as well as post-delivery analysis of delivery performance such as spills, tilts, and bumps.
    6. Conducting data analysis for an advertising firm, benchmarking BigQuery and custom dashboards against Redshift with Tableau/QuickSight.
    SQL
    R Hadoop
    Amazon Web Services
    Snowflake
    Solution Architecture
    Apache Solr
    Ruby
    Apache Kafka
    Apache Hadoop
    Apache Cassandra
    Redis
    Python
    Java
    C++
    C#
  • $25 hourly
Fractional CFO | Experienced QuickBooks Pro Advisor | Accounting Software Expert | Data Migration. As a QuickBooks Pro Advisor with extensive experience managing over 200 clients, I specialize in transforming accounting processes and optimizing financial systems. My expertise spans a range of accounting software, including QuickBooks, Xero, Zoho, and Wave. I am proficient in data migration across different platforms, ensuring seamless transitions and accurate financial management. Throughout my career, I have successfully collaborated with diverse clients, providing tailored solutions that enhance financial performance and streamline accounting operations. My deep understanding of various accounting software allows me to offer customized advice, troubleshoot issues, and implement best practices for efficient financial management. As a freelancer, I am dedicated to delivering high-quality results within tight deadlines. My attention to detail, analytical skills, and problem-solving capabilities enable me to manage complex data migrations and software integrations with precision. Whether you need assistance with QuickBooks setup, software migration, or comprehensive financial management, I am equipped to handle your accounting needs with expertise and efficiency. Let’s connect to discuss how I can support your business with effective accounting solutions. Please contact me to schedule a call to discuss your project needs, timeline, and budget. Looking forward to collaborating with you.
    Data Analysis
    Big Data
    Artificial Intelligence
    Apache Hadoop
    Data Mining
    Hive
    Data Science
    Machine Learning
    SQL
    Tableau
    Python
    NLTK
  • $50 hourly
* Hands-on design and development experience with the Hadoop ecosystem (Hadoop, HBase, Pig, Hive, and MapReduce) and related Big Data technologies including Scala, Spark, Sqoop, Flume, Kafka, and Python, plus strong ETL and PostgreSQL experience.
    * Strong background in all aspects of software engineering, with skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
    * Experience with the Cloudera stack, Hortonworks, and Amazon EMR.
    * Strong experience using Excel, SQL, SAS, Python, and R to extract and analyze data based on business needs.
    * Strong experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.
    * Strong understanding and hands-on programming/scripting experience with UNIX shell.
    * An excellent team player and technically strong engineer.
    Amazon S3
    Data Warehousing & ETL Software
    Big Data
    Amazon Web Services
    Hive
    Data Science
    ETL
    Data Lake
    Data Cleaning
    Apache Hive
    Apache Hadoop
    Apache Spark
    Apache Kafka
    Data Migration
    ETL Pipeline
  • $75 hourly
Senior Data Engineer with 14 years of extensive experience in designing and implementing large-scale data pipelines and cloud-based solutions. Expertise spans AWS, Azure, and Snowflake, with a proven track record in building and optimizing data platforms. Proficient in Big Data technologies, including Apache Spark and Hadoop, with deep knowledge of ETL processes, data modeling, and performance tuning.
    CSV
    JSON
    ORC
    Apache Avro
    Parquet
    Apache Spark
    Apache Hadoop
    Big Data File Format
    Big Data
    PySpark
    Oracle
    Microsoft SQL Server
    Informatica Cloud
    Databricks Platform
    Snowflake
  • $75 hourly
Data Engineer with over 13 years of IT experience and 5+ years specializing in Big Data and cloud services. Expertise in Hadoop, Apache Spark, and AWS, with a strong machine learning and AI background. Skilled in executing cloud migrations, optimizing data workflows, and delivering measurable performance and cost-efficiency improvements. Adept at collaborating across teams to implement innovative solutions that address complex challenges.
    NoSQL Database
    Snowflake
    Databricks Platform
    SQL
    PySpark
    AWS Lambda
    AWS Glue
    AWS CodePipeline
    AWS CloudFormation
    Apache Kafka
    Apache Hadoop
    Data Analysis
    Machine Learning Model
    Machine Learning
    ETL Pipeline
  • $60 hourly
* 9 years of experience in creating, implementing, and upgrading data structures.
    * Expertise in ETL and ELT procedures with Apache Airflow, Informatica, and NiFi.
    * Knowledge of the Python, SQL, Java, and Scala programming languages.
    * Excellent knowledge of Spark, Hadoop, Kafka, and Hive for batch and real-time processing.
    * Proficient in AWS, Azure, and GCP, with a focus on scalable architecture.
    * Proficient in data modeling and warehousing tools such as Snowflake and BigQuery.
    * Experienced with cloud data warehousing using AWS S3, Glue, and Redshift, as well as cloud environment management with Azure services such as Active Directory, Data Factory, Key Vault, and Databricks.
    * Skilled in agile project management with Jira and Git, and dedicated to ongoing professional growth in cloud computing and big data.
    * Excellent analytical and communication abilities; a quick learner and team player, well-organized and self-motivated.
    PySpark
    GitHub
    Kubernetes
    Docker
    SQL
    Python
    AWS Glue
    Azure Blockchain Service
    Apache Hadoop
    ETL
  • $40 hourly
I am a highly skilled professional with a Master's in Business Analytics and expertise in Applied Natural Language Processing, Deep Learning, and Machine Learning. My experience includes developing conversational Q&A chatbots and text summarization web interfaces, integrating models from Hugging Face and OpenAI. I have also worked on OCR, text classification, and content tagging systems, demonstrating proficiency in PyTorch, TensorFlow, and AWS. I excel at optimizing image classification with different deep learning models, achieving exceptional accuracy and leveraging tools like W&B for experiment tracking.
    R Shiny
    Python
    PySpark
    Apache Hadoop
    Deep Learning Modeling
    Large Language Model
    Deep Learning
    Deep Neural Network
    Machine Learning
    Machine Learning Model
  • $20 hourly
Data Engineer with 10+ years of experience working for a leading US insurer. Extensive experience in Hadoop, PySpark, Snowflake, and dbt.
    PySpark
    Apache Hadoop
    dbt
    Snowflake
  • $18 hourly
Data engineer and data analyst skilled in SQL, Python, and Databricks, with strong problem-solving abilities.
    DevOps
    Problem Solving
    Databricks Platform
    Data Analysis
    Hive
    PySpark
    Python
    SQL
    Apache Hadoop
    Data Engineering
    Database

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

Trusted by 5M+ businesses