Hire the best Hadoop Developers & Programmers in Texas
Check out Hadoop Developers & Programmers in Texas with the skills you need for your next job.
- $175 hourly
- 5.0/5
- (4 jobs)
Mr. Joshua B. Seagroves is a seasoned professional who has served as an Enterprise Architect/Senior Data Engineer for multiple Fortune 100 companies. With a successful track record as a startup founder and CTO, he brings a wealth of experience to the strategic design, development, and implementation of advanced technology systems. Throughout his career, Mr. Seagroves has architected and delivered cutting-edge solutions, particularly in data engineering and data science, and has spearheaded the implementation of such systems and applications for a diverse range of clients. In his current role, he contributes to prototyping and research efforts in data engineering and data science, specifically the development of operational systems for critical mission systems. Leveraging his extensive background in architecture and software modeling methodologies, he has consistently led and collaborated with multidisciplinary teams, integrating distributed computing technologies including Hadoop, NiFi, HBase, Accumulo, and MongoDB. His comprehensive knowledge and hands-on expertise in advanced technology systems and big data make him a valuable asset to any organization.
Skills: Hadoop, YARN, Apache Hadoop, Big Data, Apache Zookeeper, TensorFlow, Apache Spark, Apache NiFi, Apache Kafka, Artificial Neural Network, Artificial Intelligence
- $125 hourly
- 4.8/5
- (14 jobs)
🏆 Achieved Top-Rated freelancer status (top 10%) with a proven track record of success. Past experience: Twitter, Spotify, and PwC. I am a certified data engineer and software developer with 5+ years of experience, familiar with most major tech stacks for data science/engineering and app development. If you need support on your projects, please get in touch.
Programming Languages: Python | Java | Scala | C++ | Rust | SQL | Bash
Big Data: Airflow | Hadoop | MapReduce | Hive | Spark | Iceberg | Presto | Trino | Scio | Databricks
Cloud: GCP | AWS | Azure | Cloudera
Backend: Spring Boot | FastAPI | Flask
AI/ML: PyTorch | ChatGPT | Kubeflow | ONNX | spaCy | Vertex AI
Streaming: Apache Beam | Apache Flink | Apache Kafka | Spark Streaming
SQL Databases: MSSQL | Postgres | MySQL | BigQuery | Snowflake | Redshift | Teradata
NoSQL Databases: Bigtable | Cassandra | HBase | MongoDB | Elasticsearch
DevOps: Terraform | Docker | Git | Kubernetes | Linux | GitHub Actions | Jenkins | GitLab
Skills: Hadoop, Java, Apache Hadoop, Amazon Web Services, Snowflake, Microsoft Azure, Google Cloud Platform, Database Management, Linux, Apache Spark, ETL, API Integration, Scala, SQL, Python
- $99 hourly
- 5.0/5
- (7 jobs)
With over 20 years of leadership in data storage, processing, and streaming technologies at multinational corporations such as Microsoft, IBM, Bloomberg, and Amazon, I am recognized as a Subject Matter Expert in these domains. My portfolio includes the successful design and deployment of large-scale, multi-tier projects using a variety of programming languages (C, C++, C#, Python, Java, Ruby) and both SQL and NoSQL databases, often enhanced with caching solutions. My expertise extends to data streaming products such as Kafka (both Confluent and Apache Kafka), Kinesis, and RabbitMQ, tailored to specific project requirements and customer environments. My technical proficiency covers a wide range of databases and data processing technologies, including MS SQL, MySQL, Postgres, Comdb2, Cassandra, MongoDB, Hadoop, HDFS, Hive, Spark, and Snowflake. I am equally adept in Unix and Windows environments, skilled in both PowerShell and Bash scripting. As an AWS and Azure Solutions Architect, I have empowered numerous clients with comprehensive cloud solutions tailored to their needs. My notable projects on Upwork include:
1. Migrating ArcBest's dispatch solution from mainframe to Linux servers with Confluent Kafka, improving processing times and reducing latencies.
2. Conducting petabyte-scale big data analysis for Punchh using Snowflake, Kafka, Python, Ruby, AWS S3, and Redshift.
3. Analyzing and comparing various Kafka-like solutions for an investment firm, focusing on adoption and maintenance costs.
4. Implementing ETL solutions with CDC for continuous updates from IBM Maximo to Snowflake via Kafka, and from Oracle to Snowflake, integrating Power BI and Tableau for analytics.
5. Deploying an IoT solution for a logistics firm using Particle and Pulsar devices, MQTT, Kinesis, Lambda, API Gateway, S3, Redshift, MySQL Aurora, and Power BI to monitor real-time delivery metrics and analyze post-delivery performance events such as spills, tilts, and bumps.
6. Conducting data analysis for an advertising firm, benchmarking BigQuery and custom dashboards against Redshift with Tableau/QuickSight.
Skills: Hadoop, SQL, R Hadoop, Amazon Web Services, Snowflake, Solution Architecture, Apache Solr, Ruby, Apache Kafka, Apache Hadoop, Apache Cassandra, Redis, Python, Java, C++, C#
- $25 hourly
- 5.0/5
- (1 job)
Fractional CFO | Experienced QuickBooks Pro Advisor | Accounting Software Expert | Data Migration
As a QuickBooks Pro Advisor with extensive experience managing over 200 clients, I specialize in transforming accounting processes and optimizing financial systems. My expertise spans a range of accounting software, including QuickBooks, Xero, Zoho, and Wave, and I am proficient in data migration across platforms, ensuring seamless transitions and accurate financial management. Throughout my career, I have collaborated with diverse clients, providing tailored solutions that enhance financial performance and streamline accounting operations. My deep understanding of various accounting software allows me to offer customized advice, troubleshoot issues, and implement best practices for efficient financial management. As a freelancer, I am dedicated to delivering high-quality results within tight deadlines. My attention to detail, analytical skills, and problem-solving capabilities enable me to manage complex data migrations and software integrations with precision. Whether you need assistance with QuickBooks setup, software migration, or comprehensive financial management, I am equipped to handle your accounting needs with expertise and efficiency. Let’s connect to discuss how I can support your business with effective accounting solutions. Please contact me to schedule a call to discuss your project needs, timeline, and budget. Looking forward to collaborating with you.
Skills: Hadoop, Data Analysis, Big Data, Artificial Intelligence, Apache Hadoop, Data Mining, Hive, Data Science, Machine Learning, SQL, Tableau, Python, NLTK
- $50 hourly
- 0.0/5
- (0 jobs)
Hands-on design and development experience with the Hadoop ecosystem (Hadoop, HBase, Pig, Hive, and MapReduce) and related Big Data technologies such as Scala, Spark, Sqoop, Flume, Kafka, and Python, along with strong ETL and PostgreSQL experience. Strong background in all aspects of software engineering, with skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
* Experience with the Cloudera stack, Hortonworks, and Amazon EMR.
* Strong experience using Excel, SQL, SAS, Python, and R to extract and analyze data based on business needs.
* Strong experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using ETL tools such as Ab Initio and Informatica PowerCenter.
* Strong hands-on programming and scripting skills, including UNIX shell.
* An excellent team player and technically strong professional.
Skills: Hadoop, Amazon S3, Data Warehousing & ETL Software, Big Data, Amazon Web Services, Hive, Data Science, ETL, Data Lake, Data Cleaning, Apache Hive, Apache Hadoop, Apache Spark, Apache Kafka, Data Migration, ETL Pipeline
- $75 hourly
- 0.0/5
- (0 jobs)
Senior Data Engineer
Senior Data Engineer with 14 years of extensive experience in designing and implementing large-scale data pipelines and cloud-based solutions. Expertise spans AWS, Azure, and Snowflake, with a proven track record of building and optimizing data platforms. Proficient in Big Data technologies, including Apache Spark and Hadoop, with deep knowledge of ETL processes, data modeling, and performance tuning.
Skills: Hadoop, CSV, JSON, ORC, Apache Avro, Parquet, Apache Spark, Apache Hadoop, Big Data File Format, Big Data, PySpark, Oracle, Microsoft SQL Server, Informatica Cloud, Databricks Platform, Snowflake
- $75 hourly
- 0.0/5
- (0 jobs)
Professional Summary
Data Engineer with over 13 years of IT experience and 5+ years specializing in Big Data and cloud services. Expertise in Hadoop, Apache Spark, and AWS, with a strong machine learning and AI background. Skilled in executing cloud migrations, optimizing data workflows, and delivering measurable performance and cost-efficiency improvements. Adept at collaborating across teams to implement innovative solutions that address complex challenges.
Skills: Hadoop, NoSQL Database, Snowflake, Databricks Platform, SQL, PySpark, AWS Lambda, AWS Glue, AWS CodePipeline, AWS CloudFormation, Apache Kafka, Apache Hadoop, Data Analysis, Machine Learning Model, Machine Learning, ETL Pipeline
- $60 hourly
- 0.0/5
- (0 jobs)
* 9 years of experience creating, implementing, and upgrading data structures.
* Expertise in ETL and ELT procedures with Apache Airflow, Informatica, and NiFi.
* Knowledge of the Python, SQL, Java, and Scala programming languages.
* Excellent knowledge of Spark, Hadoop, Kafka, and Hive for batch and real-time processing.
* Proficient in AWS, Azure, and GCP, with a focus on scalable architecture.
* Proficient in data modeling and warehousing tools such as Snowflake and BigQuery.
* Experienced with cloud data warehousing using AWS S3, Glue, and Redshift, as well as cloud environment management with Azure services such as Active Directory, Data Factory, Key Vault, and Databricks.
* Skilled in agile project management with JIRA and Git, and dedicated to ongoing professional growth in cloud computing and big data.
* Excellent analytical and communication abilities; a quick learner and team player, well-organized and self-motivated.
Skills: Hadoop, PySpark, GitHub, Kubernetes, Docker, SQL, Python, AWS Glue, Azure Blockchain Service, Apache Hadoop, ETL
- $40 hourly
- 0.0/5
- (0 jobs)
I am a highly skilled professional with a Master's in Business Analytics and expertise in Applied Natural Language Processing, Deep Learning, and Machine Learning. My experience includes developing conversational Q&A chatbots and text summarization web interfaces, integrating models from Hugging Face and OpenAI. I have also worked on OCR, text classification, and content tagging systems, demonstrating proficiency in PyTorch, TensorFlow, and AWS. I excel at optimizing image classification with different deep learning models, achieving exceptional accuracy and leveraging tools like W&B for experimentation and tracking.
Skills: Hadoop, R Shiny, Python, PySpark, Apache Hadoop, Deep Learning Modeling, Large Language Model, Deep Learning, Deep Neural Network, Machine Learning, Machine Learning Model
- $20 hourly
- 0.0/5
- (0 jobs)
Data Engineer with 10+ years of experience working for a leading US insurer. Extensive experience in Hadoop, PySpark, Snowflake, and dbt.
Skills: Hadoop, PySpark, Apache Hadoop, dbt, Snowflake
- $18 hourly
- 0.0/5
- (0 jobs)
Data engineer and data analyst skilled in SQL, Python, and Databricks, with strong problem-solving abilities whenever required.
Skills: Hadoop, DevOps, Problem Solving, Databricks Platform, Data Analysis, Hive, PySpark, Python, SQL, Apache Hadoop, Data Engineering, Database
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.