Hire the best Hadoop Developers & Programmers in California
Check out Hadoop Developers & Programmers in California with the skills you need for your next job.
- $100 hourly
- 5.0/5
- (5 jobs)
As a highly skilled and accomplished freelance Data Scientist and Data Engineer, I possess a Master's degree in Data Science with a specialization in Artificial Intelligence. My expertise lies in converting intricate data into actionable insights that drive impactful decision-making. With proficiency in Python, PySpark, Google Cloud, Azure, and more, I am well-versed in leveraging these tools to craft and deploy scalable machine learning models while establishing robust data infrastructures. My impressive track record speaks for itself. I have successfully scaled machine learning infrastructure to cater to 100,000+ customers, implementing over 100 parallel cost prediction models. Furthermore, I have excelled in developing high-capacity solutions for data inferencing, resulting in substantial cost savings through infrastructure optimization and cloud computing efficiency. Navigating complex backend migrations seamlessly, I have significantly enhanced data management and system efficiency. By collaborating closely with Data Engineers and DevOps professionals, I have deployed production models and generated pivotal insights, fostering a collaborative environment where synergies thrive. Combining a strong academic foundation in Artificial Intelligence with extensive practical experience, I play a pivotal role in supporting decision-making processes and contributing to clients' strategic objectives. With a meticulous eye for detail and an unwavering commitment to excellence, I take pride in delivering error-free work. Recognizing the critical role of accurate and high-quality data in driving informed decisions, I ensure each project receives my utmost dedication and expertise. As a freelance professional, I am driven by a passion to provide exceptional results every time. 
With an unwavering focus on excellence, I bring a wealth of skills and experiences to the table, elevating each project to new heights.
Skills: Hadoop, Apache Hadoop, Cloud Computing, Git, Google Cloud Platform, DevOps, PySpark, Data Science, Python, Databricks Platform, Azure Machine Learning, Deep Learning, Machine Learning
- $50 hourly
- 5.0/5
- (1 job)
An experienced software developer with 13+ years of experience who has performed over 10 ETL integrations and earned twelve certifications (including a 97% on the Stanford machine learning certification). It would be my pleasure to assist you with any machine learning or data science projects. I have led and managed many projects, and my focus on communication and on the client's objectives is a hallmark quality of mine.
Skills: Hadoop, Data Engineering, ETL Pipeline, Apache Hadoop, Linux, Amazon Web Services, Data Structures, Neural Network, SQL, Machine Learning
- $110 hourly
- 5.0/5
- (1 job)
I'm an expert data scientist specializing in fraud, customer segmentation, and creating software for predictive modeling. With 15 years of experience in health care, digital advertising, telecommunications, and engineering, I've developed a unique ability to solve the biggest problems in business with innovative ideas. My education includes an MBA from Cornell University, an MS from Northwestern University, and a BS in Mathematics from the University of California.
Skills: Hadoop, Snowflake, Vue.js, JavaScript, UI/UX Prototyping, Full-Stack Development, Customer Retention, Market Segmentation Research, Apache Hadoop, Automation, Deep Neural Network, SQL, Python, Chatbot, Computer Vision, Algorithm Development
- $100 hourly
- 0.0/5
- (1 job)
At UC Berkeley, I helped submit several Computer Vision and Reinforcement Learning papers. Here is a short list for context:
- GANs for Model-Based Reinforcement Learning
- Frame Rate Upscaling with Convolutional Networks
- Neural Multi-Style Transfer
At Amazon, I built a pipeline framework to store and serve sales data for the millions of third-party merchants on Amazon.com. More recently, I have taken on part-time consulting. These are some of the clients and projects I have worked on in the past:
- GitHub on Improving Code Classification with SVMs
- SAP on Applying HANA Vora to Load Forecasting
- Intuit on Quantifying Brand Exposure From Unstructured Text
Unlike those previous engagements, I am now looking to take on more projects, each with a smaller time commitment.
Skills: Hadoop, ETL, Apache Hadoop, Machine Learning, Deep Learning, TensorFlow, Keras, Python, Java, Computer Vision, Apache Spark
- $140 hourly
- 4.5/5
- (1 job)
I am a Sr. Architect and proven leader in reliability engineering, DevOps, and cloud computing, with over 15 years of experience managing highly critical infrastructure for businesses. I have a proven track record of helping organizations get started on the cloud, set up for scale, and ensure the reliability of their systems. I am well versed in implementing the latest CI/CD, WAF, security, and compliance/certification requirements such as CCF and PCI for organizational needs. I am experienced with cloud infrastructure, especially AWS, data systems, and stream/event-processing and batch-processing pipelines. For any data-related question or work, I am here to help.
Skills: Hadoop, Amazon S3, Amazon EC2, AWS Systems Manager, AWS CloudFormation, Database Design, Apache Spark, Oracle Database Administration, DevOps Engineering, Apache Hadoop, Database Architecture, Apache HBase, Kubernetes, MySQL, Cloud Computing, Amazon Web Services
- $45 hourly
- 0.0/5
- (0 jobs)
A seasoned software engineer with expertise in big data and distributed frameworks, underpinned by a strong grasp of block and object storage systems. I have demonstrated notable proficiency in optimizing performance for AI/deep learning and data-intensive applications on parallel and distributed computing frameworks. My academic journey culminated in a Ph.D. in Computer Engineering from Northeastern University, where my research focused on GPU computing, big data frameworks, and modern storage systems. I addressed data transfer bottlenecks between NVMe SSD storage and GPUs, as well as in distributed platforms such as Apache Spark and Hadoop. After completing my academic pursuits, I joined Samsung's memory solutions lab, where I honed my expertise in optimizing data pipeline bandwidth for distributed object storage over fabric, specifically for AI/deep learning applications on the PyTorch platform.
Skills: Hadoop, Unix Shell, Bash Programming, Linux, SSD, AWS Development, Apache Hadoop, Apache Spark, Multithreaded, Parallel, & Distributed Programming Language, Distributed Computing, OpenCL, PyTorch, Python, C++, CUDA
- $60 hourly
- 0.0/5
- (2 jobs)
Generative AI - LLM - STT - TTS - Talking-Face Generation - Chatbot - Spark - SQL - Databricks - ETL - Airflow - Hadoop - Python - DWH - AWS - GCP - MapReduce - BI - Analytics - NoSQL. 𝗢𝗽𝗲𝗻 𝗽𝗿𝗼𝗳𝗶𝗹𝗲 𝘁𝗼 𝘀𝗲𝗲 𝗱𝗲𝘁𝗮𝗶𝗹𝘀.
🤝 𝙒𝙃𝘼𝙏 𝙔𝙊𝙐 𝙂𝙀𝙏 𝙃𝙄𝙍𝙄𝙉𝙂 𝙈𝙀:
— Data Engineering and AI Solutions: From crafting sophisticated data platforms tailored to your use case to integrating advanced chatbot solutions, I deliver end-to-end expertise.
— Data Scraping and Mining: Extract as much data as you want from any source.
— LLM and Chatbot Innovation: Leveraging the latest in AI, I provide guidance on implementing Large Language Models for applications such as conversational AI, enhancing user interaction through intelligent chatbot systems.
— BI & Data Visualization: Proficient in tools like Tableau, Power BI, and Looker, I turn complex data into actionable insights.
— Automation: Proficient in workflow platforms like Zapier, GHL, and Make.com, I turn complex processes into complete automation workflows.
😉 𝙄 𝙝𝙖𝙫𝙚 𝙚𝙭𝙥𝙚𝙧𝙞𝙚𝙣𝙘𝙚 𝙞𝙣 𝙩𝙝𝙚 𝙛𝙤𝙡𝙡𝙤𝙬𝙞𝙣𝙜 𝙖𝙧𝙚𝙖𝙨, 𝙩𝙤𝙤𝙡𝙨 𝙖𝙣𝙙 𝙩𝙚𝙘𝙝𝙣𝙤𝙡𝙤𝙜𝙞𝙚𝙨:
► AI ENGINEERING: LLMs (GPT-x, Llama, LangChain), STT (Whisper, Deepgram), TTS (Bark, ElevenLabs), DeepFake (Wav2Lip, SadTalker), model optimization
► BIG DATA & DATA ENGINEERING: Apache Spark, Apache Airflow, Hadoop, ClickHouse, Amplitude, MapReduce, YARN, Pig, Hive, HBase, Kafka, Druid, Flink, Presto (incl. AWS Athena)
► ANALYTICS, BI & DATA VISUALIZATION: SQL (experienced with complex queries and analytical tasks); BI tools: Tableau, Redash, Superset, Grafana, Data Studio, Power BI, Looker
► WORKFLOW AUTOMATION: Zapier, GoHighLevel, Make.com, Zoho, N8N
► OTHER SKILLS & TOOLS: Docker, Terraform, Kubernetes, Pentaho, NoSQL databases
𝙈𝙮 𝙧𝙚𝙘𝙚𝙣𝙩 𝙥𝙧𝙤𝙟𝙚𝙘𝙩𝙨:
— Real-time crypto status tracking and technical analysis; AI auditor for blockchain code and crypto contracts
— GCP-based, ML-oriented ETL infrastructure (using Airflow, Dataflow)
— Real-time event tracking system (utilizing Amplitude, DataLens, serverless)
— Data analytics platform for CRM analysis of online-game websites
— Data visualization project for an e-commerce company
𝙎𝙠𝙞𝙡𝙡𝙨:
— Spark expert and experienced Data Engineer 😉
— Extensive experience with MapReduce and BI tools
— Effective communicator, responsible, team-oriented
— Major remote experience: I build an effective work process for a distributed team
— Interested in high-load back-end development, ML, and analytical research
— Data Visualization expert
— Data Scraping expert
Skills: Hadoop, Data Visualization, Data Scraping, Data Warehousing, Python, Apache Hadoop, Apache Airflow, ETL, Databricks Platform, SQL, Apache Spark, Chatbot, AI Text-to-Speech, AI Speech-to-Text, Natural Language Processing, Generative AI
- $150 hourly
- 0.0/5
- (0 jobs)
USPTO patent holder for DevSecOps and API technologies: U.S. Patent No. 11,023,301 and U.S. Patent No. 11,010,191. Hadoop, GIS, DevOps, cloud operations, enterprise architecture. Currently an Enterprise Architect consultant, building multi-cloud platforms in support of studio and enterprise operations. Consultant and architect on big data and all major cloud compute and Hadoop-based platforms.
Projects:
* DCGS-A (Army C4ISR)
* Nexus-7 (DARPA)
* DSCS-III (Army Space Command)
* TAP (Technicolor)
* BI Operations (Disney)
* Cornerstone (Dept. of State)
* WARP (Dept. of State)
* GeoScout (NGA)
Skills: Hadoop, Computer Science, Rendering, DevOps Engineering, DevOps, Linux, ETL Pipeline, Data Warehousing, Geospatial Data, Enterprise Architecture, Data Analysis, Data Science, Apache Hadoop
- $60 hourly
- 0.0/5
- (3 jobs)
Hi, I’m Najeeb Al-Amin. I’m a Multimedia / IT / Business Intelligence professional with 4+ years of experience architecting, testing, and deploying highly effective and scalable big data solutions. With 15 years of IT and multimedia experience, plus early project experience concentrated on building physical machine networks, digital automated dialogue replacement (voiceover), and virtual server networks under my belt, I’m a one-stop shop. I am also experienced in complex ETL development, with a focus on installing, designing, configuring, and administering Hadoop architecture and its ecosystem components, and on building data models to support business reporting and analysis requirements. I am highly familiar and experienced with implementing Business Intelligence methodologies in a flexible, situation-based, custom manner, which has become indispensable in architecting top-tier information systems. Digital audiobook production, remote and on-site digital audio workstation engineering, and video tutorial creation are only some of the many ways a strong multimedia background helps me reshape how I provide quality services to clients. Effective metadata strategies are key to being able to deliver solutions, and I am very knowledgeable about data transfer capabilities. 4+ years of experience working on mid- to small-scale data warehouse projects, executing roles such as Big Data Developer, Big Data Consultant, Hadoop Administrator, Hive Developer, Lab Technician, and Logistics Consultant.
Skills: Hadoop, Apache Hive, Big Data, Data Analysis, Apache Spark, Apache Hadoop, Sqoop, Apache Flume, System Administration, Data Modeling, Apache Kafka
- $15 hourly
- 0.0/5
- (0 jobs)
I am a data science and web development enthusiast with exposure to data analysis. I love exploring, and hence I am here.
Skills: Hadoop, Apache Hadoop, Apache Kafka, C, Artificial Intelligence, AngularJS, Bootstrap
- $25 hourly
- 5.0/5
- (1 job)
PROFESSIONAL SUMMARY
* Accomplished Machine Learning (ML) professional with in-depth knowledge, skilled in leading strategic early-stage and large-scale machine intelligence algorithms and applying machine learning techniques in the field of Additive Manufacturing (AM).
* Considerable experience across different data types, working on AI solutions incorporating recommender systems, Computer Vision (CV), and Reinforcement Learning (RL).
* Demonstrated strong problem-solving and analytical skills in process troubleshooting, root cause analysis, continuous improvement, and high safety standards.
Skills: Hadoop, Amazon Web Services, Hive, Apache Spark, Apache Hadoop, Computer Vision, Data Analysis, Big Data, Apache Hive, Data Visualization
- $90 hourly
- 0.0/5
- (1 job)
I am a highly inquisitive computational biologist with over 12 years of experience in functional genomics and multi-omics analysis (including single-cell sequencing analysis and metabolomics), alongside seven years of experimental design and execution. With both an experimental and computational background, I enjoy working closely with research scientists to test and develop predictive models. In addition, I am passionate about systems biology and the integration of multi-omics datasets to predict the mechanisms underlying disease.
Skills: Hadoop, Machine Learning Model, Apache Hadoop, Biology, Database, Cluster Computing, Git, SQL Programming, Life Science, Bioinformatics, Machine Learning, Python, Apache Hive, SQL
- $65 hourly
- 0.0/5
- (0 jobs)
Experienced IT professional with over 9 years in data engineering and data analysis/ETL development. Skilled in designing, developing, and deploying cutting-edge data solutions using big data technologies such as Hadoop, PySpark, Hive, and AWS services. Proficient in building scalable ETL pipelines, dimensional modeling, and data migration. Strong track record of translating complex data into actionable insights. A collaborative leader adept at guiding cross-functional teams in agile environments.
Skills: Hadoop, Snowflake, Amazon Redshift, Apache Hadoop, pandas, Python, Databricks Platform, Tableau, Microsoft Power BI, SQL, PySpark
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.