Hire the best Apache Zookeeper developers

Check out Apache Zookeeper developers with the skills you need for your next job.
  • $110 hourly
    Experienced software engineer and AWS DevOps specialist with a Bachelor’s degree in Computer Science and over 17 years in the industry developing SaaS applications and infrastructure systems. I've worked at Fortune 500 companies, brand-new startups, and everything in between. I currently specialize in AWS infrastructure and DevOps, as well as PCI and HIPAA compliance. Certified AWS Cloud Practitioner and AWS Solutions Architect Professional.
    Apache Zookeeper
    Amazon ECS
    AWS Fargate
    AWS CodePipeline
    AWS CloudFront
    Linux System Administration
    AWS CloudFormation
    AWS Lambda
    Docker
    Kubernetes
    Amazon Web Services
    PostgreSQL
    SQL
    Java
  • $100 hourly
    Skills:
    Cloud: AWS, Aliyun/AliCloud, DigitalOcean, Tencent
    IaC: Terragrunt, Terraform, Packer
    Configuration management: Ansible, Chef
    CI/CD: GitHub Actions, Jenkins, AWS CodePipeline/CodeCommit/CodeBuild/CodeDeploy, TeamCity
    Docker orchestration: Kubernetes (certified, LFS158x), ECS (Fargate and EC2)
    Monitoring software: ELK (Elasticsearch, Logstash, Kibana), Cacti
    Security: ZTNA (Zero Trust Network Access), Cisco, pfSense, Meraki, SonicWall, iptables, VPN
    OS: Linux Debian/Ubuntu, CentOS
    Programming: Bash, Python
    Others: Git/GitHub, Azure AD, strongSwan, OpenVPN, NetApp 7, Synology, Zookeeper, Kafka, SonarQube, Nexus
    Data centre network administration:
    Routing/switching: VPLS/EoMPLS, VRF, HSRP, STP (Cisco CCNP certified)
    Firewall management: ASA, Check Point, Symantec, SonicWall, S2S/SSL VPN
    Storage administration: NetApp, GlusterFS
    Monitoring: Cacti, SolarWinds, Munin
    OS: Linux CentOS, Debian/Ubuntu
    Apache Zookeeper
    Cisco Certified Network Professional
    Linux System Administration
    Alibaba Cloud
    Amazon
    DevOps
    Firewall
    ELK Stack
    Ansible
    Terraform
    Progress Chef
  • $130 hourly
    20 years of IT experience with a passion for technology. If you have a challenging project, I am your man. GitHub profile for sample work and coding style: github.com/lucidprogrammer
    Core areas of expertise: 1) Identity and Access Management (IAM) 2) Deep learning (text processing / NLP) 3) DevOps 4) Blockchain (Hyperledger Fabric)
    Programming languages: Node.js, Python, Elixir, Go, Java, C/C++
    Identity and Access Management (IAM): Single Sign-On (SSO) using SAML, OAuth, and OpenID Connect; Auth0, Okta, Keycloak, etc., plus custom builds in most programming languages; proxy-level IAM (Istio on Kubernetes)
    Deep learning / data science: Focus areas: NLP, OCR. Tools/frameworks: TensorFlow, spaCy, Gensim, Prodigy. Programming language: Python
    DevOps: By default I use Kubernetes and its ecosystem to solve DevOps problems. Orchestration: Kubernetes, Docker Swarm. Tools: Terraform, Chef, Docker, Webpack. Platforms: AWS, Azure, GCP. Programming: Bash, Ruby, Python
    Blockchain: Mainly the Hyperledger family of solutions.
    Application/startup projects: If you have a challenging project to kick-start, with a complex backend built in Node.js/PHP/Java/Elixir/Python and a scalable frontend using React (I use React.js, Rx, and MobX), I am happy to take it up. Expect functional output ready for styling and image work by a designer.
    Complex integration projects: For projects involving SOAP, REST, and API integrations, I could be a good fit.
    Apache Zookeeper
    DevOps
    Amazon Web Services
    Terraform
    Node.js
    Artificial Intelligence
    Docker
    Automated Monitoring
    Bash Programming
    PHP
    Python
    TensorFlow
    Java
  • $55 hourly
    I am a highly skilled professional with extensive experience in DevOps, specifically in Azure. In addition, I have expertise in areas such as ERP, SaaS, CRM, Java full-stack development, and Yardi Voyager consulting. With a strong background in DevOps, I am adept at developing and implementing robust deployment pipelines, monitoring systems, and automation tools. I have experience in designing and deploying cloud infrastructure, working with containers, and managing DevOps workflows. I am also well-versed in enterprise software solutions such as ERP, SaaS, and CRM. With expertise in Java full-stack development, I can deliver end-to-end solutions that cater to the specific needs of clients. As a Yardi Voyager consultant, I have helped clients implement and customize the software to meet their business requirements. Overall, I bring a diverse skill set, a client-focused approach, and a commitment to delivering high-quality solutions to any project.
    Freelance Yardi Consultant (Yardi Breeze, Yardi Bookkeeping, Yardi Voyager): Collaborated with clients to understand their business requirements and provided tailored solutions using Yardi software such as Yardi Breeze, Yardi Bookkeeping, and Yardi Voyager. Configured and customized the software to meet the specific needs of clients and integrated it with other systems as required. Provided training and support to end users to ensure smooth adoption of the software. Developed and maintained documentation, including functional and technical specifications, training materials, and user manuals. Delivered projects within the agreed timeline and budget while ensuring high-quality solutions that met clients' expectations. Communicated regularly with clients to provide project updates, gather feedback, and address any concerns.
    During my time at Discover Financial Services (USA), I worked as a DevOps Engineer on a team responsible for supporting the Java environment and managing deployments. My primary focus was resolving issues in live production environments to minimize customer impact and prevent losses. I used tools such as AppDynamics, Kibana, Postman, SoapUI, SecureCRT, VMware PCF, Autosys, ServiceNow, and IBM WebSphere to monitor and manage the production environment, which allowed me to quickly identify and resolve issues and keep the system running smoothly. I maintained regular communication with clients and senior management to provide progress updates, gather feedback, and address concerns. My work involved Windows 10 and UNIX/Linux batch servers as well as Java-based applications, and I leveraged my technical expertise to provide optimal solutions that met clients' needs. Overall, I played a critical role in keeping Discover Bank's credit card systems functioning across Java, API (PCF), and backend (server) issues.
    During my tenure at Capgemini, I was involved in developing an application for a flight booking system. As a Java developer, I used Spring Boot to create a robust backend that could handle large volumes of data. To enhance the user experience, I used Angular to create a user-friendly frontend, designing and developing features such as search, booking, and payment modules. Throughout the project, I collaborated with other team members to ensure timely delivery and worked closely with the client to gather requirements, provide updates, and incorporate feedback. Overall, my expertise in Java development and frontend design helped create a high-quality flight booking system that was user-friendly and scalable.
    Apache Zookeeper
    NestJS
    Apache Kafka
    Angular
    Azure DevOps
    Yardi Software
    Microsoft Azure
    Java
    Node.js
    Kubernetes
    Jenkins
    Docker
  • $60 hourly
    I have worked as both a software engineer and a business analyst (requirements elicitation and documentation) on various projects for over 9 years. Most of my work has been within the financial services domain: web payment gateway integrations, business automation solutions, banking applications, and more. I love to take on challenging projects. Good software, to me, can only be built by working from a user's perspective and maintaining strict compliance with software development best practices. Today, I mostly design and develop web applications and business automation platforms (including backend services and APIs) using Microsoft .NET frameworks (MVC, ASP.NET, Entity Framework, etc.) and Node.js; "permissioned" blockchain networks/applications on Hyperledger Fabric, using Node.js and Golang; as well as sentiment analysis, prediction, and image classification solutions using Keras on TensorFlow.
    Apache Zookeeper
    Blockchain Architecture
    Hyperledger Fabric
    Blockchain
    Blockchain Development
    Corda
    Product Design
    FinTech Consulting
    Node.js
    Golang
    TensorFlow
    C#
    Java
  • $65 hourly
    Skills: Kafka, Kafka Streams, Confluent, Spring Kafka, Spring Boot, Spring Security, Java 11, REST, AWS, Docker
    Experience: 20 years
    Companies/clients worked with: ING, G-Star, equensWorldline, ABN AMRO, Deutsche Bank, Vodafone, Accenture, TCS, UTI
    Project domains: banking, telecom, retail
    Role: Dev & Ops
    Responsibilities: design & development, support
    Education: Bachelor of Engineering
    Accomplishments: several innovation awards
    Please see my LinkedIn profile for more info. (An illustrative Kafka Streams sketch follows this profile's skill list below.)
    Apache Zookeeper
    AWS Application
    Apache Kafka
    Java
    Microservice
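    To make the Kafka-centric skills above concrete, here is a minimal Kafka Streams sketch in Java. It is an illustration only, not code from this freelancer's projects: the application id, broker address, topic names (payments, payments-eur), and the currency-based filter are all assumptions.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class PaymentsFilter {
        public static void main(String[] args) {
            // Basic Streams configuration; the broker address assumes a local test cluster.
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payments-filter-demo");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            // Topology: read "payments", keep EUR records, write them to "payments-eur".
            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> payments = builder.stream("payments");
            payments.filter((key, value) -> value != null && value.contains("\"currency\":\"EUR\""))
                    .to("payments-eur");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

    Worth noting for this page's topic: Kafka clusters of the vintage most of these profiles describe coordinate broker metadata through a ZooKeeper ensemble, which is where the Apache Zookeeper skill tag comes in.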
  • $26 hourly
    I am a full-stack developer with 7+ years of working experience, passionate about quality work, user experience, clean code, and accessibility. I have experience working with complex SaaS projects and always respect deadlines.
    💻 My tech skills: ✅ React.js, React Hooks, Redux, Redux-Saga, Redux-Thunk ✅ Vue.js, Vuex, Vue-Chart, Quasar Framework ✅ React Native, Flutter (Dart), Alpine.js ✅ D3 charts, Braintree, Slick.js, periodic ✅ Bootstrap, Ant Design, Liquid, Google Analytics
    ⭐ My key areas of expertise: ✔ Web & mobile application development ✔ Material UI, Bootstrap, Tailwind CSS ✔ REST API, WebSocket ✔ Pixel-perfect apps from Sketch, Photoshop, Figma, or Zeplin ✔ SSR apps with Next.js ✔ Flavors for Android and schemes for iOS ✔ API integration ✔ Configuring development, staging, and production apps in Firebase ✔ Real-time tracking ✔ Google Maps, AdMob, Geolocation, Analytics ✔ Barcode scanner apps ✔ Publishing apps to Google Play and the Apple App Store ✔ Online/offline apps ✔ Firebase SDK integration
    ⭐ Databases: ✔ PostgreSQL ✔ MongoDB ✔ MySQL ✔ DynamoDB ✔ GraphQL
    ⭐ Tools used: ✔ Jira / Sublime ✔ Docker / Jenkins ✔ GitHub / Bitbucket ✔ AWS Console / Azure DevOps
    ⭐ Cloud services: ✔ Amazon Web Services ✔ Amazon S3 ✔ Firebase ✔ DevOps
    🟢 Recently worked on an NFT/cryptocurrency project. 🟢 Architecting JavaScript apps with React and GraphQL.
    Let's connect and discuss your ideas and business application. Warm regards, Kishore
    Apache Zookeeper
    NoSQL Database
    Apache Cordova
    Backbone.js
    Vue.js
    ASP.NET
    Python
    Flutter
    React Native
    Firebase
    React
    MongoDB
    AngularJS
    Redis
    Node.js
    MySQL
  • $175 hourly
    Mr. Joshua B. Seagroves is a seasoned professional who has served as an Enterprise Architect/Senior Data Engineer for multiple Fortune 100 companies. With a successful track record as a startup founder and CTO, he brings a wealth of experience to his role, specializing in the strategic design, development, and implementation of advanced technology systems. Throughout his career, Mr. Seagroves has demonstrated expertise in architecting and delivering cutting-edge solutions, particularly in data engineering and data science, and has spearheaded the implementation of such systems and applications for a diverse range of clients. As part of his current responsibilities, he contributes to prototyping and research in data engineering and data science, specifically the development of operational systems for mission-critical applications. Leveraging his extensive background in architecture and software modeling methodologies, he has consistently led and collaborated with multidisciplinary teams, successfully integrating various distributed computing technologies, including Hadoop, NiFi, HBase, Accumulo, and MongoDB. Mr. Seagroves' professional achievements and extensive experience make him a highly sought-after expert in his field; his comprehensive knowledge and hands-on expertise in advanced technology systems and big data make him a valuable asset to any organization.
    Apache Zookeeper
    YARN
    Apache Hadoop
    Big Data
    TensorFlow
    Apache Spark
    Apache NiFi
    Apache Kafka
    Artificial Neural Network
    Artificial Intelligence
  • $35 hourly
    As a Senior Software Engineer, I specialize in architecting and developing web and AI applications. My expertise includes microservice architecture, LLM fine-tuning, and RAG chatbots. I have successfully implemented search-based applications, streaming applications, and recommender systems. Certified as an AI developer by DeepLearning.AI and IBM, my proficiency extends to ML and deep learning. I consistently adhere to best coding principles and design patterns, delivering optimized, scalable, and maintainable code.
    My tech stack:
    Programming languages: Java, Node.js, Python
    Web frameworks: Spring Cloud, Spring, Spring Boot, Flask, Express, Hibernate, Spring Cloud Gateway, Resilience4j, Netflix Eureka, Spring Cloud Stream, Django, JDBC, Maven, Gradle, Spring WebFlux
    Technologies: Docker, Zookeeper, Jenkins, Prometheus, Grafana, Kibana, Elastic APM, Sleuth, Logstash
    Search engines: Lucene, Elasticsearch, Apache Solr
    Data streaming: Kafka, RabbitMQ
    Databases: MySQL, SQLite, Redis, MongoDB, Amazon S3
    Big data: Hadoop, Apache Spark
    Cloud platforms: AWS, Google Cloud
    Data analytics libraries: Pandas, NumPy, TensorFlow
    LLM integration: LangChain, OpenAI ChatGPT
    Summary of projects:
    ChatGPT Integration: Integrated ChatGPT seamlessly with a .NET/Angular-based web application to empower users to create strategies and PVVV statements through intuitive prompts.
    Retrieval-Augmented Generative Chatbot: Developed a context-aware Q&A bot using LangChain and ChatGPT, ensuring an engaging and intelligent conversational experience.
    Double-Entry Banking Application (Bukuwarung): Implemented a robust double-entry banking application for Bukuwarung in Java, Spring Boot, Spring WebFlux, and Postgres.
    Migration to Microservice Architecture (Times Internet): Led the migration of a monolithic application to a microservice architecture at Times Internet, leveraging the Spring Cloud ecosystem for enhanced scalability and performance.
    Realtime Popularity-Based Product Sorting (Deutsche Telekom Digital Limited): Introduced realtime popularity-based sorting for product listings, enhancing user experience with Java, Redis, MySQL, Spring, Hibernate, and Spring Cloud.
    Microservice Development (Deutsche Telekom Digital Labs): Spearheaded the creation of a microservice from scratch, implementing auto-suggest, SRP, and product discovery, providing efficient product search and recommendation capabilities.
    Event-Driven Architecture (Apache Kafka): Implemented an event-driven architecture based on Apache Kafka, enabling asynchronous communication between microservices and improving system responsiveness and scalability. (See the producer/consumer sketch after this profile's skill list below.)
    Search Listing Page (Times Internet): Delivered a search listing page using Elasticsearch and Apache Solr at Times Internet, significantly enhancing search functionality and user experience.
    Auto Suggest for Search Bar (Dineout): Implemented an auto-suggest feature for the search bar at Dineout, contributing to an improved user search experience.
    Recommendation API: Contributed to the development of a robust recommendation API using Apache Solr, incorporating advanced algorithms for "You May Like" suggestions.
    AngularJS to Angular 2+ Migration (Kpifire Project): Migrated AngularJS to Angular 15 for the Kpifire project, ensuring a smooth transition and enhanced performance.
    Apache Zookeeper
    Apache Solr
    LangChain
    Kotlin
    Spring Boot
    LLM Prompt Engineering
    Apache Kafka
    Hibernate
    Angular
    Elasticsearch
    ChatGPT
    AI Development
    Machine Learning
    Recommendation System
    Deep Learning
    Java
    Python
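    The "Event-Driven Architecture (Apache Kafka)" item above is easiest to see in code. The sketch below is a minimal, hedged illustration of asynchronous communication between two services over Kafka, not code from the project it mentions: the topic name, consumer group, broker address, and payload are all assumptions.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderEventsDemo {
        public static void main(String[] args) {
            // Producer side: the "order" service publishes an event and moves on.
            Properties producerProps = new Properties();
            producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>("order-events", "order-42", "{\"status\":\"CREATED\"}"));
            }

            // Consumer side: a downstream service (e.g. billing) reacts asynchronously.
            Properties consumerProps = new Properties();
            consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "billing-service");
            consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(List.of("order-events"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("billing saw " + record.key() + " -> " + record.value());
                }
            }
        }
    }

    The decoupling is the point: the producer does not know or wait for the billing service, and additional consumers can subscribe to the same topic without touching the producer.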
  • $70 hourly
    🎓 𝗖𝗲𝗿𝘁𝗶𝗳𝗶𝗲𝗱 𝗗𝗮𝘁𝗮 𝗣𝗿𝗼𝗳𝗲𝘀𝘀𝗶𝗼𝗻𝗮𝗹 with 𝟲+ 𝘆𝗲𝗮𝗿𝘀 of experience and hands-on expertise in designing and implementing data solutions. 🔥 4+ startup tech partnerships ⭐️ 100% Job Success Score 🏆 In the top 3% of all Upwork freelancers with Top Rated Plus 🏆 ✅ Excellent communication skills and fluent English
    If you're reading my profile, you've got a challenge you need to solve and you're looking for someone with a broad skill set, minimal oversight, and an ownership mentality. In that case, I'm your go-to expert.
    📞 Connect with me today and let's discuss how we can turn your ideas into reality through creative and strategic partnership. 📞
    ⚡️ Invite me to your job on Upwork to schedule a complimentary consultation call to discuss in detail the value and strength I can bring to your business, and how we can create a tailored solution for your exact needs.
    𝙄 𝙝𝙖𝙫𝙚 𝙚𝙭𝙥𝙚𝙧𝙞𝙚𝙣𝙘𝙚 𝙞𝙣 𝙩𝙝𝙚 𝙛𝙤𝙡𝙡𝙤𝙬𝙞𝙣𝙜 𝙖𝙧𝙚𝙖𝙨, 𝙩𝙤𝙤𝙡𝙨 𝙖𝙣𝙙 𝙩𝙚𝙘𝙝𝙣𝙤𝙡𝙤𝙜𝙞𝙚𝙨:
    ► BIG DATA & DATA ENGINEERING: Apache Spark, Hadoop, MapReduce, YARN, Pig, Hive, Kudu, HBase, Impala, Delta Lake, Oozie, NiFi, Kafka, Airflow, Kylin, Druid, Flink, Presto, Drill, Phoenix, Ambari, Ranger, Cloudera Manager, Zookeeper, Spark Streaming, StreamSets, Snowflake
    ► CLOUD: AWS -- EC2, S3, RDS, EMR, Redshift, Lambda, VPC, DynamoDB, Athena, Kinesis, Glue | GCP -- BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Data Fusion | Azure -- Data Factory, Synapse, HDInsight
    ► ANALYTICS, BI & DATA VISUALIZATION: Tableau, Power BI, SSAS, SSMS, Superset, Grafana, Looker
    ► DATABASE: SQL, NoSQL, Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, PL/SQL, HBase, Cassandra
    ► OTHER SKILLS & TOOLS: Docker, Kubernetes, Ansible, Pentaho, Python, Scala, Java, C, C++, C#
    𝙒𝙝𝙚𝙣 𝙮𝙤𝙪 𝙝𝙞𝙧𝙚 𝙢𝙚, 𝙮𝙤𝙪 𝙘𝙖𝙣 𝙚𝙭𝙥𝙚𝙘𝙩: 🔸 Outstanding results and service 🔸 High-quality output on time, every time 🔸 Strong communication 🔸 Regular & ongoing updates
    Your complete satisfaction is what I aim for, so the job is not complete until you are satisfied! Whether you are a 𝗦𝘁𝗮𝗿𝘁𝘂𝗽, an 𝗘𝘀𝘁𝗮𝗯𝗹𝗶𝘀𝗵𝗲𝗱 𝗕𝘂𝘀𝗶𝗻𝗲𝘀𝘀, 𝗼𝗿 𝗹𝗼𝗼𝗸𝗶𝗻𝗴 𝗳𝗼𝗿 your next 𝗠𝗩𝗣, you will get 𝗛𝗶𝗴𝗵-𝗤𝘂𝗮𝗹𝗶𝘁𝘆 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀 at an 𝗔𝗳𝗳𝗼𝗿𝗱𝗮𝗯𝗹𝗲 𝗖𝗼𝘀𝘁, 𝗚𝘂𝗮𝗿𝗮𝗻𝘁𝗲𝗲𝗱.
    I hope you become one of my many happy clients. Reach out by inviting me to your project. I look forward to it! All the best, Anas
    ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad is really great with AWS services and knows how to optimize each so that it runs at peak performance while also minimizing costs. Highly recommended! ❞
    ⭐️⭐️⭐️⭐️⭐️ 🗣❝ You would be silly not to hire Anas, he is fantastic at data visualizations and data transformation. ❞
    🗣❝ Incredibly talented data architect, the results thus far have exceeded our expectations and we will continue to use Anas for our data projects. ❞
    ⭐️⭐️⭐️⭐️⭐️ 🗣❝ The skills and expertise of Anas exceeded my expectations. The job was delivered ahead of schedule. He was enthusiastic and professional and went the extra mile to make sure the job was completed to our liking with the tech that we were already using. I enjoyed working with him and will be reaching out for any additional help in the future. I would definitely recommend Anas as an expert resource. ❞
    ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad was a great resource and did more than expected! I loved his communication skills and always kept me up to date. I would definitely rehire again. ❞
    ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Anas is simply the best person I have ever come across. Apart from being an exceptional tech genius, he is a man of utmost stature. We blasted off with our startup, high on dreams and code. We were mere steps from the MVP. Then, pandemic crash. Team bailed, funding dried up.
Me and my partner were stranded and dread gnawed at us. A hefty chunk of cash, Anas and his team's livelihood, hung in the balance, It felt like a betrayal. We scheduled a meeting with Anas to let him know we were quitting and request to repay him gradually over a year, he heard us out. Then, something magical happened. A smile. "Forget it," he said, not a flicker of doubt in his voice. "The project matters. Let's make it happen!" We were floored. This guy, owed a small fortune, just waved it away? Not only that, he offered to keep building, even pulled his team in to replace our vanished crew. As he spoke, his passion was a spark that reignited us. He believed. In us. In our dream. In what he had developed so far. That's the day Anas became our partner. Not just a contractor, but a brother in arms. Our success story owes its spark not to our own leap of faith, but from the guy who had every reason to walk away. Thanks, Anas, for believing when we couldn't.❞
    Apache Zookeeper
    Solution Architecture Consultation
    AWS Lambda
    ETL Pipeline
    Data Management
    Data Warehousing
    AWS Glue
    Apache Spark
    Amazon Redshift
    ETL
    Python
    SQL
    Marketing Analytics
    Big Data
    Data Visualization
    Artificial Intelligence
  • $75 hourly
    ✌️🙂 Hey there! I'm a data scientist and researcher engaged in the world of artificial intelligence. As a data scientist, I focus on deep learning and computer vision. I have experience building models for object detection, segmentation, and classification. As a researcher, I am always eager to build new things. So whether your data is pictures, 3D volumes, text, video, audio, time series, or anything else, I am happy to work on it.
    🌟 I aim to give clients the best freelance experience by providing the right solution with outstanding communication, clear expectations, and satisfying deliverables using state-of-the-art tools. When we work together, you will receive best practices in source code management, version control, and security access to your codebase. Additionally, my team can provide infrastructure for training, testing, and hosting for any model or application.
    ⌨️ Technologies:
    Core: Linux/Debian/macOS/Windows, Python 3, Markdown, Git
    Languages: Python, Java, HTML, CSS, JavaScript, ReactJS, VueJS
    Python: Django REST Framework, Django, Flask, Wagtail, Celery, Kafka, Selenium, Matplotlib, Seaborn, Plotly
    Data manipulation: NumPy, Pandas, OpenCV, PyDICOM
    Data science: Scikit-Learn
    Deep learning: TensorFlow, Keras, PyTorch
    Data visualization: Matplotlib, Seaborn, Plotly
    Databases: MySQL, PostgreSQL, SQLite, Redis, MongoDB
    DevOps: Kubernetes, microservices, Docker, Docker Compose, GCP, AWS
    Other technologies: Jenkins, Spinnaker, Nginx, MinIO S3 object storage, Airflow, RabbitMQ, Rook, Celery, GlusterFS, Spark, Kafka + Zookeeper, OpenFaaS, Knative, Gunicorn
    Some of my 💛 favorite 💛 projects are:
    🧠 Sonador | Realtime Medical Workflows: Medical imaging is a powerful capability for a healthcare enterprise to harness. Using machine learning to process medical data, organizations can improve their clients' workflows by diagnosing patients accurately at orders of magnitude faster than ever. Machine vision software uses computational techniques to allow computers to analyze pictures, identify features of interest, and use that data to reveal insight. 3D medical imaging provides context and diagnostic value by giving clinicians more than a flat stack of images. When combined with additional data, which itself can be pulled from the models, it is possible to create powerful visualizations, improve diagnostic accuracy, or more effectively educate patients.
    🧬 Chest X-Ray Classification | Detecting COVID-19 and other diseases with AI: Computer vision allows us to train a computer to recognize patterns in images that are invisible to the naked eye. This dramatically speeds up the daily workflow of radiologists, especially during off-hours or a crisis. Using AI to help screen for COVID-19 cases has been an ambitious initiative that received strong support. Research published early in the pandemic provided evidence that severe COVID cases could be detected through the use of chest x-rays (CXR). Given the early shortage of testing and delays in receiving PCR results, CXR looked like a promising technology for helping physicians triage patients within minutes rather than hours or days.
    🖥 Big Data DevOps | Bare-metal Kubernetes Platform | Deployment, operations, and infrastructure: Deployed a secure, highly available Kubernetes platform for big data streaming and processing on bare metal. Containers deployed with Docker: Python (Django, OAuth2), Celery, PostgreSQL, Airflow, RabbitMQ, GlusterFS, Rook, MinIO S3, Kafka + Zookeeper, OpenFaaS, Gunicorn.
    📝 Writer | Published Works: "Docker for Continuous Integration", an online course on building and deploying applications with Docker and configuring continuous integration pipelines (Docker, continuous integration, container-driven development, container networking, multi-container applications, DevOps, Jenkins, orchestration, container runtimes). "Evolving Architectures", a whitepaper on implementing microservices and DevOps with containers; technical writing and demonstration of cloud native, DevOps, microservices, and containers (Kubernetes, Docker, orchestration, container-driven development, VCS/Git, CI/CD).
    📞 Let's schedule a call and get your next project rolling!
    Apache Zookeeper
    Data Management
    Data Scraping
    Jupyter Notebook
    DICOM
    Statistics
    DevOps
    Medical Imaging
    Kubernetes
    Machine Learning
    Keras
    Model Tuning
    Python
    PyTorch
    TensorFlow
    Deep Learning
  • $45 hourly
    Experienced senior software engineer with a demonstrated history of working in the information technology and services industry. I have worked with Cisco, HP, Ruckus Wireless, and Nokia. Strong engineering professional with a Bachelor's degree in Computer Systems Engineering from Birzeit University.
    Apache Zookeeper
    Event-Driven Programming
    Spring Boot
    JUnit
    PostgreSQL
    Hibernate
    RabbitMQ
    Elasticsearch
    Apache Kafka
    RESTful API
    Docker
    Python
    Java
    Google Cloud Platform
    Kubernetes
  • $29 hourly
    *Experience*
    • Hands-on experience upgrading HDP or CDH clusters to the Cloudera Data Platform Private Cloud [CDP Private Cloud].
    • Extensive experience installing, deploying, configuring, supporting, and managing Hadoop clusters using Cloudera (CDH) and HDP distributions hosted on Amazon Web Services (AWS) and Microsoft Azure.
    • Experience upgrading Kafka, Airflow, and CDSW.
    • Configured various components such as HDFS, YARN, Sqoop, Flume, Kafka, HBase, Hive, Hue, Oozie, and Sentry.
    • Implemented Hadoop security.
    • Deployed production-grade Hadoop clusters and their components through Cloudera Manager/Ambari in virtualized environments (AWS/Azure cloud) as well as on-premises.
    • Configured HA for Hadoop services with backup and disaster recovery.
    • Set up Hadoop prerequisites on Linux servers.
    • Secured clusters using Kerberos and Sentry as well as Ranger and TLS.
    • Experience designing and building scalable infrastructure and platforms to collect and process very large amounts of structured and unstructured data.
    • Experience adding and removing nodes, monitoring critical alerts, configuring high availability, configuring data backups, and purging data.
    • Cluster management and troubleshooting across the Hadoop ecosystem.
    • Performance tuning and resolution of Hadoop issues using the CLI, the Cloudera Manager UI, and the Apache web UIs.
    • Report generation for running nodes using various benchmark operations.
    • Worked on AWS services such as EC2 instances, S3, VPC, and security groups, and Microsoft Azure services such as resource groups, resources (VMs, disks, etc.), Azure Blob Storage, and Azure storage replication.
    • Configured private and public IP addresses, network routes, network interfaces, subnets, and virtual networks on AWS/Microsoft Azure.
    • Troubleshooting, diagnosing, performance tuning, and solving Hadoop issues.
    • Administration of Linux installations.
    • Fault finding, analysis, and logging information for reports.
    • Expert in Kafka administration and deploying UI tools to manage Kafka.
    • Implemented HA for MySQL.
    • Installed/configured Airflow for job orchestration.
    (HA and coordination for several of these services run through a ZooKeeper ensemble; see the client sketch after this profile's skill list below.)
    Apache Zookeeper
    Apache Kafka
    Apache Hive
    Apache Airflow
    Apache Spark
    YARN
    Hortonworks
    Apache Hadoop
    Cloudera
    Apache Impala
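    The HA and cluster-coordination work described above generally sits on top of a ZooKeeper ensemble (HDFS and YARN automatic failover, HBase, and pre-KRaft Kafka all keep state there). As a rough illustration only, here is a minimal Java sketch using the plain ZooKeeper client; the connect string, session timeout, and the /hadoop-ha path are assumptions for the example, not details from this profile.

    import java.util.List;
    import java.util.concurrent.CountDownLatch;
    import org.apache.zookeeper.WatchedEvent;
    import org.apache.zookeeper.Watcher;
    import org.apache.zookeeper.ZooKeeper;

    public class EnsembleCheck {
        public static void main(String[] args) throws Exception {
            // Connect string lists the ensemble members; these hostnames are placeholders.
            String connect = "zk1:2181,zk2:2181,zk3:2181";
            CountDownLatch connected = new CountDownLatch(1);

            ZooKeeper zk = new ZooKeeper(connect, 15000, (WatchedEvent event) -> {
                // The watcher fires when the client session reaches the connected state.
                if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
                    connected.countDown();
                }
            });
            connected.await();

            // List children of a coordination path; "/hadoop-ha" is only an example --
            // the actual znode layout depends on which services use the ensemble.
            List<String> children = zk.getChildren("/hadoop-ha", false);
            System.out.println("znodes under /hadoop-ha: " + children);

            zk.close();
        }
    }

    The same client calls (exists, getData, getChildren with watches) are what higher-level tools build on for leader election and failover, which is why ZooKeeper health is usually the first thing checked when HA misbehaves.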
  • $35 hourly
    I am a dedicated DevOps professional with a strong emphasis on crafting and implementing highly scalable, dependable, secure, agile, and cost-efficient solutions. My technical acumen spans a broad spectrum of tools and technologies, including:
    AWS cloud infrastructure deployment: Proficient in orchestrating end-to-end infrastructure deployments on the AWS cloud, ensuring robust and adaptable cloud solutions.
    Infrastructure as Code (IaC): Adept at applying Infrastructure as Code principles, leveraging Terraform to deploy infrastructure efficiently and adhere to best practices.
    Kubernetes and Helm: Skilled in Kubernetes, with the ability to write Helm charts for deploying Kubernetes manifests; experienced with Amazon Elastic Kubernetes Service (EKS).
    Database expertise: Versatile background in database management, including MySQL, MongoDB, PostgreSQL, RDS, and ScyllaDB, enabling efficient data handling and maintenance.
    Containerization and orchestration: Proficient with Docker containers and container orchestration using Kubernetes, optimizing application deployment and enabling seamless scaling.
    Continuous Integration/Continuous Delivery (CI/CD): Proficient in designing and automating deployments using Jenkins, GitOps principles, and ArgoCD to streamline deployment processes for faster, more reliable releases.
    Log management and monitoring: Expertise in centralized log management tools such as ELK (Elasticsearch, Logstash, Kibana) for effective monitoring, analytics, and troubleshooting, with a deep understanding of log data to proactively address issues and enhance system performance.
    Configuration management: Competent in configuration management with Ansible, ensuring consistency and reliability across infrastructure components.
    Event streaming: Proficient in leveraging Kafka for queuing tasks and streamlining data processing, enabling efficient data flow and real-time processing.
    Tool mastery: Mastery of a variety of Continuous Delivery/Continuous Integration tools, including AWS, Docker, Kubernetes, Terraform, Packer, Jenkins (with dynamic job creation), SonarQube, Maven, Node, Nginx, and more.
    Additional skills: Familiarity with monitoring tools like Datadog, Prometheus, and Grafana, along with a strong command of Linux, further enhancing system performance and reliability.
    In addition to my technical proficiencies, I hold certifications as a Red Hat Certified System Administrator and an AWS Certified Solutions Architect Associate, underscoring my commitment to staying up to date with industry standards and best practices. I am deeply passionate about tackling complex challenges and driving continuous improvement in IT operations. Whether the goal is optimizing infrastructure, automating processes, or fortifying security measures, I bring a wealth of experience and a proven track record of delivering tangible results.
    Beyond the technical skills highlighted above, I have significant experience in team building and management. I've successfully assembled and led teams, most notably a team of 12 engineers at Paytm and Rackspace. If you are seeking a dedicated DevOps professional capable of efficiently and effectively helping you achieve your project objectives, I am eager to discuss how I can contribute to your success.
    Certifications: Red Hat Certified System Administrator; AWS Certified Solutions Architect Associate
    Apache Zookeeper
    Microservice
    Container
    Linux System Administration
    Cloud Computing
    Apache Kafka
    Git
    Bash
    CI/CD
    Jenkins
    Amazon Web Services
    Python
    Ansible
    Kubernetes
    Docker

How it works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Zookeeper Developer on Upwork?

You can hire an Apache Zookeeper Developer on Upwork in four simple steps:

  • Create a job post tailored to your Apache Zookeeper Developer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Zookeeper Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Zookeeper Developer profiles and interview.
  • Hire the right Apache Zookeeper Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Zookeeper Developer?

Rates charged by Apache Zookeeper Developers on Upwork can vary with a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Zookeeper Developer on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Zookeeper Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Zookeeper Developer team you need to succeed.

Can I hire an Apache Zookeeper Developer within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Zookeeper Developer proposals within 24 hours of posting a job description.
