Hire the best Apache Kafka Developers in Paris, FR

Check out Apache Kafka Developers in Paris, FR with the skills you need for your next job.
  • $50 hourly
    5+ years of experience building end-to-end Data Science, Big Data, and AI projects. I can handle every aspect of a project (business requirements, data collection and cleaning, model building and deployment, ...). I am a problem solver with acute critical thinking and intellectual curiosity, and integrity is my watchword. I am reliable and will always honor my commitments. I will be happy to make your project a success.
    Skills:
    - Languages: Python, Scala
    - Big Data: Spark, Kafka, SQL, NoSQL
    - AI: Machine Learning, Computer Vision
    - Other: AWS, Software Engineering
    Apache Kafka
    Docker
    Amazon Web Services
    Apache Hadoop
    Big Data
    Data Engineering
    SQL
    OpenCV
    Scala
    Apache Spark
    Natural Language Processing
    Deep Learning
    Machine Learning
    Data Science
    Python
  • $45 hourly
    Hi, my name is Rafael, and I'm a passionate and experienced software engineer specializing in backend development. I have just over 8 years of experience working with startups and established companies to build, optimize, and scale their software solutions. I'm proficient in several programming languages, including Rust, Python, and C++, and I'm always keen to learn and adapt to new technologies and challenges. I have a particular interest in blockchain and artificial intelligence technologies, and I have already completed successful projects in these areas. My main goal as a freelancer is to provide exceptional service to my customers by delivering high-quality, efficient, and robust software solutions that meet and exceed their expectations. I value clear communication and meeting deadlines, and I'm known for my problem-solving skills and creative thinking when tackling complex software problems. Whether you need a complete software solution or help with specific aspects of your project, I can provide the experience and dedication to ensure your project's success.
    Apache Kafka
    SQL Programming
    PostgreSQL
    C++
    Python
    Rust
  • $160 hourly
    I am available for contract work as an Expert Data Architect, mastering the value chain from infrastructure to business valuation, through databases and APIs. With a dual background, a PhD in computer science (ENS-Lyon / INSA-Lyon) and a Master 2 in Strategy and Finance (CNAM), I can match your technical architectures to your business needs. With more than 20 years of experience, I have carried out many projects under strong regulatory constraints.
    I worked for 4 years as CTO of 3 startups [AdNow, Fidzup, WPP / tenthavenue] that developed programmatic advertising platforms. The massive distribution of advertising on the Internet offers significant possibilities for tracking and profiling users, so I developed a high-volume recommendation and profiling system (AI/ML/API, Cloud, Python) handling 200 million events per day. The emergence of the GDPR profoundly transformed this sector; I participated in implementing these regulations and creating new "compliant by design" solutions. I also carried out an expert mission for [Code Is Law], which develops a database solution integrating modules for "by design" compliance from the design of the data model. These experiences allowed me to address, beyond the usual questions, DPIA and portability issues.
    I have supported several companies [GeoTwin, Sublime, NDS] in designing technical architectures that meet international regulatory requirements (FATCA, COPPA, GDPR, Cloud Act), technical requirements (Cloud, Load Balancing, Consolidated Billing, Unified Directory), operational requirements (Deployment, Versioning, CI/CD), and data requirements (Life cycle, Obsolescence, MDM, Object repository).
    I have long-standing expertise in database management: MySQL, PostgreSQL, RDS, Redshift, Oracle, SQL Server. I was the lead architect at the Ministry of Finance for the creation of the PostgreSQL technological framework to replace Oracle. I worked with banks and payment operators to set up and administer high-availability PostgreSQL clusters [Ingénico, Exane], and I have carried out missions on databases with very large volumes (60 TB) and very high availability (RTG 2s) [Anevia].
    Having kept an appetite for operational work, I have carried out numerous missions setting up ETL, optimizing DataFlow, and deploying decision cubes [Sublime, Afflelou, I@D International], as well as reverse-engineering and retro-documentation missions [Valpaco, Bouygues]. In addition, I have experience managing datacenters [UKG / PeopleDoc] and Cloud (DevOps) practices for AWS, Digital Ocean, and OVH environments [WPP / tenthavenue]. I have managed systems (Linux, Debian, OpenBSD, AIX, Solaris), networking (Route 53, VPC, VPN, SSH, SMTP, DNS, SPF, DMARC), web (Load Balancer, Nginx, Apache), monitoring (CloudWatch, Nagios, Centreon, Zabbix), software (Mail, Sendmail, Postfix, mailing lists), DevOps (Ansible, GitLab, CI/CD, Vagrant, Docker), and Cloud (CloudWatch, BeanStalk, EC2, S3, RDS, Consolidated Billing) ... I also have some knowledge of other specific technologies: Looker, Kafka, Talend. I was a trainer and professor of computer science for several years [INSA Lyon, IUT Bourges, Orsys].
    Apache Kafka
    AWS Development
    Looker
    Machine Learning
    Linux
    OpenBSD
    MySQL
    Oracle
    PostgreSQL
    Database
    GDPR
    Data Center
  • $100 hourly
    I am an Apache Airflow expert, contributor, and trainer (I offer audits of your Airflow and data engineering stack). Data engineering trainer, DataOps, Kafka. As a Data Engineer / ML Engineer / DataOps practitioner, I am available to contribute to and/or implement data platform solutions and their infrastructure, such as:
    - stack design
    - data acquisition
    - batch or stream transformation
    - loading into a data warehouse
    - serving via API
    - training cycles and machine learning monitoring
    Distributed systems and scalability are subjects on which I can support you. With a strong ops and software (DevOps) culture, I can help you set up data / ML pipelines and APIs that are reliable, tested, versioned, and monitored. I handle the technical stack of a feature from A to Z: code, tests, git repo, CI, CD, stack (cloud provider / infra), monitoring, and alerting, for a truly continuous deployment policy. I am also a FastAPI expert (check my GitHub :))
    Apache Kafka
    Apache Avro
    Docker Compose
    Docker
    Amazon Web Services
    Google Cloud Platform
    Apache Spark
    Kubernetes
    Apache Airflow
  • $30 hourly
    Skills:
    - Programming languages: Scala, Java, Python
    - Data engineering: Spark, Kafka, Flink
    - Backend REST APIs: Spring Framework, Play Framework
    - Databases: Postgres, MongoDB, MySQL, Cassandra
    - DevOps: Docker, Docker Swarm, Kubernetes, Terraform
    - Finance: technical and fundamental market analysis
    Apache Kafka
    Terraform
    Cloudera
    Apache Flink
    Kubernetes
    Docker
    Apache Cassandra
    Apache Spark
    Spring Framework
    Java
    MySQL
    Play Framework
    MongoDB
    PostgreSQL
    Scala
  • $50 hourly
    As a dedicated Data Scientist, I specialize in turning intricate data into clear, actionable insights, enabling businesses to make informed decisions. With a robust background in Python, R, Spark, and various data technologies, I possess the toolkit required to tackle complex data challenges.
    Strengths and Skills:
    - Proficient in Python, with extensive experience in TensorFlow, Keras, CNNs, RNNs, PySpark, Pandas, NumPy, and Seaborn.
    - Skilled in R, Apache Spark, Kafka, and GCP, with a strong foundation in ETL processes.
    - Expertise in SQL (MySQL, Transact-SQL, PostgreSQL) and NoSQL databases (MongoDB, Neo4j, Firebase).
    - Adept at leveraging big data technologies and methodologies to derive meaningful insights.
    Experience: Internship at Renault as a Data Scientist, focusing on vehicle lifespan analysis. This role enabled me to apply my theoretical knowledge in a practical, impact-driven environment, honing my data manipulation and analysis skills.
    Notable Projects:
    - Data Science: Time Series Classification (classified electricity consumption time series and detected appliance activation periods using Python); Natural Language Processing with Disaster Tweets (used Python to analyze and interpret disaster-related communications); HIV-2 Protease Structural Asymmetry Study (investigated factors influencing the structural asymmetry of HIV-2 protease using R).
    - Big Data: Forecasting Electricity Consumption (developed a model to forecast electricity consumption from weather patterns using Python); Knowledge Graph Evaluation (studied evaluation methods for a Covid-related knowledge graph using Neo4j and Python).
    - Data Engineering: E-commerce Recommendation Engine (designed and implemented a recommendation engine using Apache Spark and Kafka); Temporal Entity Extraction Study (explored temporal entity extraction methods using Python, spaCy, NLTK, and regex); Data Quality Study (automated data quality validation processes using Python).
    Education: Master's degree in Data Science, 2024.
    I am eager to leverage my skills to help your business uncover valuable insights from your data, driving strategic decisions and fostering growth.
    Apache Kafka
    MongoDB
    Firebase
    Google Cloud Platform
    Jupyter Notebook
    TensorFlow
    Keras
    NoSQL Database
    SQL
    Apache Spark
    Deep Learning
    Machine Learning
    R
    Python
    Data Science
  • $55 hourly
    Analyst: 10 years of experience, including 8 on the webMethods EAI platform. He currently holds the position of Junior webMethods Technical Expert at the Cap Ampère site. He also has two years of solid experience developing Talend jobs.
    Apache Kafka
    Java
    Unix Shell
    FTP
    Talend Data Integration
    SOAP
    XML
    Apache NiFi
    Python
    PostgreSQL
    ETL
    Oracle Database
  • $20 hourly
    Software engineer with 8+ years of experience as both a backend and full-stack developer, working mainly with Java/Spring and React.
    Apache Kafka
    Test Automation Framework
    Hibernate
    Stored Procedure Development
    PostgreSQL
    Oracle
    SQL
    MQTT
    REST API
    React
    Kubernetes
    Docker
    AWS Cloud9
    Spring Boot
    Java
  • $100 hourly
    I'm a full-stack developer for software or web development. My favorite stack is Java, but I can easily work full stack (a role I have already held in past positions) with React, Angular, or Vue.js on the frontend. I am also interested in working in C++ or Scala; I would be a beginner on those stacks, but I am very motivated.
    Apache Kafka
    Scala
    C++
    MongoDB
    SQL
    Angular
    React
    Vert.x
    Spring Boot
    Java
  • $25 hourly
    Results-Driven DevOps Engineer | Driving Continuous Integration, Deployment, and Infrastructure Automation Success | SQL Server Administration
    Apache Kafka
    Microsoft SQL Server Administration
    Grafana
    Elasticsearch
    Prometheus
    Docker Compose
    Docker
    SonarQube
    Microservice
    CI/CD
    Terraform
    Jenkins
    Git
    Kubernetes
    DevOps Engineering
  • $30 hourly
    I'm a Java Developer and can help you with Core Java and Java GUI programming projects, including Java Spring Boot. My expertise:
    - Data Structures
    - OOP (Object-Oriented Programming)
    - Methods
    - Databases
    - Graphical User Interfaces (GUI)
    - Command-Line Interfaces (CLI)
    - Exception Handling
    - File Reading/Writing
    Apache Kafka
    Oracle
    Apache Maven
    MVC Framework
    Java
    Spring Boot
    Software
  • $30 hourly
    Welcome to my Upwork profile! 🚀 With an engineering degree from the École des Mines Paris, I'm a versatile data specialist focused on cutting-edge technologies. My academic background combined with professional experience has given me solid expertise in data engineering and analysis. Here are the types of missions I regularly undertake:
    Data Engineer 🚀
    - 😎 Proficient in using Airflow to orchestrate complex workflows
    - 😊 Precise data modeling and construction of reliable pipelines to the data lake
    - 🚀 Use of Apache Kafka for real-time data streaming
    - 🐍 Storage of unstructured data with MongoDB
    - 🐳 Deployment of Docker and Kubernetes environments for portability and scalability
    - 🤖 Expertise in Python for development
    - 📦 Seamless integration of machine learning with MLOps
    Data Analyst 💻
    - 📊 Transforming data into impactful visualizations with the Elastic Stack (ELK)
    - 😊 Expertise in Python for data analysis
    - 🔍 Guidance at every step of the process, from data integration to in-depth exploration
    Cloud Architect (AWS) ☁️
    - 🌐 Building scalable architectures in the AWS cloud
    - ☁️ Deployment of Amazon Web Services (AWS)
    - 🛠️ Implementation of Continuous Integration (CI) and Continuous Delivery (CD) in AWS
    - ☁️ Deployment on Google Cloud Platform (GCP)
    To kickstart your project and benefit from comprehensive expertise from frontend to backend, feel free to contact me today! 🌟
    Apache Kafka
    Machine Learning
    NoSQL Database
    SQL
    Node.js
    React
    Back-End Development
    Apache Airflow
    Python
    Big Data
    Artificial Intelligence
    Amazon Web Services
  • Want to browse more freelancers?
    Sign up

How hiring on Upwork works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Kafka Developer near Paris on Upwork?

You can hire an Apache Kafka Developer near Paris on Upwork in four simple steps:

  • Create a job post tailored to your Apache Kafka Developer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Kafka Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Kafka Developer profiles and interview.
  • Hire the right Apache Kafka Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Kafka Developer?

Rates charged by Apache Kafka Developers on Upwork can vary based on a number of factors, including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Kafka Developer near Paris on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Kafka Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Kafka Developer team you need to succeed.

Can I hire an Apache Kafka Developer near Paris within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Kafka Developer proposals within 24 hours of posting a job description.