Hire the best Apache Spark Engineers in Morocco
Check out Apache Spark Engineers in Morocco with the skills you need for your next job.
- $20 hourly
- 5.0/5
- (4 jobs)
👋 Hello there! I'm a highly skilled and independent Full-Stack engineer, fueled by a deep passion for building exceptional web platforms. 🚀 With an unwavering attention to detail, I pride myself on crafting clean code that adheres to industry standards and embraces best practices and design principles. Here's why working with me brings a multitude of benefits:
✨ Immaculate Code: I adhere to language standards and meticulously craft clean, maintainable code that is easy to understand and enhance. This approach accelerates development, minimizes errors, and facilitates future scalability.
📞 Seamless Communication: I prioritize clear and transparent communication throughout our collaboration. By establishing a strong line of contact, I ensure you are continuously updated on project progress. Long-term relationships matter to me, and I remain committed to seeing projects through to completion without any sudden drop-offs.
⚡ Performance Optimization: I possess extensive expertise in optimizing performance and scalability for web projects. With my guidance, your platforms will be finely tuned to handle increased traffic, delivering a flawless user experience even during peak periods.
During my career, I have successfully worked with a diverse range of technologies. Here are some of my core proficiencies:
🌐 Frontend: I excel in React and Angular, enabling me to create stunning and intuitive user interfaces that captivate users and enhance their journey.
💻 Backend: I'm well-versed in NodeJS, Spring Boot, and Go, enabling me to architect robust and scalable server-side applications that power your web platforms.
🗄️ Database: I bring experience with MySQL, PostgreSQL, MongoDB, and Redis, allowing me to design and implement efficient and reliable data storage solutions tailored to your specific requirements.
🚀 DevOps: Proficiency in technologies such as Kubernetes, Docker, and GitLab empowers me to streamline the deployment and management of your applications, fostering seamless integration and swift iterations.
🔧 Collaboration Tools: I am fluent in a range of collaboration tools, including Git, Slack, Trello, Jira, and Confluence. These tools enable efficient teamwork and project management, ensuring smooth communication and streamlined workflows.
By leveraging my skills and expertise, you can expect high-quality code, seamless communication, and optimal performance for your web platforms. Let's collaborate and transform your ideas into extraordinary digital experiences! 🌟
Apache Spark, GitLab, Linux, MongoDB, Scala, TypeScript, Angular, Spring Boot, Docker, React, Kubernetes, Redis, JavaScript, Google Cloud Platform, Java
- $40 hourly
- 5.0/5
- (10 jobs)
As a Data Engineer 🔧, I transform complex data into clear, actionable insights 💡. My journey has taken me through collaborations with various companies, where I've tackled projects ranging from data modeling to crafting robust, scalable, and high-performance data pipelines. I've mastered the art of administrating data lake and data warehouse solutions and building comprehensive testing coverage and seamless CI/CD pipelines. If data engineering challenges are slowing you down, you've just found your solution.
But don't just take my word for it—here's what my clients have to say about my work:
⭐⭐⭐⭐⭐ "Abderrahim completed this task very swiftly and efficiently. He performed all duties requested and went above and beyond by submitting the finished product before the deadline. I would highly recommend Abderrahim and I look forward to working with him again in the future!"
Apache Spark, BigQuery, Data Lake, dbt, Apache Airflow, Data Mining, Data Extraction, Data Visualization, Data Analysis, SQL, Machine Learning, ETL Pipeline, Python, Google Cloud Platform, Data Engineering
- $40 hourly
- 5.0/5
- (12 jobs)
Passionate Software Engineer and AI Practitioner with a strong background in full-stack development, data engineering, and applied machine learning. I specialize in designing reliable, scalable systems and turning abstract ideas into real, high-impact solutions.
- Tech Stack: Python, JS/TS, Laravel, Odoo, MongoDB, MySQL
- Focus Areas: Data Science · LLM Evaluation · AI Integration · System Architecture
Apache Spark, Cloudera, CakePHP, Apache HBase, Apache Hadoop, Laravel, Python, PHP, MongoDB, JavaScript
- $60 hourly
- 5.0/5
- (4 jobs)
Hello! I'm Abdelmalik, an experienced Quantitative Researcher and Machine Learning Scientist with expertise in leveraging advanced AI models to transform complex data into actionable insights. Here's what I bring to your projects:
🔹 Machine Learning & Deep Learning:
• ✅ Expertise in TensorFlow, Keras, PyTorch, and IBM SPSS Modeler
• ✅ Neural network design with optimizations (SWISH, SMOTENC)
• 📈 Dimensionality reduction (PCA, t-SNE)
• 🧠 GPU-accelerated training and inference (CUDA, CuPy)
🗣️ Natural Language Processing (NLP):
• 📝 Advanced NLP models: BERT, Longformer, SpaCy, NLTK
• 💬 Text classification, named entity recognition, lexical embedding, sentiment analysis, outlier detection
• 🔥 Deployment of NLP models using Streamlit and FastAPI
• 🗃️ Text vectorization (TF-IDF, Doc2Vec)
• 🛠️ Data enrichment and preprocessing with Regex and NLTK
🎯 Deep Learning & Computer Vision:
• 📸 Object detection and segmentation (OpenCV, Detectron2)
• 📹 Real-time video processing with FFmpeg and OpenCV
📊 Quantitative Research & Financial Modeling:
• 📉 Black-Scholes, Dupire, Heston, and Binomial option pricing models
• 💹 Volatility modeling and calibration (Dupire, Heston)
• 📊 Portfolio optimization (Markowitz, Momentum-based, Multi-periodic Binomial Trees)
• 📊 Monte Carlo simulations, Value-at-Risk calculation, risk forecasting
🎓 Academic Excellence:
• 🎖️ Certifications from Columbia, Yale, Stanford, Wharton, and New York Institute of Finance
• 🎓 Specialized in Financial Engineering, Risk Management, and Machine Learning for Trading
💡 My goal: Turn complex data into strategic insights and powerful solutions tailored for your success. Let's connect to bring your projects to life! 🌟
Apache Spark, Elasticsearch, Docker, Data Visualization, Data Analysis, Artificial Intelligence, Apache Hadoop, Convolutional Neural Network, Streamlit, Machine Learning, TensorFlow, Deep Learning, Natural Language Processing, PyTorch, Computer Vision
- $30 hourly
- 4.9/5
- (10 jobs)
Software backend ★ DevOps ★ Database performance tuning and recovery ★ CI/CD ★ Elasticsearch ★ Azure ★ Google Cloud ★ AWS
I build containerized backend APIs on cloud servers, manage and deploy microservices applications using DevOps practices, tune database performance for better caching and latency, set up CPU/memory-optimized clusters in the cloud for data science applications, and optimize the compression, storage, and access of big data using Apache Spark.
We provide Data Distribution, Data API delivery, Data Analysis, and DevOps support services. The technologies we use are the following:
• Cloud providers: AWS, Azure, Google Cloud, DigitalOcean
• Scripting languages: Python, Bash/Shell scripting, Java
• CI/CD integration: Azure DevOps, Travis CI, GitLab, Jenkins, Terraform
• Databases: MySQL, PostgreSQL, TimescaleDB
• Libraries for data analysis: Pandas, Dask
• Containerization: Docker, Podman, Buildah, Docker Compose, Kubernetes, Helm
• Observability: Prometheus, Grafana, Jaeger, Kibana
• Big data frameworks: Elasticsearch, Apache Spark, Dask
• ML/DL frameworks: Scikit-Learn, XGBoost, PyTorch, Keras, TensorFlow
• Web frameworks: Django, Flask, FastAPI
• Data visualization: Dash, Plotly, Bokeh
• API delivery: REST API, JSON API
• Automation: Selenium
DevOps automation: We build fully automated continuous integration/continuous delivery pipelines, from build through release to production.
Data distribution services: We provide relevant data for your needs - no software, hardware, or data-fetching skills needed - we do the job as you request.
Data API requests: We build custom APIs for websites with rate-limited or data-limited APIs so that you can use the data in your applications.
Data analysis: Gain powerful insight metrics from your data and improve your performance indicators.
Machine learning solutions: We build machine learning models based on the data we gather for you, so you can build a powerful decision-making process.
Apache Spark, Microsoft Azure, Kubernetes, Amazon Web Services, Elasticsearch, Google Cloud Platform, DevOps, Redis, Docker, Jenkins, CI/CD, Terraform, Amazon EC2, Microservice, Python
- $25 hourly
- 5.0/5
- (3 jobs)
🚀 Welcome to the Future of AI Integration! 🤖✨
As an Expert AI Integrator with a profound mastery of Data Science, Deep Learning, and the art of ChatGPT Fusion, I am dedicated to reshaping the realm of AI-driven innovation. 🌟 Backed by a rock-solid academic foundation and hands-on experience, I transform visionary concepts into tangible, real-world applications.
🔥 Igniting Growth Stories: I'm not just about buzzwords – I'm about driving real results. Picture this: I skyrocketed a Canadian enterprise, "Herbaly," from a modest $14,456 in daily sales to an impressive $22,543. How? By crafting and deploying a cutting-edge stock market prediction model infused with LSTM and powered by the sheer force of Apache Spark. Data analytics was the backbone, turning insights into tangible revenue growth.
🏥 Revolutionizing Healthcare Dynamics: Enter the era of MedGPT – an innovation that streamlined medical information accessibility for a hospital. Imagine a chatbot that not only engages in dynamic conversations but also processes medical images for seamless treatment. The result? Patient numbers surged, and the hospital's reputation soared to a stellar 4.1-star rating. I thrive on translating AI prowess into medical excellence.
🔐 Enhancing Security, Elevating Experience: Security meets AI in a groundbreaking system I devised for a security-focused company. By wielding AI classification models, I engineered a solution that ingeniously shuts down phones during simultaneous usage, revolutionizing user experience. The company's star rating skyrocketed from 3.7 to a dazzling 4.5 stars. I don't just integrate AI; I transform businesses.
🔍 My Centers of Excellence:
Mastering Data Insights: Unveiling actionable insights from intricate datasets to empower informed decisions and enrich user interactions.
Deep Learning Prowess: Harnessing the very essence of AI's potential, pushing boundaries with dynamic learning systems.
Seamless Conversational AI: Infusing human-like dialogues with AI's intelligence through ChatGPT integration.
Technical Wizardry: A versatile skill set to flawlessly blend AI solutions within diverse technical landscapes.
💡 Tools & Gadgets I Command:
🔹 Data Analyst's Arsenal
🔹 ChatGPT API Sorcery
🔹 Deep Learning | Machine Learning Artistry
🔹 Django | React.js for Web Development
🔹 Flutter and Kotlin | Integrating TFLite models in mobile apps
🌟 Let's Craft the Future Together! Embark on this transformative journey alongside me. Together, we shall forge AI-powered innovations that rewrite the norms and create resounding impacts. The future is not just AI; it's a harmonious convergence of technology and human ingenuity. 🚀 Let's make remarkable history – are you ready? 💥
🌌 Shaping Possibilities, One AI at a Time
Apache Spark, ChatGPT API Integration, Artificial Intelligence, Chatbot, Docker, React, Data Science, Apache Hadoop, MongoDB, Django, Microsoft Power BI, Apache Cassandra, Deep Learning Modeling, Machine Learning Model, Python
- $17 hourly
- 5.0/5
- (2 jobs)
Hello, I'm Zakaria, a seasoned data engineer with extensive experience in both data engineering and analysis. Over the years, I've actively contributed to numerous projects, assisting clients in maximizing the value of their data. I passionately engage in building various projects, exploring the latest technologies and trends in the field. Working with a diverse range of technologies, I've developed a strong foundation and cultivated a keen interest in discovering and adopting emerging trends. Now, I'm eager to share my knowledge with others and facilitate a smooth start for individuals entering the dynamic and evolving IT landscape.
I have earned several certifications in technologies such as AWS, Azure, and Snowflake. Over my years of experience I have worked on different projects:
- Building a data lake in the cloud for a client, where I built many data pipelines (Python, PySpark, Kafka) and mechanisms for alerting and monitoring
- Building a data analytics platform using Snowflake and Azure
- Building a data platform on Snowflake and preparing data for AI models
- Data migration from Snowflake to S3
- Data pipelines on Airflow
- A monitoring system with MongoDB, Flask, Vue.js, and Docker
I can help you take your business performance to the next level by providing the following services:
- Data Engineering
- Data Analysis
- Big Data Analysis
- ETL Pipelines
Apache Spark, Apache Kafka, Apache Airflow, dbt, SQL, Docker, Data Scraping, Flask, MongoDB, Amazon Web Services, Microsoft Azure, Snowflake, Artificial Intelligence, Big Data, Python
- $50 hourly
- 4.8/5
- (7 jobs)
Highly skilled engineer with several years of experience across multiple IT fields and technologies. I'm certified in cloud computing (GCP, Azure), big data technologies (Hadoop, Kafka, Spark), DevOps (Jenkins), programming languages (Java, Python, Scala, ...), and analytics & insights (Tableau, Power BI).
Apache Spark, Databricks Platform, Apache NiFi, Google, Azure Cosmos DB, Jenkins, Microsoft Azure SQL Database, Cloudera, Apache Kafka, DevOps, Big Data, Tableau, Python, Java, Cloud Computing
- $30 hourly
- 4.8/5
- (5 jobs)
I help clients, agencies, and startups develop their data pipelines, data lakes, and data warehouses. Additionally, I focus on optimizing the costs associated with their data solutions, orchestrating automated processes for rapidly growing businesses, and addressing any data-related challenges.
My Services:
✅ Build and optimize ETL/ELT data pipelines
✅ Design and implement data lake and data warehouse architectures
✅ Data modeling, integration, and mapping (Star, Snowflake schema)
✅ Automate and orchestrate data pipelines using Apache Airflow and Dagster
✅ Write complex and optimized SQL queries
✅ Build APIs using FastAPI and Flask
✅ Streaming data pipelines using Confluent Cloud (Kafka, Flink, ksqlDB, Schema Registry)
✅ Web crawling and scraping
✅ Data platform migration & modernization
✅ Deploy data platforms (Spark, Kafka, Airflow, Dagster, NiFi) with Docker & Kubernetes
✅ Data pipeline emergency fixes & troubleshooting
Technical Expertise:
► Cloud Platforms: AWS (Certified), GCP
► Data Warehousing: Snowflake, BigQuery
► Streaming: Apache Kafka, Apache Flink, Confluent Cloud
► Data Engineering: Spark, Apache Airflow, Apache NiFi, Dagster, dbt
► Backend Development: FastAPI, Flask, Express.js, Supabase (PostgreSQL)
► Data Visualization: Looker Studio, Streamlit, Plotly
► Languages: Python, SQL, Java
My Approach:
► Free 30-minute consultation to understand your project goals
► Collaborative brainstorming to identify optimal solutions
► Clear project plan with defined scope, tools, and timeline
► Full technical documentation for easy handoff
Let's talk about how I can help simplify your data workflows or fix urgent pipeline issues. Send me a message and I'll respond within a few minutes.
Apache Spark, BigQuery, Snowflake, dbt, Looker Studio, Amazon Web Services, Apache Flink, Data Warehousing, Apache Airflow, Apache NiFi, Apache Kafka
- $35 hourly
- 0.0/5
- (0 jobs)
"When clients call, it's because they want the best. I deliver." I’m a Data Engineer, Data Architect, and Web Scraping Expert with a sharp eye for performance, structure, and clean execution. I don’t just write pipelines — I build data engines that move fast, scale well, and never compromise security or quality. Whether you need to: Design a solid data architecture from scratch, Build resilient ETL pipelines that scale with your business, Scrape millions of records without getting blocked, Or turn raw data into structured power — I’m the partner who gets it done, without excuses. 🧠 What I Bring to the Table ✔ Scalable Data Pipelines Built using Python, Apache Airflow, Spark, and SQL — for both batch and real-time needs. ✔ Smart Architecture I design with both structure and flexibility in mind: MongoDB, PostgreSQL, BigQuery, and hybrid setups. ✔ Elite Web Scraping Stealthy, scalable scrapers with Playwright, Puppeteer, Scrapy, and Selenium. I handle anti-bot protection, proxies, captchas — all the stuff that breaks cheap scrapers. ✔ Cloud & Automation Ready AWS, GCP, Docker, GitHub Actions — everything you need for clean deployment and smooth operation. 🧩 Skills Summary Languages: Python, SQL, JavaScript (Node.js), Bash Tools & Frameworks: Airflow, Spark, dbt, Kafka, FastAPI, Express Web Scraping: Playwright, Puppeteer, Scrapy, Selenium, BeautifulSoup Databases: MongoDB, PostgreSQL, MySQL, BigQuery, Firebase Cloud Platforms: AWS (Glue, Lambda, S3), GCP DevOps & Infra: Docker, Git, CI/CD, Terraform (if needed) Standards: GDPR / CNDP compliance, Secure Data Handling If you're looking for clean architecture, battle-tested pipelines, or stealth scrapers that never get caught, I'm ready. No micromanagement. No hand-holding. Just clean results.Apache Spark
Amazon S3Microsoft Azure AdministrationDatabricks PlatformWeb Scraping SoftwareSQL ProgrammingData Warehousing & ETL SoftwareETL PipelineData EngineeringMicrosoft Power BIPython - $30 hourly
- 5.0/5
- (1 job)
Areas of interest: team spirit, sense of responsibility, autonomy.
LinkedIn: @abla-rahmouni-31596b185
I hold a Master's degree in Business Intelligence and Smart Vision (MIDVI) from Sidi Mohammed Ben Abdellah University (USMBA) in Fez. As a recent graduate, I am now actively seeking opportunities in the domain of data science and artificial intelligence to further apply and expand my expertise.
Apache Spark, Product Development, Web Application, Big Data, Web Development, Apache Kafka, Apache Hadoop
- $15 hourly
- 5.0/5
- (1 job)
🚀 Data Analyst & BI Consultant | 4+ Years Experience I specialize in transforming complex data into clear, actionable insights 📊. With expertise in Power BI, SQL, and advanced analytics tools, I create interactive dashboards, optimize data models, and automate reporting processes 🛠️. Whether you need to streamline your data pipeline or make smarter business decisions, I’m here to help you achieve your goals with precision and efficiency. Let’s unlock the full potential of your data together! 💡Apache Spark
Data EngineeringData LakeBusiness IntelligenceMicrosoft Power BI DevelopmentMicrosoft Power BI Data VisualizationMicrosoft Power BISQL Server Reporting ServicesSQL Server Integration ServicesMicrosoft SQL SSASMicrosoft SQL ServerData Analytics & Visualization SoftwareData Analysis - $27 hourly
- 4.4/5
- (2 jobs)
I am a cloud and data solutions architect with a solid foundation in cloud computing, big data engineering, and digital transformation. With several years of experience across industries like retail, logistics, aviation, public sector, and finance, I specialize in designing scalable, cloud-native architectures that help organizations modernize their infrastructure and accelerate innovation.
I started my career in Big Data, working on end-to-end data pipelines and processing frameworks, before moving into data engineering and then solution architecture. This journey has given me a deep understanding of both the technical and strategic aspects of cloud transformation. I've led critical projects involving cloud migration, enterprise architecture design, smart city initiatives, and advanced analytics — often working directly with executive stakeholders and cross-functional teams.
My core expertise lies in Google Cloud Platform (GCP), with additional experience on AWS and Azure. I'm also well-versed in creating presales strategies, technical proposals, and go-to-market plans. I bridge the gap between technical complexity and business value, ensuring that every solution I design is not only technically sound but also aligned with the client's long-term goals.
Key competencies:
- Cloud Architecture (GCP, AWS, Azure)
- Big Data & Data Engineering
- Solution Architecture & Technical Presales
- Smart Cities & Digital Twins
- Retail & Logistics Optimization
- Cloud Migration & Modernization
- Executive Presentations & Client Engagement
- Strategic Technology Planning
I am passionate about building sustainable, high-impact systems that solve real business problems. I thrive in environments where innovation, strategy, and execution intersect — and I'm always looking for the next opportunity to contribute to meaningful transformation.
Apache Spark, Salesforce, Talend Data Integration, Amazon Web Services, Microsoft Azure, Amazon Redshift, Google Cloud Platform, Unix Shell, Selenium, Apache Hadoop, Data Engineering, SQL, Apache Airflow, Java, Apache Kafka, Python, Scala
- $25 hourly
- 5.0/5
- (2 jobs)
My role as a data analyst is to help you extract the stories and insights that are relevant and rich in meaning for your business. I strive to achieve this in a number of ways:
✅ Organize the datasets: Nobody likes working in a messy environment. Why should it be different with data? Datasets can be huge, and cleanliness and good organization are mandatory.
✅ Make datasets beautiful: It should be easy to read the data. Consistent and clean formatting doesn't only ensure correct data structures, it also makes the data easier to work with.
✅ Make the collection of data an easy process: Today every business works with large datasets, and the more it can get, the better. The collection of data can be a nightmare, though. Getting your datasets from the platforms and tools you use to run your business should be automatic, consistent, and done as fast and as frequently as possible, ideally instantly.
✅ Tell stories, don't just show charts: It's part art, part technical knowledge. This is probably the biggest challenge for any data analyst and data scientist. Telling the relevant stories and unearthing the interesting insights will always be the most interesting part of my job 🖤
My experience with data analytics spans over 3 years of working with Excel, Power BI, Tableau, Talend, JasperReports, SQL, and databases. I'm also a big data developer; I use Python, Hadoop, Spark, Hive, MapReduce, Hue, and Apache NiFi.
Apache Spark, Database, Database Query, SQL, Data Scraping, Data Entry, Talend Data Integration, Machine Learning, Microsoft Power BI, Statistics, Tableau, Data Analysis Expressions, Python, Business Intelligence, Microsoft Excel
- $25 hourly
- 0.0/5
- (0 jobs)
Hello, my name is Mohemd Elmkohtar. I have been working as a back-end web developer for 2 years and am continuing to gain experience year after year. My educational background includes a bachelor's degree in software engineering, and effective on-the-job training has made me more skilled in this field and able to work in any circumstances. I am passionate about websites and back-end coding. I like to experiment with code and get the best output from my work. I have mastered languages like Python, Java, HTML, CSS, and C++. I can solve problems analytically and can face any complex situation with a focus on fixing it. I also handle customers in a friendly yet professional manner. I guarantee enthusiasm for my work and full commitment to your project. Try me and you won't regret it. Thank you.
Apache Spark, Machine Learning, NoSQL Database, Big Data, Apache Hadoop, Qt Framework, C, pandas, PostgreSQL, Python, Django, SQL, MySQL, Java, C++
- $23 hourly
- 5.0/5
- (3 jobs)
Greetings! 👋 I'm El Houcine Es Sanhaji, a seasoned Data and Software Engineer with a passion for turning complex concepts into seamless, functional realities. With 4 years in the industry, I bring a wealth of expertise to the table, offering top-notch solutions tailored to your unique needs.
🌐 Why Choose Me?
✨ Data Alchemist: I specialize in turning raw data into actionable insights. Whether it's data analysis, processing, or visualization, I thrive on transforming information into strategic assets for your business.
💡 Innovative Coding Architect: Crafting robust and scalable software is my forte. From conceptualization to deployment, I architect solutions that not only meet but exceed expectations, ensuring a smooth digital journey.
🛠️ Tech Toolbox: Proficient in a myriad of technologies including Python, Spark, SQL, NoSQL, OCI, and GCP, I bring a versatile skill set to tackle any engineering challenge that comes my way.
🤝 Collaborative Approach: Your vision is my mission. I believe in collaborative partnerships, working closely with clients to understand their objectives and delivering results that elevate their digital presence.
🚀 What I Offer:
📊 Data Engineering: Unlock the power of your data with advanced processing, integration, and analysis.
💻 Software Development: From MVPs to full-scale applications, I code solutions that align with your business goals.
🔄 Maintenance & Optimization: Ensuring your systems run smoothly with regular check-ups and performance enhancements.
Let's embark on a journey of innovation together! If you're ready to take your data and software projects to new heights, let's connect. I'm here to turn your ideas into digital brilliance. 💡✨
Apache Spark, Data Warehousing & ETL Software, Flask, FastAPI, Google Cloud Platform, Recommendation System, Dashboard, Oracle Cloud, Apache Airflow, Data Engineering, Data Management, Machine Learning, Docker, Oracle Database, Python
- $35 hourly
- 0.0/5
- (0 jobs)
Experienced Data & AI Engineer proficient in designing and implementing data solutions and AI applications. Skilled in building end-to-end data pipelines and developing advanced AI models for efficient data retrieval. Committed to delivering scalable, high-quality solutions and transforming business requirements into robust technical implementations.Apache Spark
Business IntelligenceDockerApache HadoopGoogle Cloud PlatformMicrosoft AzureDeep Learning ModelingWeb Scraping FrameworkData Analytics & Visualization SoftwareMachine Learning ModelBig DataLLM Prompt EngineeringJavaSQLPython - $50 hourly
- 0.0/5
- (0 jobs)
Data Engineer with over 6 years of experience, specialized in data processing and analysis for leading companies worldwide. Strong expertise in Data Lake projects and a deep understanding of the end-to-end data value chain with both on-premises and cloud-based architectures. GCP Professional Data Engineer certified.
Apache Spark, Data Warehousing, Data Visualization, Python, Scala, Google Cloud Platform, Data Lake, Big Data, Cloudera, BigQuery, Apache Hadoop, Apache Hive, Elasticsearch, Apache Kafka, ETL Pipeline
- $20 hourly
- 0.0/5
- (0 jobs)
I am a highly motivated backend developer passionate about building scalable, efficient applications using Spring Boot and microservices architecture. With a solid grounding in distributed systems and DevOps practices, I thrive on tackling backend complexities with clean, reliable solutions. I am also deeply interested in leveraging big data tools and cloud-native technologies to optimize performance and scalability.
Technical Skills
• Backend Development & APIs
  o Spring Boot: Proficient in developing RESTful APIs and microservices with Spring Boot, focusing on modularity, performance, and security.
  o Microservices Architecture: Skilled in designing distributed, fault-tolerant systems. Familiar with advanced techniques in service-to-service communication, such as gRPC, GraphQL, and event-driven patterns using Kafka.
  o Java & JEE: Extensive experience with Java and Java Enterprise Edition, with a strong understanding of Object-Oriented Programming (OOP) principles, design patterns, and best practices.
• DevOps & Containerization
  o Docker & Kubernetes: Proficient in containerizing applications for consistent environments across stages and deploying them in Kubernetes clusters for scalability and resilience.
  o Version Control & Collaboration: Advanced knowledge of Git for version control, managing codebases, and collaborating effectively within agile teams.
• Data Engineering & Big Data Ecosystem
  o Kafka & Airflow: Skilled in setting up real-time data streaming pipelines with Kafka, ensuring high-throughput, fault-tolerant data ingestion. Experienced with Airflow for orchestrating complex workflows in data pipelines.
  o Data Storage: Proficient with HDFS for distributed storage, Cassandra for scalable database management, and familiar with Spark for large-scale data processing tasks.
  o ETL Processes: Can build end-to-end ETL pipelines that transform, clean, and load data across different platforms.
• Frontend Technologies
  o Angular: Knowledge of component-based frameworks like Angular enhances my ability to deliver seamless, full-stack solutions when needed.
• Database Management
  o Relational & NoSQL Databases: Skilled with PostgreSQL and MySQL for structured data, and Cassandra and MongoDB for NoSQL solutions. This flexibility supports efficient data storage and retrieval across different project needs.
Apache Spark, PostgreSQL, MySQL, Apache Cassandra, Apache Kafka, Apache Airflow, Angular, CI/CD, Kubernetes, Docker, ETL Pipeline, AWS Application, Spring Boot
- $30 hourly
- 0.0/5
- (2 jobs)
I am an experienced DevOps Engineer and System Administrator with over two years of hands-on experience in managing, automating, and optimizing infrastructure for high-availability systems. My expertise spans cloud platforms, containerization (Docker, Kubernetes), CI/CD pipelines, and scripting in Python, Bash, and PowerShell.
Key Skills:
- DevOps & System Administration: Managed large-scale infrastructures, automated backups, and deployed robust environments using cloud services like AWS (S3) and on-premises servers.
- CI/CD Pipeline Development: Created end-to-end CI/CD pipelines using GitLab, Jenkins, and Docker, automating testing, building, and deployment processes to ensure seamless integration.
- Containerization: Extensive experience with Docker, Kubernetes, and LXC for efficient container orchestration and management, ensuring optimized, secure, and scalable application deployments.
- Infrastructure Automation: Leveraged Terraform, Ansible, and systemd for automating infrastructure and creating robust, repeatable environments.
- Web & Database Technologies: Experience with Node.js, Java, Python, Django, and databases like MongoDB and PostgreSQL.
Sample Projects:
- Developed an automated backup system with S3 integration using Python and Bash, complete with cron jobs and systemd services.
- Configured and managed CI/CD pipelines for multiple web applications, reducing deployment time by 50% while ensuring quality with automated testing.
- Implemented Docker and Kubernetes solutions to streamline application deployment and management, enhancing scalability and reducing downtime.
With a solid foundation in both development and operations, I'm focused on providing solutions that are both highly efficient and cost-effective. If you are looking for a reliable freelancer to assist with system automation, cloud deployments, or full-stack development, feel free to reach out. I'm ready to bring my expertise to your project and help take it to the next level!
Apache Spark, ETL, JupyterLab, Apache Hadoop, Hive, Azure DevOps, Terraform, Zabbix, ELK Stack, Prometheus, Python, Kubernetes, Docker, LXD, Linux
- $12 hourly
- 0.0/5
- (0 jobs)
About Me:
Hello! I'm Adnane, and I'm passionate about data analysis. With 4 years of experience in this industry, I've honed my skills to deliver top-notch results for my clients.
Why Choose Me:
✅ Expertise: I specialize in Power BI development, Power BI visualization, and working with data, allowing me to provide tailored solutions that meet your specific needs.
✅ Quality: I'm committed to delivering high-quality work that exceeds expectations. Attention to detail and a dedication to excellence are the cornerstones of my work ethic.
✅ Communication: Clear and prompt communication is essential in any successful project. I'm responsive and proactive, ensuring that you're always informed about the project's progress.
✅ Client-Centric Approach: I believe in a client-centric approach, which means your satisfaction is my priority. I'm here to listen to your ideas, address your concerns, and collaborate closely to achieve your goals.
My Services:
I offer a range of services, including creating Power BI dashboards, connecting data sources to Power BI, and cleaning data. Whether you're looking for a complete project or need assistance with a specific task, I've got you covered.
Let's Get Started:
If you're ready to take your project to the next level, I'm here to help. Feel free to reach out, and let's discuss how I can assist you in achieving your objectives. I look forward to working with you and delivering results that make a difference.
Apache Spark, Tableau, Apache Airflow, PySpark, Microsoft Power BI Data Visualization, Microsoft Power BI Development, Microsoft Power BI, Sqoop, Apache Impala, Apache Hive, Apache Hadoop, Big Data, Python, SQL
- $15 hourly
- 0.0/5
- (0 jobs)
Hello ;) I am a software and data engineer with strong enthusiasm for AI and two years of experience developing web applications and AI solutions. I have done multiple projects that integrate AI models into web applications to do some magic AI stuff. Here are all the technical skills that I have developed during my 4 years of programming:
Programming languages: Python, JavaScript, and Java.
Web development toolkit: Django, Flask, FastAPI, Spring Boot, MySQL, HTML, CSS, Bootstrap, Docker, Git.
Cloud providers: Microsoft Azure, Oracle Cloud Infrastructure (OCI).
Data engineering toolkit: Apache Spark, Apache Kafka, Apache Airflow.
Machine learning toolkit: Scikit-learn, OpenCV, TensorFlow, PyTorch.
Web scraping: BeautifulSoup, requests, Selenium, Scrapy.
I am open to work and I can guarantee you that QUALITY MATTERS :)
Apache Spark, LLM Prompt Engineering, Microsoft Azure, Data Engineering, Apache Airflow, Apache Kafka, Data Science, Data Mining, Web Development, Deep Learning, Machine Learning, Web Scraping, JavaScript, FastAPI, Python
- $15 hourly
- 0.0/5
- (0 jobs)
Data Analyst & AgTech Specialist | 3x Microsoft Certified Helping farms & agri-startups grow smarter using AI, ML, and data-driven solutions 🌾💡 🌐 Visit my portfolio: faissalmouflla.comApache Spark
DockerMicrosoft Power BIKibanaTalend Data IntegrationElasticsearchApache CassandraMongoDBApache AirflowApache HBaseApache KafkaApache HadoopMicrosoft AzureSQLPython - $8 hourly
- 0.0/5
- (0 jobs)
Greetings! I'm Brahim, a passionate Data Engineer and Data Analyst from Morocco. I specialize in ETL, data warehousing, and transforming raw data into actionable insights. With experience working with various Fortune companies, I excel in Python, Scala, SQL, NoSQL, Apache Spark, Apache Hadoop, GCP, and Power BI. I prioritize a customer-first approach, ensuring satisfaction with creative, quality-driven outcomes. Looking for an expert who's always available? Let's connect and bring your tech vision to life!
Apache Spark, Scala, Microsoft Power BI Data Visualization, Big Data, Apache Kafka, Google Cloud Platform, SQL, Python, Apache Hadoop, BigQuery, dbt, Apache Airflow
- $17 hourly
- 0.0/5
- (0 jobs)
I'm a data engineer with extensive experience in Big Data Analytics and Artificial Intelligence. My expertise lies in building and managing robust data pipelines, developing advanced machine learning models, and implementing real-time data solutions to drive actionable insights and strategic decision-making. Whether you need to handle large datasets, create predictive models, or integrate live data streams, I can help.
- Proficient in Apache Spark and Apache Kafka for real-time data processing and streaming. Skilled in Hadoop for distributed data storage and processing.
- Expertise in Python for data manipulation, analysis, and machine learning. Proficient in SQL for database querying and management.
- Experienced in developing and deploying machine learning models using libraries such as TensorFlow and Scikit-learn.
- Proficient in using Django for developing user-friendly web interfaces that facilitate seamless interaction with complex data systems.
- Comprehensive project management from initial concept to deployment. Experienced in agile methodologies and version control using Git.
- Regular communication is key to successful collaboration, so let's stay connected.
I am passionate about leveraging data to solve complex problems and drive innovation. Let's work together to transform your data into valuable insights and solutions.
Apache Spark, Django, Java, Scala, Docker, Talend Data Integration, Microsoft Power BI, PyTorch, Python Scikit-Learn, Apache Kafka, Data Engineering, Big Data, Data Science
Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.