Hire the best Apache NiFi developers

Check out Apache NiFi developers with the skills you need for your next job.
  • $95 hourly
    => Let's Connect Hello, I'm Dima, a seasoned CyberSecurity Specialist and Turnkey Infrastructure Expert specializing in BigData solutions and data analysis, utilizing a DevOps approach. => Expertise Overview With a strong passion for constructing SOC, SOAR, and SIEM solutions, my primary focus lies in developing data ingestion, enrichment, and analysis pipelines, ensuring they are highly available and fault-tolerant. My expertise extends to building central logging and real-time processing platforms from the ground up, optimizing them for performance, security, and reliability across multiple environments, whether in the cloud or on-premise. => Value Proposition My commitment is to deliver solutions that not only centralize security and threat intelligence but also facilitate enhanced control over data, ultimately contributing to infrastructure cost savings. => Technological Summary CyberSecurity: Wazuh, Suricata, pfSense | BigData: Kafka, ElasticSearch, OpenSearch | Data Processing: FluentD, Vector.dev, Apache NiFi | Infra as Code: Terraform, cdktf, cdk8s | Virtualization: Proxmox, VMware | Containerization: Kubernetes | Clouds: AWS, Hetzner, DigitalOcean, Linode | Automation: Jenkins, GitHub Actions | Monitoring: Zabbix, Grafana, Kibana, Prometheus, Thanos | Mail: MailCow SMTP/IMAP, Postfix | VPN: OpenVPN Server | Programming: Bash, Python, TypeScript | Operating Systems: CentOS, RHEL, Rocky Linux, Ubuntu, Debian => Personal Attributes • Leadership: Leading by example with a team-first approach • End-to-End Execution: Proficient from POC to Enterprise-level implementation • Resilience: Demonstrating high thoroughness and endurance • Adaptability: A quick, can-do architect and experienced troubleshooter • Optimization: Adept in process and performance optimization • Documentation: Skilled technical documentation writer • Vision: A visionary in technological implementation and solution provision
    Apache NiFi
    Elasticsearch
    Linux System Administration
    Apache Kafka
    Apache Hadoop
    Email Security
    Machine Learning
    ELK Stack
    Cloudera
    Zabbix
    MySQL
    Big Data
    PfSense
    Red Hat Administration
    Proxmox VE
    Amazon Web Services
  • $30 hourly
    Seasoned data engineer with over 11 years of experience in building sophisticated and reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years. 🏆 The main areas of expertise are: 📍 Big Data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink 📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR 📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL 📍 Google Cloud Services - GCP Dataproc 📍 Search Engine - Apache Solr 📍 NoSQL - HBase, Cassandra, MongoDB 📍 Platform - Data Warehousing, Data Lake 📍 Visualization - Power BI 📍 Distributions - Cloudera 📍 DevOps - Jenkins 📍 Accelerators - Data Quality, Data Curation, Data Catalog
    Apache NiFi
    SQL
    AWS Glue
    PySpark
    Apache Cassandra
    ETL Pipeline
    Apache Hive
    Apache Kafka
    Big Data
    Apache Hadoop
    Scala
    Apache Spark
  • $175 hourly
    Mr. Joshua B. Seagroves is a seasoned professional having served as an Enterprise Architect/Senior Data Engineer for multiple Fortune 100 Companies. With a successful track record as a startup founder and CTO, Mr. Seagroves brings a wealth of experience to his role, specializing in the strategic design, development, and implementation of advanced technology systems. Throughout his career, Mr. Seagroves has demonstrated expertise in architecting and delivering cutting-edge solutions, particularly in the realm of data engineering and sciences. He has successfully spearheaded the implementation of multiple such systems and applications for a diverse range of clients. As part of his current responsibilities, Mr. Seagroves actively contributes to the prototyping and research efforts in the field of data engineering/data science, specifically in the development of operational systems for critical mission systems. Leveraging his extensive background in architecture and software modeling methodologies, he has consistently led and collaborated with multidisciplinary teams, successfully integrating various distributed computing technologies, including Hadoop, NiFi, HBase, Accumulo, and MongoDB. Mr. Seagroves' exceptional professional achievements and extensive experience make him a highly sought-after expert in his field. His comprehensive knowledge and hands-on expertise in advanced technology systems and big data make him a valuable asset to any organization.
    Apache NiFi
    YARN
    Apache Hadoop
    Big Data
    Apache Zookeeper
    TensorFlow
    Apache Spark
    Apache Kafka
    Artificial Neural Network
    Artificial Intelligence
  • $85 hourly
    FHIR, HL7v2, HL7v3, C-CDA, Carequality HIE, Mirth Connect, Apache NiFi, MU, EDI, EHR/EMR Functional Model. Back-end – JavaScript, Java, SQL, SmileCDR, Aidbox. • Certified HL7 FHIR R4 Proficiency • Certified HL7v2 Control Specialist • Certified HL7 CDA Specialist • Certified HL7v3 RIM Specialist • IHE Certified Professional - Foundations --- FHIR --- * Mirth Connect based FHIR integration, FHIR Server on Mirth (HAPI FHIR library). * FHIR (RESTful, event-based messaging and documents paradigms, profiling with Forge). * HL7v2 to/from FHIR mapping (e.g., ADT, ORU, OML message types). * C-CDA Level 1, Level 3 to/from FHIR mapping. * FHIR tools (Touchstone, Simplifier, Smile CDR, Forge). * Canadian Core/Baseline FHIR profiles editor. * IG Publishing (IG Publisher, FSH - FHIR Shorthand, SUSHI). * Apache NiFi custom FHIR processors. --- CMS Compliance --- * US Core profiles / IPS profiles / CA Baseline profiles * CARIN Blue Button / CMS Blue Button 2.0 * Da Vinci PDEX Plan Net * Da Vinci PDEX US Drug Formulary * Da Vinci Payer Data Exchange (ePDx) --- HL7 --- * Mirth Connect based HL7v2/HL7v3 integration. * Apache NiFi custom HL7v2 processors * HL7v2 conformance profiles (documentation quality level 3). * Refined and constrained versions of HL7v3 interactions based on Visio models. * HL7v3 messaging (Batch wrapper, Query Infrastructure; Claims, Lab, Patient Administration, Personnel Management domains). * Conformance testing of HL7v2.x/HL7v3 interactions implementation. * Development of HL7v2.x and HL7v3 specifications and Implementation Guides using Messaging Workbench, RMIM Designer, V3Generator. * Canadian HIAL, OLIS, HRM interfaces. --- C-CDA (Consolidated CDA) --- * CDA parsing library for Mirth Connect. * Document templates (e.g., CCD, NHSN CDA). * Mirth Connect based C-CDA document templates implementation and transformation. * Development of CDA templates specifications (Level 2, Level 3 section and entry templates). * CDA document templates modeling (MDHT Modeling or ART-DECOR). * Conformance testing of C-CDA documents. --- EHR / EMR / PHR --- * Software development of EMR solutions (using Mirth Connect, Java, JavaScript, XML Schema, XSLT, Schematron). * HL7 EHR System Functional Model and Profiles (e.g., Meaningful Use Functional Profile for ONC/NIST Test Procedures, HL7 PHR System FM). --- IHE ITI Profiles --- * Carequality HIE * OpenHIE * IHE profiles specifications and development: XDS, XDS.b, XDS-I.b, XCA, XCPD, MPQ, DSUB, XDM. * IHE HL7v3 profiles: PIXv3, PDQv3. * IHE FHIR profiles: MHD, PIXm, NPFSm, PDQm, mRFD. * Audit and security domains: ATNA, BPPC, IUA, XUA. Experience with: SmileCDR, Carequality, Quest, LabCorp, AllScripts, eClinicalWorks (eCW), CRISP, MUSE, OpenHIE, etc.
    Apache NiFi
    API Development
    Electronic Data Interchange
    FHIR
    Java
    Mirth Connect
    Health Level 7
    Electronic Medical Record
    HIPAA
    ECMAScript for XML
    XSLT
    XML
    API Integration
    JavaScript
  • $50 hourly
    "She is very good at coding. She is the best and the go-to person for any Hadoop or NiFi requirements." "Abha is a star; she handled the project in a very professional manner. I will definitely be working with Abha again; I am very happy with the quality of the work. 🙏" "Abha Kabra is one of the most talented programmers I have ever met on Upwork. Her communication was top-notch, she met all deadlines, and she was a skilled developer, super fast on any task given to her. Perfect work. Would re-hire and highly recommend!!" Highly skilled and experienced Big Data engineer with over 5 years of experience in the field and a strong background in the analysis, design, and development of Big Data and Hadoop-based projects using technologies like the following: ✅ Apache Spark with Scala & Python ✅ Apache NiFi ✅ Apache Kafka ✅ Apache Airflow ✅ ElasticSearch ✅ Logstash ✅ Kibana ✅ MongoDB ✅ Grafana ✅ Azure Data Factory ✅ Azure Pipelines ✅ Azure Databricks ✅ AWS EMR ✅ AWS S3 ✅ AWS Glue ✅ AWS Lambda ✅ GCP ✅ Cloud Functions ✅ PostgreSQL ✅ MySQL ✅ Oracle ✅ MongoDB ✅ Ansible ✅ Terraform ✅ Logo/Book Cover Design ✅ Technical Blog Writing A proven track record of delivering high-quality work that meets or exceeds client expectations. Deep understanding of energy-related data, IoT devices, the hospitality industry, the retail market, ad-tech, and data-encryption projects, and has worked with a wide range of clients, from Marriott and P&G to Vodafone UK and eXate UK. Able to quickly understand client requirements and develop tailored solutions that address their unique needs. Very communicative and responsive, ensuring that clients are kept informed every step of the way. A quick learner, always eager to explore new technologies and techniques to better serve clients. Familiar with Agile methodology, with active participation in daily scrum meetings, sprint meetings, and retrospective meetings, and experienced in all phases of the project life cycle. A strong team player and a leader with good interpersonal and communication skills, ready to take on independent challenges.
    Apache NiFi
    PySpark
    Databricks Platform
    ETL Pipeline
    Big Data
    Grafana
    Kibana
    Apache Kafka
    Apache Spark
    PostgreSQL
    Microsoft Azure
    MongoDB
    Scala
    Python
    Elasticsearch
    Google Cloud Platform
    Amazon Web Services
  • $18 hourly
    Who am I: More than 20 years of experience and universal knowledge give me the ability to combine a wide range of technologies, build complicated heterogeneous systems, and automate business processes. For the last 5 years I have worked with Alfresco Content Services. My greatest value is universality. I deeply understand technology, from the electron crossing the Fermi level in semiconductors to business process automation in the organizational structure of large companies. That's why I can work out almost any integration solution. What we can do: We can deploy Alfresco Content Services in test, development, and production environments, upgrade and migrate it from previous versions, and create backup and disaster recovery plans. We can integrate it into the user environment, synchronise users with a centralised authentication management system, set up SSO login, and choose document access, editing, and OCR technologies, etc. We can integrate Alfresco into your corporate ecosystem, applications, API gateways, and databases. We can build custom data models, add document classifications and additional metadata, and create business process automation for document management. We can create a production environment for any application with Docker/Kubernetes, and a development environment with version control and automated CI/CD pipelines, for example with on-premise GitLab or in GCP. Short list of base technologies: - Docker, Docker Compose, Kubernetes... - Linux, Debian, Bash, Python... - Git, GitLab, CI/CD, DevOps... - Nginx, proxy, DNS, SMTP... - SSL/TLS, Kerberos, SSO, SAML... - Java, JavaScript, SQL, PostgreSQL, CMIS... - Apache, Tomcat, NiFi, Elasticsearch/Kibana, WSO2, QGIS... - Google Cloud Platform, any cloud and hosting solutions....
    Apache NiFi
    Docker
    Kubernetes
    JavaScript
    Docker Compose
    Elasticsearch
    Apache Tomcat
    Alfresco Content Services
    Kerberos
    SSL
    Java
    Linux System Administration
    Linux
    Google Cloud Platform
  • $20 hourly
    Experienced Software Engineer with a demonstrated history of working in the information technology and services industry. Skilled in Java, Spring Boot, DevOps, Jenkins, Ansible, Eureka, React, and Groovy.
    Apache NiFi
    Docker
    Linux
    Apache Spark MLlib
    DevOps
    Ansible
    Apache Hadoop
    Big Data
    Apache Spark
    Elasticsearch
    Python
    Cloud Computing
    JavaScript
    Java
  • $70 hourly
    🎓 𝗖𝗲𝗿𝘁𝗶𝗳𝗶𝗲𝗱 𝗗𝗮𝘁𝗮 𝗣𝗿𝗼𝗳𝗲𝘀𝘀𝗶𝗼𝗻𝗮𝗹 with 𝟲+ 𝘆𝗲𝗮𝗿𝘀 of experience and hands-on expertise in Designing and Implementing Data Solutions. 🔥 4+ Startup Tech Partnerships ⭐️ 100% Job Success Score 🏆 In the top 3% of all Upwork freelancers with Top Rated Plus 🏆 ✅ Excellent communication skills and fluent English If you’re reading my profile, you’ve got a challenge you need to solve and you are looking for someone with a broad skill set, minimal oversight and ownership mentality, then I’m your go-to expert. 📞 Connect with me today and let's discuss how we can turn your ideas into reality with creative and strategic partnership.📞 ⚡️Invite me to your job on Upwork to schedule a complimentary consultation call to discuss in detail the value and strength I can bring to your business, and how we can create a tailored solution for your exact needs. 𝙄 𝙝𝙖𝙫𝙚 𝙚𝙭𝙥𝙚𝙧𝙞𝙚𝙣𝙘𝙚 𝙞𝙣 𝙩𝙝𝙚 𝙛𝙤𝙡𝙡𝙤𝙬𝙞𝙣𝙜 𝙖𝙧𝙚𝙖𝙨, 𝙩𝙤𝙤𝙡𝙨 𝙖𝙣𝙙 𝙩𝙚𝙘𝙝𝙣𝙤𝙡𝙤𝙜𝙞𝙚𝙨: ► BIG DATA & DATA ENGINEERING Apache Spark, Hadoop, MapReduce, YARN, Pig, Hive, Kudu, HBase, Impala, Delta Lake, Oozie, NiFi, Kafka, Airflow, Kylin, Druid, Flink, Presto, Drill, Phoenix, Ambari, Ranger, Cloudera Manager, Zookeeper, Spark-Streaming, Streamsets, Snowflake ► CLOUD AWS -- EC2, S3, RDS, EMR, Redshift, Lambda, VPC, DynamoDB, Athena, Kinesis, Glue GCP -- BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Data Fusion Azure -- Data Factory, Synapse. HDInsight ► ANALYTICS, BI & DATA VISUALIZATION Tableau, Power BI, SSAS, SSMS, Superset, Grafana, Looker ► DATABASE SQL, NoSQL, Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, PL/SQL, HBase, Cassandra ► OTHER SKILLS & TOOLS Docker, Kubernetes, Ansible, Pentaho, Python, Scala, Java, C, C++, C# 𝙒𝙝𝙚𝙣 𝙮𝙤𝙪 𝙝𝙞𝙧𝙚 𝙢𝙚, 𝙮𝙤𝙪 𝙘𝙖𝙣 𝙚𝙭𝙥𝙚𝙘𝙩: 🔸 Outstanding results and service 🔸 High-quality output on time, every time 🔸 Strong communication 🔸 Regular & ongoing updates Your complete satisfaction is what I aim for, so the job is not complete until you are satisfied! Whether you are a 𝗦𝘁𝗮𝗿𝘁𝘂𝗽, 𝗘𝘀𝘁𝗮𝗯𝗹𝗶𝘀𝗵𝗲𝗱 𝗕𝘂𝘀𝗶𝗻𝗲𝘀𝘀 𝗼𝗿 𝗹𝗼𝗼𝗸𝗶𝗻𝗴 𝗳𝗼𝗿 your next 𝗠𝗩𝗣, you will get 𝗛𝗶𝗴𝗵-𝗤𝘂𝗮𝗹𝗶𝘁𝘆 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀 at an 𝗔𝗳𝗳𝗼𝗿𝗱𝗮𝗯𝗹𝗲 𝗖𝗼𝘀𝘁, 𝗚𝘂𝗮𝗿𝗮𝗻𝘁𝗲𝗲𝗱. I hope you become one of my many happy clients. Reach out by inviting me to your project. I look forward to it! All the best, Anas ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad is really great with AWS services and knows how to optimize each so that it runs at peak performance while also minimizing costs. Highly recommended! ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ You would be silly not to hire Anas, he is fantastic at data visualizations and data transformation. ❞ 🗣❝ Incredibly talented data architect, the results thus far have exceeded our expectations and we will continue to use Anas for our data projects. ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ The skills and expertise of Anas exceeded my expectations. The job was delivered ahead of schedule. He was enthusiastic and professional and went the extra mile to make sure the job was completed to our liking with the tech that we were already using. I enjoyed working with him and will be reaching out for any additional help in the future. I would definitely recommend Anas as an expert resource. ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad was a great resource and did more than expected! I loved his communication skills and always kept me up to date. I would definitely rehire again. ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Anas is simply the best person I have ever come across. Apart from being an exceptional tech genius, he is a man of utmost stature. We blasted off with our startup, high on dreams and code. We were mere steps from the MVP. Then, pandemic crash. Team bailed, funding dried up. 
My partner and I were stranded, and dread gnawed at us. A hefty chunk of cash, Anas and his team's livelihood, hung in the balance. It felt like a betrayal. We scheduled a meeting with Anas to let him know we were quitting and to ask to repay him gradually over a year; he heard us out. Then, something magical happened. A smile. "Forget it," he said, not a flicker of doubt in his voice. "The project matters. Let's make it happen!" We were floored. This guy, owed a small fortune, just waved it away? Not only that, he offered to keep building, and even pulled his team in to replace our vanished crew. As he spoke, his passion was a spark that reignited us. He believed. In us. In our dream. In what he had developed so far. That's the day Anas became our partner. Not just a contractor, but a brother in arms. Our success story owes its spark not to our own leap of faith, but to the guy who had every reason to walk away. Thanks, Anas, for believing when we couldn't.❞
    Apache NiFi
    Solution Architecture Consultation
    AWS Lambda
    ETL Pipeline
    Data Management
    Data Warehousing
    AWS Glue
    Apache Spark
    Amazon Redshift
    ETL
    Python
    SQL
    Marketing Analytics
    Big Data
    Data Visualization
    Artificial Intelligence
  • $75 hourly
    As a freelancer and data engineer, I concentrate on database and ETL-related projects, query performance, and database structure optimization. I have worked with many types of databases for more than 15 years, primarily PostgreSQL, MySQL, MS SQL, and Redshift, but on projects I have also worked with many others, such as Snowflake, Cloudera, DB2, and Oracle. Big portfolio of ETL projects with Talend Data Integration and NiFi. Certified Talend Developer (Data Integration, Big Data). Microsoft Certified Professional. I continuously extend my expertise and knowledge and am open to new challenges.
    Apache NiFi
    Snowflake
    SQL Programming
    Amazon Redshift
    Data Analysis
    Microsoft SQL Server Programming
    SQL
    Database Design
    Data Migration
    PostgreSQL
    Database Administration
    ETL
    MySQL
    Talend Open Studio
  • $175 hourly
    As a skilled data engineer, developer, and biomedical engineer, I have spent the last several years honing my expertise to offer top-tier services to clients. My speciality is in data engineering, including data warehousing, ETL optimization, data modeling, data governance, and data visualization. I also have a strong background in machine learning and MLOps. This has allowed me to develop a command of a wide range of technologies, including Hadoop, DataBricks, Docker, Terraform, Apache Spark, Collibra, big data, business intelligence tools, cloud computing (AWS, GCP, and Azure), TensorFlow, Keras, and Kubernetes. My preferred languages are Python, R, SQL, Rust, and Java, but I am also comfortable using other programming languages. I have included examples of recent projects in my portfolio to showcase my capabilities. In addition, I have developed a strong blockchain development skill set, including proficiency in JavaScript, Rust, and Solidity, using Truffle, Hardhat, and Alchemy. I specialize in smart contract development and auditing, and I am confident in my abilities to help clients develop and deploy effective blockchain solutions. As a medical device specialist, I have a wealth of experience in regulatory and quality control, as well as all areas of the product life cycle. I have worked on class 1, class 2, and class 3 devices in several areas, including software applications, dermatological, diagnostic biotech, diagnostic radiology, orthopedic, cardiac, urology, and physical therapy devices. I am skilled in completing regulatory documents such as design history folders, risk assessments, 510(k), DIOVV, quality plans, and verification/validation testing, and have included examples of these documents and public policy papers I have written in my portfolio. I offer services in data engineering, machine learning, medical device and overall software development, and I am confident that my skills and experience can deliver results that exceed your expectations. Please feel free to contact me to discuss your project and how I can help you achieve your goals. Let's connect and make your project a success.
    Apache NiFi
    Dashboard
    Data Visualization
    Data Mining
    ETL
    API Development
    Compliance Consultation
    Rust
    HIPAA
    Solidity
    Medical Device
    Blockchain
    Data Engineering
    Machine Learning
    Data Science
    SQL
  • $200 hourly
    Big Data, Machine Learning, and Data Science are my passion. In that field, I have focused mostly on NLP, NLU, and OCR/handwriting recognition. I love to teach and train people in the area and show how it can be used.
    Apache NiFi
    Database
    SQLAlchemy
    Apache Hadoop
    Big Data
    SQL Server Integration Services
    SQL Server Reporting Services
    Microsoft SQL Server
    SQL
    Machine Learning
    Apache Spark
  • $35 hourly
    I am a data engineering expert with more than 5 years of experience in data ingestion, integration, and manipulation. To date, I have completed many projects in data engineering and big data. I have worked on business analytics and telco analytics, using multiple data platforms and frameworks such as Cloudera Data Platform, NiFi, RStudio, Spark, Hadoop, Kafka, and more. If this is what you need, get in touch with me.
    Apache NiFi
    Cloud Engineering
    Cloudera
    Apache Hadoop
    Data Warehousing
    Linux
    Apache Spark
    Data Lake
    Data Analysis
    SQL
    Big Data
    Business Intelligence
    Scala
    Apache Hive
    Python
  • $30 hourly
    ✋ Hi, I am an experienced Data Engineer with 9+ years of experience. I have developed many Big Data applications for analysis and real-time analytics with optimized performance. I have been involved in all kinds of data-related development, e.g., Data Warehousing, Data Engineering, Big Data, Data Integration from different sources, Business Intelligence, and Data Science. 🎓 I have a Bachelor of Science in Computer Science (BSCS). My core competency lies in complete end-to-end management of new projects. I also have a keen understanding of business trends, which I share with my clients as suggestions; most of the time they take them, and it adds a new level to their delivery. I am committed to providing the best service at the most economical cost and, in turn, building a list of satisfied, loyal clients.
    Apache NiFi
    Microsoft Azure
    Amazon Web Services
    MySQL Programming
    Bash Programming
    Big Data
    Data Science
    Amazon S3
    Database
    PySpark
    Database Design
    Apache Spark
    Python
    SQL
  • $110 hourly
    Top-rated developer working (mostly) with big data, artificial intelligence, machine learning, analytics & back-end architecture. I specialize in Big Data (Hadoop, Apache Spark, Sqoop, Flume, Hive, Pig, Scala, Apache Kudu, Kafka, Python, shell scripting, core Java, Machine Learning). As a Big Data architect I work as part of a team responsible for building and designing applications for online analytics. Outgoing, motivated team player eager to contribute dynamic customer service, administrative, supervisory, team building, and organizational skills towards supporting the objectives of an organization that rewards reliability, dedication, and solid work ethics with opportunities for professional growth. Skill set: Hadoop, Spark, Scala, Python, Bash, Tableau, Jenkins, Ansible, HBase, Sqoop, Flume, Neo4j, Machine Learning, Java, NiFi, AWS, Azure, GCP, Databricks, Datameer, Kafka, Confluent, Schema Registry, SQL, DB2, CDC. Why should you hire me? ✅ 1400+ productive Upwork hours logged with 100% customer satisfaction » Passion for Data Engineering and Machine Learning » Experience with functional Scala: shapeless, cats, itto-csv, neotypes » Familiar with the Hadoop ecosystem: Apache Spark, Hive, YARN, Apache Drill, Sqoop, Flume, Zookeeper, HDFS, MapReduce, Machine Learning, Airflow » Worked with JWT authentication, reactive JDBC-like connectors for PostgreSQL, MySQL & MariaDB, and reactive MongoDB » Microservices expert; worked mostly with Lagom, Akka Persistence, event sourcing » Defining scalable architectures on top of AWS, Google Cloud, DigitalOcean, Alibaba Cloud » ElasticSearch stack pro: ElasticSearch, Logstash, Beats, Kibana » Efficient project manager Let's discuss your idea and build the next big thing!
    Apache NiFi
    Google Cloud Platform
    Apache HBase
    Snowflake
    Machine Learning
    Apache Spark MLlib
    Databricks Platform
    ETL Pipeline
    AWS Glue
    Apache Hive
    Scala
    SQL
    Docker
    Apache Kafka
    Apache Spark
    Apache Hadoop
  • $100 hourly
    — TOP RATED PLUS Freelancer on UPWORK — EXPERT VETTED Freelancer (Among the Top 1% of Upwork Freelancers) — Full Stack Engineer — Data Engineer ✅ AWS Infrastructure, DevOps, AWS Architect, AWS Services (EC2, ECS, Fargate, S3, Lambda, DynamoDB, RDS, Elastic Beanstalk, AWS CDK, AWS CloudFormation, etc.), serverless application development, AWS Glue, AWS EMR Frontend Development: ✅ HTML, CSS, Bootstrap, JavaScript, React, Angular Backend Development: ✅ Java, Spring Boot, Hibernate, JPA, Microservices, Express.js, Node.js Content Management: ✅ WordPress, Wix, Squarespace Big Data: ✅ Apache Spark, ETL, Big Data, MapReduce, Scala, HDFS, Hive, Apache NiFi Database: ✅ MySQL, Oracle, SQL Server, DynamoDB Build/Deploy: ✅ Maven, Gradle, Git, SVN, Jenkins, QuickBuild, Ansible, AWS CodePipeline, CircleCI As a highly skilled and experienced Lead Software Engineer, I bring a wealth of knowledge and expertise in the areas of Java, Spring, Spring Boot, Big Data, MapReduce, Spark, React, graphics design, logo design, email signatures, flyers, web development (HTML, CSS, Bootstrap, JavaScript & frameworks, PHP, Laravel), responsive web page development, WordPress design, and testing. With over 11 years of experience in the field, I have a deep understanding of Java, Spring Boot, and Microservices, as well as Java EE technologies such as JSP, JSF, Servlet, EJB, JMS, JDBC, and JPA. I am also well-versed in Spring technologies including MVC, IoC, Security, Boot, Data, and Transaction. I possess expertise in web services, including REST and SOAP, and am proficient in various web development frameworks such as WordPress, PHP, Laravel, and CodeIgniter. Additionally, I am highly skilled in JavaScript, jQuery, React.js, Angular.js, Vue.js, Node.js, C#, and ASP.NET MVC. In the field of big data, I have experience working with MapReduce, Spark, Scala, HDFS, Hive, and Apache NiFi. I am also well-versed in cloud technologies such as PCF, Azure, and Docker. Furthermore, I am proficient in various databases including MySQL, SQL Server, and Oracle. I am familiar with different build tools such as Maven, Gradle, Git, SVN, Jenkins, QuickBuild, and Ansible.
    Apache NiFi
    Apache Spark
    Database
    WordPress
    Cloud Computing
    Spring Framework
    Data Engineering
    NoSQL Database
    React
    Serverless Stack
    Solution Architecture Consultation
    Spring Boot
    DevOps
    Microservice
    AWS Fargate
    AWS CloudFormation
    Java
    CI/CD
    Amazon ECS
    Containerization
  • $25 hourly
    • Certification in Big Data/Hadoop Ecosystem • Big Data Environments: Google Cloud Platform, Cloudera, Hortonworks, AWS, Snowflake, Databricks, DC/OS • Big Data Tools: Apache Hadoop, Apache Spark, Apache Kafka, Apache NiFi, Apache Cassandra, YARN/Mesos, Oozie, Sqoop, Airflow, Glue, Athena, S3 buckets, Lambda, Redshift, DynamoDB, Delta Lake, Docker, Git, Bash scripts, Jenkins, Postgres, MongoDB, Elasticsearch, Kibana, Ignite, TiDB • Certification in SQL Server, Database Development, and Crystal Reports • SQL Server Tools: SQL Management Studio, BIDS, SSIS, SSAS, and SSRS • BI/Dashboarding Tools: Power BI, Tableau, Kibana • Big Data Development Programming Languages: Scala and Python. ======================================================================= ************************************* Big Data Engineer ********************************************** • Hands-on experience with Google Cloud Platform, BigQuery, Google Data Studio, and Flow • Developing ETL pipelines for SQL Server using SSIS • Reporting and analysis using SSIS, SSRS, and SSAS cubes • Extensive experience with Big Data frameworks and open-source technologies (Apache NiFi, Kafka, Spark and Cassandra, HDFS, Hive, Docker, Cassandra, PostgreSQL, Git, Bash scripts, Jenkins, MongoDB, Elasticsearch, Ignite, TiDB) • Managing data warehouse and Big Data cluster services and development of data flows • Writing Big Data/Spark ETL applications for different sources (SQL, Oracle, CSV, XML, JSON) to support different departments with analytics • Extensive work with Hive, Hadoop, Spark, Docker, and Apache NiFi • Supporting different departments with big data analytics • Built multiple end-to-end fraud monitoring and alerting systems • Preferred languages are Scala and Python. ************ Big Data Engineer – Fraud Management at VEON ************* • Developed an ETL pipeline from Kafka to Cassandra using Spark in Scala • Used Big Data tools with Hortonworks and AWS (Apache NiFi, Kafka, Spark and Cassandra, Elasticsearch) • Dashboard development in Tableau and Kibana • Writing complex SQL Server queries, procedures, and functions • Developing ETL pipelines for SQL Server using SSIS • Reporting and analysis using SSIS, SSRS, and SSAS cubes • Developing and designing automated email reports • Offline data analytics for fraud detection and setting up controls for prevention • SQL database development • System support of fraud management.
    Apache NiFi
    Google Cloud Platform
    SQL Programming
    Data Warehousing
    Database
    AWS Glue
    PySpark
    MongoDB
    Python Script
    Docker
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Apache Kafka
    Apache Hive
  • $85 hourly
    - 15 years of experience in Data Science, Data warehouse, Business Intelligence, advanced analytics, ETL/ELT, Data visualization, Virtualization, database programming and data engineering. - Experience in Machine learning, especially on customer360, linear regression and decision trees. - Specialized in the end to end of Business Intelligence and analytics implementations - ER (Entity Relationship) modeling for OLTP and dimension modeling (conceptual, Logical, Physical) for OLAP. - Have experience running startup companies and building SaaS products, including CRM (Customer Relationship Management) and Data Orchestration low-code tools. - Experience working in Agile Scrum and methodologies (2 and 3-week sprints) - Excellent communication skills, a good understanding of business and client requirements - Good at technical documentation, POCs (Proof of Concepts) - Good at discussions with Stakeholders for requirements and Demos - Convert business requirements into Technical design documents with pseudo-code - Dedicated, work with minimal supervision. - Eager to learn new technologies, can explore and learn and develop quickly on client-owned applications. - Expert in SQL, T-SQL, PLSQL, knows advanced functions and features of it, good at database programming. - good at Performance Tuning, clustering, Indexing, Partitioning and other DBA activities. - DBA activities like Database backup/recovery, monitoring Database health, killing Long-running queries and suggesting better tuning options. - Good at database programming and normalized techniques (all 3 normal forms). - Expert in Azure Synapse, PostgreSQL, MongoDB, Dynamo DB, Google Data Studio, Tableau, Sisense, SSRS, SSIS, and more. - Domain knowledge in Telecom, Finance/Banking, Automobile, Insurance, Telemedicine, Healthcare and Virtual Clinical Trials (CT). - Extensive DBA knowledge and work experience in SQL Server, Login management, database backup and restore, monitoring database loads, and tuning methods. - Exceptionally well in Azure ML and regression models Expertise: Database: Snowflake, Oracle SQL and PLSQL (OCP certified), SQL Server, T-SQL, SAP HANA, Azure SQL Database, Azure Synapse Analytics, Teradata, Mysql, No SQL, PostgreSQL, and MongoDB ETL: Azure Data Factory, DBT, SSIS, AWS Glue, Matillion CDC & ETL, Google Big Query, Informatica PC and Cloud, ODI, Data Stage, MSBI (SSIS, SSAS) Reporting/Visualization: Sisense, QlikSense, Sigma Computing, Metabase, Qlikview, SSRS, Domo, Looker, Tableau, Google Data Studio, Amazon QuickSight and PowerBI Scripting Language: Unix, Python and R Cloud Services: Google Cloud Platform (Big Query, Cloud functions, Data Studio), MS Azure (Azure Blob Storage, Azure Functional Apps, Logic Apps, Azure Data Lakehouse, Databricks, Purview, ADF and Microservices), Azure ML, AWS RDS EC2, S3, and Amazon Redshift, Step functions, Data Pipelines Data Virtualization: Denodo
    Apache NiFi
    C#
    Snowflake
    ETL
    Data Warehousing
    Business Intelligence
    Data Visualization
    Azure Machine Learning
    Qlik Sense
    Looker
    Sisense
    Microsoft Power BI
    SQL
    Tableau
  • $50 hourly
    🗲 Available now 🥇 Top rated ⏲️ 12,000+ Upwork hours ⭐ 100%+ rating ✅ Full-time and long term 🌎 24x7 support 😝 Fun loving 10 years of experience as a DevOps Engineer with extensive experience across all major cloud providers: AWS, GCP, and Azure. My blog: roshannagekar.blogspot.com I have vast experience with the following tools and technologies: ⚙⚙⚙⚙⚙ 🄳🄴🅅🄾🄿🅂 ⚙⚙⚙⚙⚙ ★ Configuration Management: Ansible, Chef, Puppet ★ AWS Administration: EC2, RDS, S3, CloudFront ★ CI Tools: Jenkins, Bamboo ★ Deployment: Fabric, Capistrano, Ansible ★ Provisioning: Terraform, CloudFormation ★ Cloud Providers: AWS, GCP, Azure, DigitalOcean ★ Containerization: Docker Compose, Kubernetes ★ Source Code: Git, SVN, GitLab, GitHub ★ Monitoring (Free): Nagios, Zabbix, Prometheus ★ Monitoring (Paid): New Relic, Datadog, Pingdom ★ On-Call Support: VictorOps, PagerDuty ★ High Availability: HAProxy, F5, ELB, Envoy ★ Scripting: Bash, Python, Ruby ★ Documentation: Confluence, TWiki ★ Infra. Testing: Test Kitchen, Apache Benchmark ★ Databases: MySQL, PostgreSQL, Oracle ★ Servers: Apache, Nginx, IIS, vsftpd ★ Operating Systems: Unix, Red Hat/Ubuntu, Windows ★ Virtualization: VirtualBox, VMware, Vagrant ★ Build Tools: Makefile, Ant, Maven ★ Ticketing: JIRA, HPQC, Bugzilla ★ QA Tools: Selenium, Sikuli, SoapUI ★ File Transfer: FileZilla, WinSCP, s3cmd 🛡🛡🛡🛡🛡 🄸🄽🄵🄾🅂🄴🄲 🛡🛡🛡🛡🛡 ⮕ Kali Linux ⭐⭐⭐⭐ ⮕ Metasploit ⭐⭐⭐ ⮕ Maltego ⭐⭐⭐ ⮕ Burp Suite ⭐⭐⭐ ⮕ Fiddler ⭐⭐⭐⭐ ⮕ Kismet ⭐⭐⭐ ⮕ Wireshark ⭐⭐⭐⭐ ⮕ ZAProxy ⭐⭐⭐⭐ ⮕ Wazuh ⭐⭐⭐⭐⭐ 📅 DevOps Event Organizer: 📅 meetup.com/DevOps-Pune/members/?op=leaders
    Apache NiFi
    Software Testing
    DevOps
    WordPress
    MySQL
    Customer Service
    Unix
    Technical Writing
    Google App Engine
    LAMP Administration
    Amazon Web Services
  • $25 hourly
    PROGRAMMING TECHNOLOGY EXPERTISE * Python, Django, FastAPI, Flask, Selenium, REST API * React.js, Next.js, Vue.js, Angular * React Native * Flutter DEVOPS & CLOUD & CYBER SECURITY EXPERTISE * AWS cloud solution design and development * OpenSearch, Elasticsearch, Kibana, Logstash based setup, configuration, and development integration * Ansible * Docker * Jenkins * GitLab-based CI/CD * Prometheus and Grafana * SIEM * Suricata/Snort * Bro (Zeek) * HashiCorp Vault * Cyber security project-related development and consultation * Kong API gateway integration
    Apache NiFi
    Amazon Elastic Beanstalk
    Flutter
    React Native
    PostgreSQL Programming
    ELK Stack
    AWS CloudFront
    Amazon S3
    RESTful API
    AWS Lambda
    DevOps
    Next.js
    React
    Python
    Django
    AWS Amplify
  • $40 hourly
    Seeking challenging tasks in the design and development of scalable backend infrastructure solutions. I work in the following domains: 1. ETL Pipelines 2. Data Engineering 3. DevOps & AWS (Amazon Web Services) and GCP (Google Cloud Platform) deployment 4. Machine Learning I mainly design all solutions in Python. I have 10+ years of experience in Python and extensive experience with the following frameworks/libraries - Flask, Django, Pandas, NumPy, PyTorch, Scrapy, and many more. Regarding ETL pipelines, I mainly provide end-to-end data pipelines using AWS/GCP/custom frameworks. I have more than 7 years of experience in this domain. I have a strong command of Scrapy and have built more than 300 crawlers to date. Regarding data warehousing, I have extensive experience with Google BigQuery and AWS Redshift, and hands-on experience handling millions of records and analyzing them using GCP and AWS data warehousing solutions. I have 5+ years of experience in designing serverless applications using AWS and GCP. In addition, I am hands-on with a wide range of GCP and AWS services and provide efficient and cost-effective solutions on them.
    Apache NiFi
    Data Analysis
    Apache Spark
    PySpark
    ChatGPT
    Generative AI
    AWS Glue
    Google Cloud Platform
    BigQuery
    Snowflake
    Kubernetes
    Django
    Docker
    Serverless Stack
    Python
    Scrapy
    Data Scraping
    ETL Pipeline
  • $40 hourly
    Hi, I am Isha Taneja from Mohali, India, highly skilled in Data Analytics, Engineering & Cloud Computing. I am an expert in creating ETL data flows in Talend Studio, Databricks & Python using best design patterns and practices to integrate data from multiple data sources. I have worked on multiple projects requiring data migration, data warehousing development, and API integration. Expertise: 1. Migration - Platform Migration - Legacy ETL to Modern Data Pipeline / Talend / ERP Migration / CRM Migration - Data Migration - Salesforce Migration / Hubspot Migration / Cloud Migration / ERP Migration 2. Data Analytics - Data Lake Consulting - Data Warehouse Consulting - Data Modelling / Data Integration / Data Governance / ETL - Data Strategy - Data Compliance / Data Deduplication / Data Reconciliation / Customized Data Processing Framework / Data Streaming / API Implementation / DataOps - Business Intelligence - Digital Marketing Analysis / E-commerce Analytics / ERP Reporting Capabilities - Big Data - Lakehouse Implementation 3. Software QA & Testing 4. Custom Application Development - UI/UX - Frontend Development - Backend Development 5. Cloud - Cloud-native Services / AWS Consulting / Cloud Migration / Azure Consulting / Databricks / Salesforce 6. Business Process Automation - Bi-directional sync between applications / RPA A Data Professional and an ETL Developer with 10+ years of experience working with enterprises/clients globally to define their implementation approach with the right data platform strategy, data analytics, and business intelligence solutions. My domain expertise lies in E-Commerce, Healthcare, HR, Media & Advertising, and Digital Marketing. You have the data? Great! I can help you analyze it using Python, performing exploratory data analysis, hypothesis testing, and data visualization. You have Big Data? Even better! I can help you clean, transform, store, and analyze it using big data technologies and productionize it using cloud services like AWS and Azure. You want to track business KPIs and metrics? No problem! I can also help you develop reports using Tableau and Power BI; this will always keep you ahead in your business. Specialities: Databases: Snowflake, Postgres, DynamoDB, Graph DB - Neo4j, MongoDB, Data Warehouse concepts, MSSQL. ETL Tools: Talend Data Integration Suite, Matillion, Informatica, Databricks. API Integration: Salesforce / Google AdWords / Google Analytics / Marketo / Amazon MWS - Seller Central / Shopify / Hubspot / FreshDesk / Xero. Programming: Java, SQL, HTML, Unix, Python, Node JS, React JS. Reporting Tools: Yellowfin BI, Tableau, Power BI, SAP BO, Sisense, Google Data Studio. AWS Platform: S3, AWS Lambda, AWS Batch, ECS, EC2, Athena, AWS Glue, AWS Step Functions; Azure Cloud Platform. Other Tools: Airflow. Expect integrity, excellent communication in English, technical proficiency, and long-term support.
    Apache NiFi
    Databricks MLflow
    Databricks Platform
    Tableau
    Microsoft Power BI
    Data Extraction
    Talend Data Integration
    Data Analysis
    Microsoft Azure
    Continuous Integration
    AWS Lambda
    API
    Database
    Python
    SQL
    ETL
  • $125 hourly
    🏆 Achieved Top-Rated Freelancer status (Top 10%) with a proven track record of success. Past experience: Twitter, Spotify, & PwC. I am a certified data engineer & software developer with 5+ years of experience. I am familiar with almost all major tech stacks on data science/engineering and app development. If you require support in your projects, please do get in touch. Programming Languages: Python | Java | Scala | C++ | Rust | SQL | Bash Big Data: Airflow | Hadoop | MapReduce | Hive | Spark | Iceberg | Presto | Trino | Scio | Databricks Cloud: GCP | AWS | Azure | Cloudera Backend: Spring Boot | FastAPI | Flask AI/ML: Pytorch | ChatGPT | Kubeflow | Onnx | Spacy | Vertex AI Streaming: Apache Beam | Apache Flink | Apache Kafka | Spark Streaming SQL Databases: MSSQL | Postgres | MySql | BigQuery | Snowflake | Redshift | Teradata NoSQL Databases: Bigtable | Cassandra | HBase | MongoDB | Elasticsearch Devops: Terraform | Docker | Git | Kubernetes | Linux | Github Actions | Jenkins | Gitlab
    Apache NiFi
    Java
    Apache Hadoop
    Amazon Web Services
    Snowflake
    Microsoft Azure
    Google Cloud Platform
    Database Management
    Linux
    Apache Spark
    ETL
    API Integration
    Scala
    SQL
    Python
  • $55 hourly
    Helping to build Analytics-Driven Organizations through Data and Machine Learning Model. 🏆 TOP-RATED PLUS 🏆 100% Job Success Score 🏆 500+ Dashboards Delivered for 45+ different industries 🏆 150+ Automated Data Pipeline Developed 🏆 175+ Satisfied Clients Worldwide 🏆 All 5 Star Ratings and Stellar Reviews 🏆 Fastest growing on Upwork in last one year Most Business Leaders struggle taking faster and crucial decisions for their businesses. For the last 14+ years, I helped Business Leaders and CXOs across the globe 🌏 to take quicker and critical decisions by empowering them with access to Real-Time Actionable Insights by developing Automated & Real-time visually appealing interactive dashboards on Power BI and Tableau. Some of my 🌟 High Profile Clients 🌟 include Johnson & Johnson, Coco-Cola, Primo Water, Uflex, Ceribell, Mountenvil, Nourshed, Microtek I provide end-to-end solutions starting from developing automated data pipelines to pull data from all the data points, setting up Data Lake and Data Warehouse, and creating business and function-specific Reports in Microsoft Power BI and Tableau: Here’s a highlight of the services we commonly help our clients with: ✅ Data Analytics Solutions & Business Intelligence Implementation (Google, AWS, GCP (Google Cloud Platform), & Microsoft Azure) ✅ Data Visualization & Dashboard Design / Development (World-class customer experience) ✅ Machine Learning Model and Artificial Intelligence ✅ Do performance Monitoring, fine-tuning, and optimization of dashboards. ✅ Connect to multiple data sources, import data, and make it useful for your business ✅ Perform advance level calculations on data ✅ Publish and share the dashboards, and schedule data refreshes. ✅ Set up ETL processes ✅ Building Automated data pipelines for pulling data from multiple platforms through APIs ✅ Data Warehouse & Data Engineering Services 🛠 Tools 🛠 Power BI, Tableau, SQL, Python, AWS, GCP, Azure, Pyspark, ETL, Data Lake, Data Warehouse, Data Science, Tensorflow, Pandas, Keras, Numpy, pyTorch, OpenAI, Apache, Prophet, Excel, VBA, etc. ⭐️⭐️⭐️⭐️⭐ ❝ NamaSYS team was amazing to work with! Their work ethic, business insight, and communication skills were top-notch! Power BI can be confusing at times and they made the process very simple and understandable for me. It was amazing to me that after just a couple of conversations, they understood our business almost better than ourselves! Would recommend it to anyone and everyone.❞ 🗣 Why you should hire me: Unrivaled Performance in helping CXOs make quick and informed data-based decisions, which will eventually help businesses thrive in every situation. It's all about the result! ═════ How do my clients feel? (Project Reviews) ═════ ⭐️⭐️⭐️⭐️⭐Senior Power BI Developer, Data Lake, and Data Warehouse "The team is absolutely fantastic! They took an enormous task and completed it amazingly. All the while being a pleasure to work with and make every effort to be available whenever it was needed. I will continue working with this team as much as I can! Thank you all so much!!" ⭐⭐⭐⭐⭐ 2 Power BI experts required within 2-4 weeks for 2 months "Five stars all the way. I would certainly consider working again with Kulbhushan. Great attitude and great attention to detail. Very professional and very impressive work ethics." ⭐⭐⭐⭐⭐ Power BI Multiple Retail Store Dashboard, Reporting, and Statistics "Absolutely professional and a great team to work with. They are very creative, well-versed, and delivered well beyond our expectations. 
Please reach out to us if you need more details or a reference. We will definitely use them again in the very near future." ✔ Check out the 5-star ratings and awesome feedback from our long-term clients. If you are looking forward to: - taking your business to the next level with faster decisions based on real-time actionable insights; - helping your business thrive in every situation by harnessing the power of data; - optimizing costs by knowing all the grey areas of your business; - creating a culture of accountability, transparency, and efficiency; - outwitting your competitors by utilizing your resources most efficiently; 📞 Get in touch 📞 📞 Free 30-minute consultations 📞 Invite me to your job and I'll tell you exactly how we can help you achieve your goals. ✅ HOURLY RATE NOTE: My profile's advertised rate is a rough reflection of what you can expect to pay when working with me long-term.
    Apache NiFi
    Business Intelligence
    Artificial Intelligence
    Machine Learning
    Microsoft Azure
    Microsoft Power Automate
    Microsoft PowerApps
    Data Warehousing & ETL Software
    Data Science
    Data Visualization
    Microsoft Power BI
    AWS Glue
    Python
    Tableau
    ETL Pipeline
    SQL
  • $45 hourly
    • 9+ years of data product development experience, including 5+ years in big data engineering, along with 7+ years in data engineering, data warehousing, and business intelligence. • Good experience building systems for real-time data processing using Spark Streaming, Kafka, Spark SQL, PySpark, and Cloudera. • Worked extensively with dimensional modeling, data migration, data cleansing, data profiling, and ETL processes for data lakes and data warehouses. • Design and build ETL pipelines to automate ingestion of structured and unstructured data in batch and real-time mode using NiFi, Kafka, Spark SQL, Spark Streaming, Hive, Impala, and different ETL tools. • Worked with multiple ETL tools such as Informatica Big Data Edition 10.2.2, Alteryx, Talend, and Kalido. • Good knowledge of Azure Databricks, Azure HDInsight, ADLS, ADF, and Azure Storage. • Analyzed and processed complex data sets using advanced querying, visualization, and analytics tools.
    Apache NiFi
    Big Data
    Bash Programming
    ETL
    Data Analysis
    Sqoop
    SQL
    Java
    Python
    Informatica
    Apache Hive
  • $40 hourly
    I am Aliabbas Bhojani, a Data Engineer with deep knowledge and experience in the core functionality of data engineering, big data processing, and cloud data architecture. I completed my Bachelor of Engineering with a specialization in Computer Engineering, which has helped me tackle complex data problems and prove my expertise by proposing high-performance cloud data architectures that help scale the business. I'm very familiar with a wide variety of web platforms and infrastructure, so don't be afraid to run something by me for things like Apache Spark, Apache NiFi, Kafka, Apache Accumulo, Apache HBase, Zookeeper, REST APIs, Java, Python, Scala, and JavaScript. I can work on your on-prem or cloud-deployed solution, whether it's setting up Kubernetes, Docker, or VMs on Azure, Amazon Web Services (AWS), or Google Cloud Platform (GCP). Wide spectrum of offerings: - Data Engineering Core Values - Data-Driven Business Intelligence - Automated Real-Time Data Pipelines - Advanced Machine Learning-based Data Analytics - Relational and Non-Relational Data Modelling - Cloud-Native Data Products - Big Data Handling with Apache Spark and Apache NiFi - Open Source Data Tools Usage and Mindset - AWS Cloud Data Architecture and Engineering - Azure Cloud Data Architecture and Engineering - GCP Cloud Data Architecture and Engineering - Scaling Data Pipelines with Kubernetes and Docker - No-Downtime Data Pipelines using a Cloud-Agnostic Approach Feel free to reach out with any inquiries or project discussions. Aliabbas Bhojani
    Apache NiFi
    Snowflake
    Cloud Architecture
    Data Lake
    Apache Accumulo
    ETL
    DevOps
    Machine Learning
    PySpark
    Apache Spark
    Python
    Java
    SQL
    Data Engineering
    Apache Hadoop
  • $50 hourly
    JNCIE-SP #2975, JNCIS-SEC, JNCIS-DevOps, CCNA R&S, LPI-1. Network Expert with more than 10 years of experience in the telecommunications industry. During my career I have worked for the two biggest ISPs in my country and one Tier 1 ISP as part of a Central Network Support Team (Level 3 Support). Besides this, I have also had many other projects involving network design, configuration, implementation, and network automation. I have deep knowledge of Juniper Networks (JunOS), Cisco Systems (IOS, IOS-XR), Huawei Networking (VRP), and Mikrotik (RouterOS) products. During my Network Engineer career, I was responsible for all kinds of networking tasks, from customer service and device configuration to development and extension of the IP/MPLS network, traffic engineering, and other technologies used in a big ISP network. I was also usually responsible for testing and integrating new technologies into the existing network without interruption of services. And last, but not least, I was responsible for providing and integrating network automation tools and monitoring systems. Experienced with Juniper routers and switches: MX, EX, and SRX. Experienced with Cisco IOS routers and switches: 2900, 6500, and 7200 series. Experienced with Cisco IOS-XR routers, especially the ASR9k. Experienced with Huawei and Raisecom PON network devices, OLT/ONT. Experienced with Foundry/Brocade/EdgeCore/Arista switches. Experienced with Ericsson SmartEdge series routers. Experienced with Mikrotik routers. Experienced with NMS such as U2000, Zabbix, Cacti, and The Dude. Experienced with automation tools such as Ansible and Python scripting. Experienced with Linux systems - CentOS/Ubuntu. Expert knowledge of xSTP, PIM, IS-IS, OSPF, BGP, MPLS, LDP/RSVP, MPLS VPNs (L2/L3/VPLS), etc.
    Apache NiFi
    Extreme Networks
    Junos OS
    Cisco Certified Internetwork Expert
    Network Analysis
    Cisco Certified Network Professional
    Python
    Cisco Router
    OSPF
    Network Engineering
    Multiprotocol Label Switching
    Cisco IOS
    Cisco Certified Network Associate
    Multiprotocol BGP
    Juniper
  • $80 hourly
    A Design Thinker, Product Manager and an AI Solutions Architect with 10+ years of building both B2B and B2C data first products and apps, big data solutions and machine learning algorithms. Versatile experience working with big brands, startups and emerging unicorns. Proficiencies: - Product Strategy & Roadmap - Product Market fit & Competitor Research - Product Design - Wire-framing, UX and usability research - Rapid prototype development - Product refreshes - Agile project management - Scrum, Kanban & XP; wearing the Product Owner & Scrum Master hats - Creating Epics, User Stories, acceptance criteria - Very familiar with deployment strategies, clouds and technical architectures across AWS, Azure & GCP; across monolithic to microservices architectures; and across RDBMS to noSQL and graph databases (from my data sciences background) - Modern PM tools: Jira, Trello, Asana, Aha! and more. - Native comfort with SQL, R and other data and analysis languages. (English too!) Product Portfolio: Auto AI, NashWorks (Enterprise AutoAI Platform) As lead engagement architect, I led the design, development and delivery of a generic, fully automated AI platform for enterprises to plug in their data and deploy algorithms around predictive forecasting for demand & sales for a seed-stage startup. This platform is in the customer beta phase. VIDULEARN, Cerebronics (Classroom Intelligence Platform) As Product Owner, I led the platform design team towards building the next big classroom intelligence platform- fusing concepts of lecture capture and blended learning with advanced video processing technologies and state-of-the-art AI for content recognition and summarization to provide a holistic augmented classroom experience for students and teachers. MYNTELLIGENCE, NashWorks (Marketing Intelligence Platform) As Lead Product Manager, I led the buildout of a new-age cross channel marketing intelligence platform integrating visual campaign management, automated & custom analytics, AI driven optimization recommendations and multi-channel segmentation capabilities from conception to launch and helped the client hit $500k annualized revenues within the beta phase. VIDEOTELLIGENT, Cerebronics (Video Resyndication Platform) As Product Architect, I led the design and development of a nordic video resyndication platform aimed at creating a marketplace for content creators, publishers, consumers and advertisers to come together in a three-way syndication platform with multiple AI assisted virality driven commercial models. MS ASSIST, Harman (Global Admin Assistance Chatbot Platform) As the Platform Architect, I led the development of a chatbot platform for Microsoft’s global security & administration division incorporating the ability for multiple teams within the division to visually create and deploy their own chatbots to fulfill employee services responsibilities that they owned. Scaled the platform to 30+ chatbots handling 10000+ sessions/day with accretive time-value cost savings estimated at more than $1 million / month MS GSOC, Harman (Global Security Management using IoT) As the Platform Architect, I led the design and development of Microsoft’s global physical security monitoring platform, bringing 3000+ locations with 75000+ cameras, 15000+ stenofones and 150,000+ RFID readers online into the platform with real time asset monitoring and data warehousing for further analytical solutions. 
LOCATE, MediaIQ (Location Intelligence & Targeting Platform) I started (and lead) this Campaign Targeting & Insights Product, tying in the offline (real) world to online and ever-increasing mobile worlds. Managed 5 billion records/day and hit $1 Mil/Month in Revenue within 6 months of launch. ELEVATE, MediaIQ (Measurement & Insights Framework for Qualitative Ad response and campaign performance) I conceptualized and built the first version of this segment-first qualitative measurement & insights product. I lead the development & scaling team for it, handling 25+ billion records/day and hitting $1 mil/month in revenue within 8 months of launch. MACRO, MediaIQ (Multi Source Data Stitching Platform for Intelligent, Performant Ad Campaigns) I built the data ingestion, storage, analysis and information retrieval architectures, connecting over 50 datasets with petabyte scale RTB data to empower traders and analysts deliver data driven performant ad campaigns. Managed and ran queries on dataset over 30 petabytes scaling over trillions of records and thousands of columns. MUSTANG, Mu Sigma (Real Time Text Analytics platform) I built the agent driven real time analytics stack and algorithms for processing and analyzing text data for insights, to be deployed within a JADE platform.
    Apache NiFi
    User Experience Design
    Big Data
    Data Science
    R
    Business Intelligence
    Minimum Viable Product
    Product Management
    Data Analysis
    Product Strategy
    PostgreSQL
    SQL
    Statistics
    Product Design
    Design Thinking
    Quantitative Analysis
    Lean Startup
    Demo Presentation

How it works

1. Post a job (it’s free)

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

How to Hire Top Apache NiFi Specialists

How to hire Apache NiFi experts

Knowledge is power, and the key to unlocking this power lies in your ability to track and manage the flow of data within your business or organization. An Apache NiFi expert can help you set up the data routing, transformation, and system mediation logic you need to master your data. 

So how do you hire Apache NiFi experts? What follows are some tips for finding top Apache NiFi experts on Upwork. 

How to shortlist Apache NiFi consultants

As you’re browsing available Apache NiFi consultants, it can be helpful to develop a shortlist of the independent professionals you may want to interview. You can screen profiles on criteria such as:

  • Technology fit. You want an Apache NiFi expert who understands the technologies behind your products and services so they can design custom dataflow solutions for your business. 
  • Workflow. You want an Apache NiFi expert who can slide right into your existing developer workflow (e.g., Jira, Slack).
  • Feedback. Check reviews from past clients for glowing testimonials or red flags that can tell you what it’s like to work with a particular Apache NiFi expert.

How to write an effective Apache NiFi job post

With a clear picture of your ideal Apache NiFi expert in mind, it’s time to write that job post. Although you don’t need a full job description as you would when hiring an employee, aim to provide enough detail for a consultant to know if they’re the right fit for the project. 

An effective Apache NiFi job post should include: 

  • Scope of work: From tracking dataflows to creating loss-tolerant data delivery systems, list all the deliverables you’ll need. 
  • Project length: Your job post should indicate whether this is a smaller or larger project. 
  • Background: If you prefer experience working with certain industries, software, or technologies, mention this here. 
  • Budget: Set a budget and note your preference for hourly rates vs. fixed-price contracts.

Ready to automate data flow within your organization? Log in and post your Apache NiFi job on Upwork today.

APACHE NIFI SPECIALISTS FAQ

What is Apache NiFi?

Apache NiFi provides data scientists and engineers with a web-based user interface for designing and monitoring dataflows within an organization. Apache NiFi experts can automate many of the configuration and data processing tasks associated with moving data from one place to another.
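
To give a concrete sense of what "designing and monitoring dataflows" looks like outside the drag-and-drop canvas, here is a minimal sketch of polling a NiFi instance's REST API (the same API the web UI calls) to check how many FlowFiles are queued in a process group. The base URL, token, and use of the "root" group alias are illustrative assumptions; verify the endpoint path and response fields against the REST API documentation for your NiFi version.

    import requests

    NIFI_API = "https://nifi.example.com:8443/nifi-api"  # placeholder base URL
    TOKEN = "REPLACE_WITH_ACCESS_TOKEN"  # e.g., a JWT obtained from the /access/token endpoint on secured instances

    def queued_flowfiles(group_id: str = "root") -> int:
        """Return the number of FlowFiles currently queued in a process group."""
        resp = requests.get(
            f"{NIFI_API}/flow/process-groups/{group_id}/status",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        # Field names assume the process-group status entity shape returned by recent NiFi versions.
        snapshot = resp.json()["processGroupStatus"]["aggregateSnapshot"]
        return int(snapshot["flowFilesQueued"])

    if __name__ == "__main__":
        print(f"FlowFiles queued: {queued_flowfiles()}")

In practice, a specialist would wire a check like this into an existing monitoring stack (Prometheus, Grafana, or similar) rather than run it by hand.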

Here’s a quick overview of the skills you should look for in Apache NiFi professionals:

  • Apache NiFi
  • Data science and/or data engineering
  • Big data tech such as Hadoop, Spark, and AWS
  • Back-end languages such as Python and SQL

Why hire Apache NiFi experts?

The trick to finding top Apache NiFi experts is to identify your needs. Are you looking to connect raw marketing data logs to an Amazon Kinesis Data Firehose endpoint for real-time marketing analytics? Or do you need help directing data from a fleet of IoT devices to your SaaS platform?
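
For the first scenario above, a NiFi specialist would normally end the flow with an AWS processor such as PutKinesisFirehose rather than write delivery code by hand. Purely for context on what that Firehose endpoint receives, here is a minimal boto3 sketch that pushes one JSON log record to a hypothetical delivery stream named "marketing-logs"; the stream name and record contents are placeholders, and AWS credentials are assumed to be configured in the environment.

    import json

    import boto3

    STREAM_NAME = "marketing-logs"  # hypothetical delivery stream; a NiFi flow would target the same stream

    firehose = boto3.client("firehose")

    record = {"event": "page_view", "campaign": "spring_launch"}  # placeholder marketing log entry

    # Firehose accepts raw bytes; newline-delimited JSON is a common convention so
    # downstream consumers (S3, Redshift, Athena, etc.) can split individual records.
    firehose.put_record(
        DeliveryStreamName=STREAM_NAME,
        Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
    )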

The cost of your project will depend largely on your scope of work and the specific skills needed to bring your project to life. 

How much does it cost to hire an Apache NiFi consultant?

Rates can vary due to many factors, including expertise and experience, location, and market conditions.

  • An experienced Apache NiFi consultant may command higher fees but also work faster, have more-specialized areas of expertise, and deliver higher-quality work.
  • A consultant who is still in the process of building a client base may price their Apache NiFi services more competitively. 

Which one is right for you will depend on the specifics of your project.
