Hire the best Apache NiFi developers

Check out Apache NiFi developers with the skills you need for your next job.
  • US$95 hourly
    => Let's Connect
    Hello, I'm Dima, a seasoned CyberSecurity Specialist and Turnkey Infrastructure Expert specializing in Big Data solutions and data analysis with a DevOps approach.
    => Expertise Overview
    With a strong passion for building SOC, SOAR, and SIEM solutions, my primary focus is developing data ingestion, enrichment, and analysis pipelines that are highly available and fault-tolerant. My expertise extends to building central logging and real-time processing platforms from the ground up, optimizing them for performance, security, and reliability across multiple environments, whether in the cloud or on-premises.
    => Value Proposition
    My commitment is to deliver solutions that not only centralize security and threat intelligence but also give you greater control over your data, ultimately reducing infrastructure costs.
    => Technological Summary
    CyberSecurity: Wazuh, Suricata, pfSense
    Big Data: Kafka, ElasticSearch, OpenSearch
    Data Processing: FluentD, Vector.dev, Apache NiFi
    Infra as Code: Terraform, cdktf, cdk8s
    Virtualization: Proxmox, VMware
    Containerization: Kubernetes
    Clouds: AWS, Hetzner, DigitalOcean, Linode
    Automation: Jenkins, GitHub Actions
    Monitoring: Zabbix, Grafana, Kibana, Prometheus, Thanos
    Mail: MailCow SMTP/IMAP, Postfix
    VPN: OpenVPN Server
    Programming: Bash, Python, TypeScript
    Operating Systems: CentOS, RHEL, Rocky Linux, Ubuntu, Debian
    => Personal Attributes
    • Leadership: Leading by example with a team-first approach
    • End-to-End Execution: Proficient from POC to enterprise-level implementation
    • Resilience: High thoroughness and endurance
    • Adaptability: A quick, can-do architect and experienced troubleshooter
    • Optimization: Adept at process and performance optimization
    • Documentation: Skilled technical documentation writer
    • Vision: A visionary in technological implementation and solution provision
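    Enrichment steps in pipelines like these are typically small per-record transforms (in NiFi, for example, inside an ExecuteScript processor or a custom processor). A minimal sketch in Python; the field names and severity mapping are hypothetical, not taken from this profile's actual work:

```python
from datetime import datetime, timezone

# Hypothetical mapping of syslog-style numeric levels to labels.
SEVERITY = {0: "emergency", 3: "error", 4: "warning", 6: "informational"}

def enrich(event: dict) -> dict:
    """Return a copy of a raw log event with normalized fields added."""
    out = dict(event)
    # Normalize the epoch timestamp to ISO-8601 UTC.
    out["@timestamp"] = datetime.fromtimestamp(
        event["ts"], tz=timezone.utc
    ).isoformat()
    # Map the numeric severity to a human-readable label.
    out["severity_label"] = SEVERITY.get(event.get("severity"), "unknown")
    return out

record = {"ts": 1700000000, "severity": 3, "msg": "disk failure"}
enriched = enrich(record)
```

    In a real flow the same transform would run per FlowFile, with the enriched records routed onward to Elasticsearch or OpenSearch for indexing.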
    Apache NiFi
    Elasticsearch
    Linux System Administration
    Apache Kafka
    Apache Hadoop
    Email Security
    Machine Learning
    ELK Stack
    Cloudera
    Zabbix
    MySQL
    Big Data
    PfSense
    Red Hat Administration
    Proxmox VE
    Amazon Web Services
  • US$30 hourly
    Seasoned data engineer with over 11 years of experience building sophisticated and reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years. 🏆 Main areas of expertise:
    📍 Big Data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
    📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR
    📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL
    📍 Google Cloud Services - GCP Dataproc
    📍 Search Engine - Apache Solr
    📍 NoSQL - HBase, Cassandra, MongoDB
    📍 Platform - Data Warehousing, Data Lake
    📍 Visualization - Power BI
    📍 Distributions - Cloudera
    📍 DevOps - Jenkins
    📍 Accelerators - Data Quality, Data Curation, Data Catalog
    Apache NiFi
    SQL
    AWS Glue
    PySpark
    Apache Cassandra
    ETL Pipeline
    Apache Hive
    Apache Kafka
    Big Data
    Apache Hadoop
    Scala
    Apache Spark
  • US$175 hourly
    Mr. Joshua B. Seagroves is a seasoned professional having served as an Enterprise Architect/Senior Data Engineer for multiple Fortune 100 Companies. With a successful track record as a startup founder and CTO, Mr. Seagroves brings a wealth of experience to his role, specializing in the strategic design, development, and implementation of advanced technology systems. Throughout his career, Mr. Seagroves has demonstrated expertise in architecting and delivering cutting-edge solutions, particularly in the realm of data engineering and sciences. He has successfully spearheaded the implementation of multiple such systems and applications for a diverse range of clients. As part of his current responsibilities, Mr. Seagroves actively contributes to the prototyping and research efforts in the field of data engineering/data science, specifically in the development of operational systems for critical mission systems. Leveraging his extensive background in architecture and software modeling methodologies, he has consistently led and collaborated with multidisciplinary teams, successfully integrating various distributed computing technologies, including Hadoop, NiFi, HBase, Accumulo, and MongoDB. Mr. Seagroves' exceptional professional achievements and extensive experience make him a highly sought-after expert in his field. His comprehensive knowledge and hands-on expertise in advanced technology systems and big data make him a valuable asset to any organization.
    Apache NiFi
    YARN
    Apache Hadoop
    Big Data
    Apache Zookeeper
    TensorFlow
    Apache Spark
    Apache Kafka
    Artificial Neural Network
    Artificial Intelligence
  • US$85 hourly
    FHIR, HL7v2, HL7v3, C-CDA, Carequality HIE, Mirth Connect, Apache NiFi, MU, EDI, EHR/EMR Functional Model. Back-end: JavaScript, Java, SQL, SmileCDR, Aidbox.
    • Certified HL7 FHIR R4 Proficiency
    • Certified HL7v2 Control Specialist
    • Certified HL7 CDA Specialist
    • Certified HL7v3 RIM Specialist
    • IHE Certified Professional - Foundations
    • SNOMED CT Terminology Services certified
    --- FHIR ---
    * Mirth Connect based FHIR integration, FHIR Server on Mirth (HAPI FHIR library).
    * FHIR (RESTful, event-based messaging and documents paradigms, profiling with Forge).
    * HL7v2 to/from FHIR mapping (e.g., ADT, ORU, OML message types).
    * C-CDA Level 1 and Level 3 to/from FHIR mapping.
    * FHIR tools (Touchstone, Simplifier, Smile CDR, Forge).
    * Canadian Core/Baseline FHIR profiles editor.
    * IG publishing (IG Publisher, FSH - FHIR Shorthand, SUSHI).
    * Apache NiFi custom FHIR processors.
    --- CMS Compliance ---
    * US Core profiles / IPS profiles / CA Baseline profiles
    * CARIN Blue Button / CMS Blue Button 2.0
    * Da Vinci PDEX Plan Net
    * Da Vinci PDEX US Drug Formulary
    * Da Vinci Payer Data Exchange (ePDx)
    --- HL7 ---
    * Mirth Connect based HL7v2/HL7v3 integration.
    * Apache NiFi custom HL7v2 processors.
    * HL7v2 conformance profiles (documentation quality level 3).
    * Refined and constrained versions of HL7v3 interactions based on Visio models.
    * HL7v3 messaging (Batch wrapper, Query Infrastructure; Claims, Lab, Patient Administration, Personnel Management domains).
    * Conformance testing of HL7v2.x/HL7v3 interaction implementations.
    * Development of HL7v2.x and HL7v3 specifications and Implementation Guides using Messaging Workbench, RMIM Designer, V3Generator.
    * Canadian HIAL, OLIS, HRM interfaces.
    --- C-CDA (Consolidated CDA) ---
    * CDA parsing library for Mirth Connect.
    * Document templates (e.g., CCD, NHSN CDA).
    * Mirth Connect based C-CDA document template implementation and transformation.
    * Development of CDA template specifications (Level 2, Level 3 section and entry templates).
    * CDA document template modeling (MDHT Modeling or ART-DECOR).
    * Conformance testing of C-CDA documents.
    --- EHR / EMR / PHR ---
    * Software development of EMR solutions (using Mirth Connect, Java, JavaScript, XML Schema, XSLT, Schematron).
    * HL7 EHR System Functional Model and Profiles (e.g., Meaningful Use Functional Profile for ONC/NIST Test Procedures, HL7 PHR System FM).
    --- IHE ITI Profiles ---
    * Carequality HIE (XCPD, XCA)
    * OpenHIE
    * IHE profile specifications and development: XDS, XDS.b, XDS-I.b, XCA, XCPD, MPQ, DSUB, XDM.
    * IHE HL7v3 profiles: PIXv3, PDQv3.
    * IHE FHIR profiles: MHD, PIXm, NPFSm, PDQm, mRFD.
    * Audit and security domains: ATNA, BPPC, IUA, XUA.
    Experience with: SmileCDR, Carequality, Quest, LabCorp, AllScripts, eClinicalWorks (eCW), CRISP, MUSE, OpenHIE, etc.
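    For context on the HL7v2-to-FHIR mapping work listed above: HL7v2 messages are pipe-delimited segments, and any mapping starts by parsing segments and fields. A deliberately simplified, illustrative sketch, not a conformant parser (real implementations handle the full component/repetition/escape rules, typically via libraries such as HAPI):

```python
def parse_hl7v2(message: str) -> dict:
    """Split an HL7v2 message into {segment_id: fields} (first occurrence only)."""
    segments = {}
    for line in message.strip().split("\r"):   # segments are CR-separated
        fields = line.split("|")               # fields are pipe-separated
        segments.setdefault(fields[0], fields)
    return segments

def pid_to_patient(segments: dict) -> dict:
    """Map a PID segment to a minimal FHIR-Patient-like dict (illustrative subset)."""
    pid = segments["PID"]
    # PID-5 is the patient name; components are separated by ^.
    family, given = pid[5].split("^")[:2]
    return {
        "resourceType": "Patient",
        "name": [{"family": family, "given": [given]}],
        "birthDate": pid[7],   # PID-7: date/time of birth (YYYYMMDD)
    }

msg = ("MSH|^~\\&|SENDER|FAC|RCVR|FAC|20240101||ADT^A01|123|P|2.5\r"
      "PID|1||555^^^MRN||Doe^Jane||19800101|F")
patient = pid_to_patient(parse_hl7v2(msg))
```

    A production mapping would also validate conformance profiles and emit a proper FHIR R4 Patient resource, but the field-by-field structure is the same idea.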
    Apache NiFi
    API Development
    Electronic Data Interchange
    FHIR
    Java
    Mirth Connect
    Health Level 7
    Electronic Medical Record
    HIPAA
    ECMAScript for XML
    XSLT
    XML
    API Integration
    JavaScript
  • US$18 hourly
    Who am I: More than 20 years of experience and broad, universal knowledge give me the ability to combine a wide range of technologies, build complicated heterogeneous systems, and automate business processes. For the last 5 years I have worked with Alfresco Content Services. My greatest value is universality: I understand technology deeply, from electrons crossing the Fermi level in semiconductors to business process automation in the organizational structure of large companies. That's why I can work out almost any integration solution.
    What we can do: We can deploy Alfresco Content Services in test, development, and production environments; upgrade and migrate it from previous versions; and create backup and disaster recovery plans. We can integrate it into the user environment, synchronise users with a centralised authentication management system, set up SSO login, and choose document access, editing, and OCR technologies, etc. We can integrate Alfresco into your corporate ecosystem, applications, API gateways, and databases. We can build custom data models, add document classifications and additional metadata, and create business process automation for document management. We can create a production environment for any application with Docker/Kubernetes, and a development environment with version control and automated CI/CD pipelines, for example with on-premise GitLab or in GCP.
    Short list of base technologies:
    - Docker, Docker Compose, Kubernetes...
    - Linux, Debian, Bash, Python...
    - Git, GitLab, CI/CD, DevOps...
    - Nginx, proxy, DNS, SMTP...
    - SSL/TLS, Kerberos, SSO, SAML...
    - Java, JavaScript, SQL, PostgreSQL, CMIS...
    - Apache, Tomcat, NiFi, Elasticsearch/Kibana, WSO2, QGIS...
    - Google Cloud Platform, any cloud and hosting solutions...
    Apache NiFi
    Docker
    Kubernetes
    JavaScript
    Docker Compose
    Elasticsearch
    Apache Tomcat
    Alfresco Content Services
    Kerberos
    SSL
    Java
    Linux System Administration
    Linux
    Google Cloud Platform
  • US$75 hourly
    As a freelance data engineer, I concentrate on database and ETL related projects, query performance, and database structure optimization. I have worked with many types of databases for more than 15 years, primarily PostgreSQL, MySQL, MS SQL, and Redshift, but also with many others on projects, such as Snowflake, Cloudera, DB2, and Oracle. Big portfolio of ETL projects with Talend Data Integration and NiFi. Certified Talend Developer (Data Integration, Big Data). Microsoft Certified Professional. I continuously extend my expertise and knowledge and am open to new challenges.
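    Query-performance work of the kind described above usually begins by checking whether a query actually uses an index. A self-contained illustration using SQLite's EXPLAIN QUERY PLAN (the same idea applies to PostgreSQL or MySQL via EXPLAIN); the schema and data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN reports how SQLite will execute the statement;
    # the human-readable detail is the last column of each row.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan: no index on customer_id yet
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # now satisfied via the index
```

    Comparing the plan before and after adding the index is the quickest way to confirm a structural optimization actually took effect.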
    Apache NiFi
    Snowflake
    SQL Programming
    Amazon Redshift
    Data Analysis
    Microsoft SQL Server Programming
    SQL
    Database Design
    Data Migration
    PostgreSQL
    Database Administration
    ETL
    MySQL
    Talend Open Studio
  • US$200 hourly
    Big Data, machine learning, and data science are my passion. In that field, I have focused mostly on NLP, NLU, and OCR/handwriting recognition. I love to teach and train people in this area and show how it can be used.
    Apache NiFi
    Database
    SQLAlchemy
    Apache Hadoop
    Big Data
    SQL Server Integration Services
    SQL Server Reporting Services
    Microsoft SQL Server
    SQL
    Machine Learning
    Apache Spark
  • US$110 hourly
    Top-rated developer working (mostly) with big data, artificial intelligence, machine learning, analytics, and back-end architecture. I specialize in Big Data (Hadoop, Apache Spark, Sqoop, Flume, Hive, Pig, Scala, Apache Kudu, Kafka, Python, shell scripting, core Java, machine learning). As a Big Data architect I work as part of a team responsible for designing and building applications for online analytics. Outgoing, motivated team player eager to contribute dynamic customer service, administrative, supervisory, team building, and organizational skills towards supporting the objectives of an organization that rewards reliability, dedication, and solid work ethics with opportunities for professional growth.
    Skill set: Hadoop, Spark, Scala, Python, Bash, Tableau, Jenkins, Ansible, HBase, Sqoop, Flume, Neo4j, machine learning, Java, NiFi, AWS, Azure, GCP, Databricks, Datameer, Kafka, Confluent, Schema Registry, SQL, DB2, CDC
    Why should you hire me?
    ✅ 1400+ productive Upwork hours logged with 100% customer satisfaction
    » Passion for data engineering and machine learning
    » Experience with functional Scala: shapeless, cats, itto-csv, neotypes
    » Familiar with the Hadoop ecosystem: Apache Spark, Hive, YARN, Apache Drill, Sqoop, Flume, Zookeeper, HDFS, MapReduce, machine learning, Airflow
    » Worked with JWT authentication, reactive JDBC-like connectors for PostgreSQL, MySQL & MariaDB, reactive MongoDB
    » Microservices expert; worked mostly with Lagom, Akka Persistence, event sourcing
    » Defining scalable architectures on top of AWS, Google Cloud, DigitalOcean, Alibaba Cloud
    » ElasticSearch stack pro: ElasticSearch, Logstash, Beats, Kibana
    » Efficient project manager
    Let's discuss your idea and build the next big thing!
    Apache NiFi
    Google Cloud Platform
    Apache HBase
    Snowflake
    Machine Learning
    Apache Spark MLlib
    Databricks Platform
    ETL Pipeline
    AWS Glue
    Apache Hive
    Scala
    SQL
    Docker
    Apache Kafka
    Apache Spark
    Apache Hadoop
  • US$25 hourly
    • Certification in Big Data/Hadoop Ecosystem
    • Big Data environments: Google Cloud Platform, Cloudera, Hortonworks, AWS, Snowflake, Databricks, DC/OS
    • Big Data tools: Apache Hadoop, Apache Spark, Apache Kafka, Apache NiFi, Apache Cassandra, YARN/Mesos, Oozie, Sqoop, Airflow, Glue, Athena, S3 buckets, Lambda, Redshift, DynamoDB, Delta Lake, Docker, Git, Bash scripts, Jenkins, Postgres, MongoDB, Elasticsearch, Kibana, Ignite, TiDB
    • Certifications in SQL Server, Database Development, and Crystal Reports
    • SQL Server tools: SQL Management Studio, BIDS, SSIS, SSAS, and SSRS
    • BI/dashboarding tools: Power BI, Tableau, Kibana
    • Big Data development programming languages: Scala and Python
    Big Data Engineer
    • Hands-on experience with Google Cloud Platform, BigQuery, Google Data Studio, and Flow
    • Developing ETL pipelines for SQL Server as well, using SSIS
    • Reporting and analysis using SSIS, SSRS, and SSAS cubes
    • Extensive experience with Big Data frameworks and open-source technologies (Apache NiFi, Kafka, Spark, Cassandra, HDFS, Hive, Docker, Postgres SQL, Git, Bash scripts, Jenkins, MongoDB, Elasticsearch, Ignite, TiDB)
    • Managing data warehouse Big Data cluster services and development of data flows
    • Writing Big Data/Spark ETL applications for different sources (SQL, Oracle, CSV, XML, JSON) to support different departments' analytics
    • Extensive work with Hive, Hadoop, Spark, Docker, Apache NiFi
    • Built multiple end-to-end fraud monitoring alert-based systems
    • Preferred languages are Scala and Python
    Big Data Engineer - Fraud Management at VEON
    • Developed an ETL pipeline from Kafka to Cassandra using Spark in Scala
    • Used Big Data tools with Hortonworks and AWS (Apache NiFi, Kafka, Spark, Cassandra, Elasticsearch)
    • Dashboard development with Tableau and Kibana
    • Writing complex SQL Server queries, procedures, and functions
    • Developing ETL pipelines for SQL Server using SSIS
    • Reporting and analysis using SSIS, SSRS, and SSAS cubes
    • Developing and designing automated email reports
    • Offline data analytics for fraud detection and setting up prevention controls
    • SQL database development
    • System support of fraud management
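    The core of a fraud-monitoring alert flow like the Kafka-to-Cassandra pipeline above usually reduces to a per-key rule over a time window. An illustrative sketch of one such rule, transaction velocity per account; the field names and threshold are hypothetical, not taken from the actual VEON system:

```python
from collections import defaultdict

WINDOW_SECONDS = 60
MAX_TXNS_PER_WINDOW = 3   # hypothetical velocity threshold

def detect_velocity_alerts(transactions):
    """Flag accounts exceeding MAX_TXNS_PER_WINDOW within WINDOW_SECONDS.

    `transactions` is an iterable of (account_id, epoch_seconds),
    assumed time-ordered, as records would arrive from a Kafka topic.
    """
    recent = defaultdict(list)   # account_id -> timestamps inside the window
    alerts = []
    for account, ts in transactions:
        # Keep only timestamps still inside the sliding window.
        window = [t for t in recent[account] if ts - t < WINDOW_SECONDS]
        window.append(ts)
        recent[account] = window
        if len(window) > MAX_TXNS_PER_WINDOW:
            alerts.append((account, ts))
    return alerts

txns = [("A", 0), ("A", 10), ("A", 20), ("A", 30), ("B", 15), ("A", 120)]
alerts = detect_velocity_alerts(txns)
```

    In a streaming engine such as Spark Structured Streaming the same rule would be expressed as a windowed aggregation per account, with alerts written out to Cassandra.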
    Apache NiFi
    Google Cloud Platform
    SQL Programming
    Data Warehousing
    Database
    AWS Glue
    PySpark
    MongoDB
    Python Script
    Docker
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Apache Kafka
    Apache Hive
  • US$175 hourly
    As a skilled data engineer, developer, and biomedical engineer, I have spent the last several years honing my expertise to offer top-tier services to clients. My speciality is in data engineering, including data warehousing, ETL optimization, data modeling, data governance, and data visualization. I also have a strong background in machine learning and MLOps. This has allowed me to develop a command of a wide range of technologies, including Hadoop, DataBricks, Docker, Terraform, Apache Spark, Collibra, big data, business intelligence tools, cloud computing (AWS, GCP, and Azure), TensorFlow, Keras, and Kubernetes. My preferred languages are Python, R, SQL, Rust, and Java, but I am also comfortable using other programming languages. I have included examples of recent projects in my portfolio to showcase my capabilities. In addition, I have developed a strong blockchain development skill set, including proficiency in JavaScript, Rust, and Solidity, using Truffle, Hardhat, and Alchemy. I specialize in smart contract development and auditing, and I am confident in my abilities to help clients develop and deploy effective blockchain solutions. As a medical device specialist, I have a wealth of experience in regulatory and quality control, as well as all areas of the product life cycle. I have worked on class 1, class 2, and class 3 devices in several areas, including software applications, dermatological, diagnostic biotech, diagnostic radiology, orthopedic, cardiac, urology, and physical therapy devices. I am skilled in completing regulatory documents such as design history folders, risk assessments, 510(k), DIOVV, quality plans, and verification/validation testing, and have included examples of these documents and public policy papers I have written in my portfolio. 
I offer services in data engineering, machine learning, medical device and overall software development, and I am confident that my skills and experience can deliver results that exceed your expectations. Please feel free to contact me to discuss your project and how I can help you achieve your goals. Let's connect and make your project a success.
    Apache NiFi
    Dashboard
    Data Visualization
    Data Mining
    ETL
    API Development
    Compliance Consultation
    Rust
    HIPAA
    Solidity
    Medical Device
    Blockchain
    Data Engineering
    Machine Learning
    Data Science
    SQL
  • US$100 hourly
    — TOP RATED PLUS Freelancer on Upwork
    — EXPERT-VETTED Freelancer (among the top 1% of Upwork freelancers)
    — Full Stack Engineer
    — Data Engineer
    ✅ AWS Infrastructure, DevOps, AWS Architect, AWS services (EC2, ECS, Fargate, S3, Lambda, DynamoDB, RDS, Elastic Beanstalk, AWS CDK, AWS CloudFormation, etc.), serverless application development, AWS Glue, AWS EMR
    Frontend Development: ✅ HTML, CSS, Bootstrap, JavaScript, React, Angular
    Backend Development: ✅ Java, Spring Boot, Hibernate, JPA, microservices, Express.js, Node.js
    Content Management: ✅ WordPress, Wix, Squarespace
    Big Data: ✅ Apache Spark, ETL, Big Data, MapReduce, Scala, HDFS, Hive, Apache NiFi
    Database: ✅ MySQL, Oracle, SQL Server, DynamoDB
    Build/Deploy: ✅ Maven, Gradle, Git, SVN, Jenkins, Quickbuild, Ansible, AWS CodePipeline, CircleCI
    As a highly skilled and experienced Lead Software Engineer, I bring a wealth of knowledge and expertise in Java, Spring, Spring Boot, Big Data, MapReduce, Spark, React, graphics design, logo design, email signatures, flyers, web development (HTML, CSS, Bootstrap, JavaScript & frameworks, PHP, Laravel), responsive web page development, WordPress design, and testing. With over 11 years of experience in the field, I have a deep understanding of Java, Spring Boot, and microservices, as well as Java EE technologies such as JSP, JSF, Servlet, EJB, JMS, JDBC, and JPA. I am also well-versed in Spring technologies including MVC, IoC, security, Boot, data, and transactions. I possess expertise in web services, including REST and SOAP, and am proficient in various web development frameworks such as WordPress, PHP, Laravel, and CodeIgniter. Additionally, I am highly skilled in JavaScript, jQuery, React, AngularJS, Vue.js, Node.js, C#, and ASP.NET MVC. In big data, I have experience working with MapReduce, Spark, Scala, HDFS, Hive, and Apache NiFi. I am also well-versed in cloud technologies such as PCF, Azure, and Docker. Furthermore, I am proficient in various databases including MySQL, SQL Server, and Oracle, and familiar with build tools such as Maven, Gradle, Git, SVN, Jenkins, Quickbuild, and Ansible.
    Apache NiFi
    Apache Spark
    Database
    WordPress
    Cloud Computing
    Spring Framework
    Data Engineering
    NoSQL Database
    React
    Serverless Stack
    Solution Architecture Consultation
    Spring Boot
    DevOps
    Microservice
    AWS Fargate
    AWS CloudFormation
    Java
    CI/CD
    Amazon ECS
    Containerization
  • US$50 hourly
    "She is very good in coding. She is the best and to go person for any hadoop or nifi requirements." "Abha is a star; have successfully handed the project in a very professional manner. I will definitely be working with Abha again; I am very happy with the quality of the work. 🙏" "Abha Kabra is one of the most talented programmers I have ever meet in Upwork. Her communication was top-notch, she met all deadlines, a skilled developer and super fast on any task was given to her. Perfect work is done. Would re-hire and highly recommended!!" Highly skilled and experienced Bigdata engineer with over 5 years of experience in the field. With a strong background in Analysis, Design, and Development of Big Data and Hadoop based Projects using technologies like following: ✅ Apache spark with Scala & python ✅ Apache NiFi ✅ Apache Kafka ✅ Apache Airflow ✅ ElasticSearch ✅ Logstash ✅ Kibana ✅ Mongodb ✅ Grafana ✅ Azure data factory ✅ Azure pipelines ✅ Azure databricks ✅ AWS EMR ✅ AWS S3 ✅ AWS Glue ✅ AWS Lambda ✅ GCP ✅ cloud functions ✅ PostgreSql ✅ MySql ✅ Oracle ✅ MongoDB ✅ Ansible ✅ Terraform ✅ Logo/Book Cover Design ✅ Technical Blog writing A proven track record of delivering high-quality work that meets or exceeds client expectations. Deep understanding of Energy-Related data, IoT devices, Hospitality industry, Retail Market, Ad-tech, Data encryptions-related projects, and has worked with a wide range of clients, from Marriott, P&G, Vodafone UK, eXate UK etc. Able to quickly understand client requirements and develop tailored solutions that address their unique needs. Very communicative and responsive, ensuring that clients are kept informed every step of the way. A quick learner and is always eager to explore new technologies and techniques to better serve clients. Familiar with Agile Methodology, Active participation in Daily Scrum meetings, Sprint meetings, and retrospective meetings, know about working in all the phases of the project life cycle. 
A strong team player and a leader with good interpersonal and communication skills and ready to take independent challenges.
    Apache NiFi
    PySpark
    Databricks Platform
    ETL Pipeline
    Big Data
    Grafana
    Kibana
    Apache Kafka
    Apache Spark
    PostgreSQL
    Microsoft Azure
    MongoDB
    Scala
    Python
    Elasticsearch
    Google Cloud Platform
    Amazon Web Services
  • US$20 hourly
    Experienced Software Engineer with a demonstrated history of working in the information technology and services industry. Skilled in Java, Spring Boot, DevOps, Jenkins, Ansible, Eureka, React, and Groovy.
    Apache NiFi
    Docker
    Linux
    Apache Spark MLlib
    DevOps
    Ansible
    Apache Hadoop
    Big Data
    Apache Spark
    Elasticsearch
    Python
    Cloud Computing
    JavaScript
    Java
  • US$35 hourly
    I am a data engineering expert with more than 5 years of experience in data ingestion, integration, and manipulation. To date, I have completed many projects in data engineering and big data. I have worked on business analytics and telco analytics, using multiple data platforms and frameworks such as Cloudera Data Platform, NiFi, RStudio, Spark, Hadoop, Kafka... If this is what you want, get in touch with me.
    Apache NiFi
    Cloud Engineering
    Cloudera
    Apache Hadoop
    Data Warehousing
    Linux
    Apache Spark
    Data Lake
    Data Analysis
    SQL
    Big Data
    Business Intelligence
    Scala
    Apache Hive
    Python
  • US$70 hourly
    🎓 𝗖𝗲𝗿𝘁𝗶𝗳𝗶𝗲𝗱 𝗗𝗮𝘁𝗮 𝗣𝗿𝗼𝗳𝗲𝘀𝘀𝗶𝗼𝗻𝗮𝗹 with 𝟲+ 𝘆𝗲𝗮𝗿𝘀 of experience and hands-on expertise in designing and implementing data solutions.
    🔥 4+ startup tech partnerships
    ⭐️ 100% Job Success Score
    🏆 In the top 3% of all Upwork freelancers with Top Rated Plus 🏆
    ✅ Excellent communication skills and fluent English
    If you're reading my profile, you've got a challenge you need to solve and you are looking for someone with a broad skill set, minimal oversight, and an ownership mentality; in that case, I'm your go-to expert.
    📞 Connect with me today and let's discuss how we can turn your ideas into reality through a creative and strategic partnership. 📞
    ⚡️ Invite me to your job on Upwork to schedule a complimentary consultation call to discuss in detail the value and strength I can bring to your business, and how we can create a tailored solution for your exact needs.
    𝙄 𝙝𝙖𝙫𝙚 𝙚𝙭𝙥𝙚𝙧𝙞𝙚𝙣𝙘𝙚 𝙞𝙣 𝙩𝙝𝙚 𝙛𝙤𝙡𝙡𝙤𝙬𝙞𝙣𝙜 𝙖𝙧𝙚𝙖𝙨, 𝙩𝙤𝙤𝙡𝙨 𝙖𝙣𝙙 𝙩𝙚𝙘𝙝𝙣𝙤𝙡𝙤𝙜𝙞𝙚𝙨:
    ► BIG DATA & DATA ENGINEERING: Apache Spark, Hadoop, MapReduce, YARN, Pig, Hive, Kudu, HBase, Impala, Delta Lake, Oozie, NiFi, Kafka, Airflow, Kylin, Druid, Flink, Presto, Drill, Phoenix, Ambari, Ranger, Cloudera Manager, Zookeeper, Spark Streaming, StreamSets, Snowflake
    ► CLOUD: AWS (EC2, S3, RDS, EMR, Redshift, Lambda, VPC, DynamoDB, Athena, Kinesis, Glue); GCP (BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Data Fusion); Azure (Data Factory, Synapse, HDInsight)
    ► ANALYTICS, BI & DATA VISUALIZATION: Tableau, Power BI, SSAS, SSMS, Superset, Grafana, Looker
    ► DATABASE: SQL, NoSQL, Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, PL/SQL, HBase, Cassandra
    ► OTHER SKILLS & TOOLS: Docker, Kubernetes, Ansible, Pentaho, Python, Scala, Java, C, C++, C#
    𝙒𝙝𝙚𝙣 𝙮𝙤𝙪 𝙝𝙞𝙧𝙚 𝙢𝙚, 𝙮𝙤𝙪 𝙘𝙖𝙣 𝙚𝙭𝙥𝙚𝙘𝙩:
    🔸 Outstanding results and service
    🔸 High-quality output on time, every time
    🔸 Strong communication
    🔸 Regular & ongoing updates
    Your complete satisfaction is what I aim for, so the job is not complete until you are satisfied!
Whether you are a 𝗦𝘁𝗮𝗿𝘁𝘂𝗽, 𝗘𝘀𝘁𝗮𝗯𝗹𝗶𝘀𝗵𝗲𝗱 𝗕𝘂𝘀𝗶𝗻𝗲𝘀𝘀 𝗼𝗿 𝗹𝗼𝗼𝗸𝗶𝗻𝗴 𝗳𝗼𝗿 your next 𝗠𝗩𝗣, you will get 𝗛𝗶𝗴𝗵-𝗤𝘂𝗮𝗹𝗶𝘁𝘆 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀 at an 𝗔𝗳𝗳𝗼𝗿𝗱𝗮𝗯𝗹𝗲 𝗖𝗼𝘀𝘁, 𝗚𝘂𝗮𝗿𝗮𝗻𝘁𝗲𝗲𝗱. I hope you become one of my many happy clients. Reach out by inviting me to your project. I look forward to it! All the best, Anas ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad is really great with AWS services and knows how to optimize each so that it runs at peak performance while also minimizing costs. Highly recommended! ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ You would be silly not to hire Anas, he is fantastic at data visualizations and data transformation. ❞ 🗣❝ Incredibly talented data architect, the results thus far have exceeded our expectations and we will continue to use Anas for our data projects. ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ The skills and expertise of Anas exceeded my expectations. The job was delivered ahead of schedule. He was enthusiastic and professional and went the extra mile to make sure the job was completed to our liking with the tech that we were already using. I enjoyed working with him and will be reaching out for any additional help in the future. I would definitely recommend Anas as an expert resource. ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad was a great resource and did more than expected! I loved his communication skills and always kept me up to date. I would definitely rehire again. ❞ ⭐️⭐️⭐️⭐️⭐️ 🗣❝ Anas is simply the best person I have ever come across. Apart from being an exceptional tech genius, he is a man of utmost stature. We blasted off with our startup, high on dreams and code. We were mere steps from the MVP. Then, pandemic crash. Team bailed, funding dried up. Me and my partner were stranded and dread gnawed at us. A hefty chunk of cash, Anas and his team's livelihood, hung in the balance, It felt like a betrayal. We scheduled a meeting with Anas to let him know we were quitting and request to repay him gradually over a year, he heard us out. Then, something magical happened. A smile. "Forget it," he said, not a flicker of doubt in his voice. 
"The project matters. Let's make it happen!" We were floored. This guy, owed a small fortune, just waved it away? Not only that, he offered to keep building, even pulled his team in to replace our vanished crew. As he spoke, his passion was a spark that reignited us. He believed. In us. In our dream. In what he had developed so far. That's the day Anas became our partner. Not just a contractor, but a brother in arms. Our success story owes its spark not to our own leap of faith, but from the guy who had every reason to walk away. Thanks, Anas, for believing when we couldn't.❞
    Apache NiFi
    Solution Architecture Consultation
    AWS Lambda
    ETL Pipeline
    Data Management
    Data Warehousing
    AWS Glue
    Apache Spark
    Amazon Redshift
    ETL
    Python
    SQL
    Marketing Analytics
    Big Data
    Data Visualization
    Artificial Intelligence
  • US$85 hourly
    - 15 years of experience in data science, data warehousing, business intelligence, advanced analytics, ETL/ELT, data visualization, virtualization, database programming, and data engineering.
    - Experience in machine learning, especially Customer 360, linear regression, and decision trees.
    - Specialized in end-to-end business intelligence and analytics implementations.
    - ER (Entity Relationship) modeling for OLTP and dimensional modeling (conceptual, logical, physical) for OLAP.
    - Experience running startup companies and building SaaS products, including CRM (Customer Relationship Management) and low-code data orchestration tools.
    - Experience working in Agile Scrum methodologies (2- and 3-week sprints).
    - Excellent communication skills and a good understanding of business and client requirements.
    - Good at technical documentation and POCs (Proofs of Concept).
    - Good at discussing requirements and demos with stakeholders.
    - Convert business requirements into technical design documents with pseudo-code.
    - Dedicated; works with minimal supervision.
    - Eager to learn new technologies; can explore, learn, and develop quickly on client-owned applications.
    - Expert in SQL, T-SQL, and PL/SQL, including advanced functions and features; good at database programming.
    - Good at performance tuning, clustering, indexing, partitioning, and other DBA activities.
    - DBA activities such as database backup/recovery, monitoring database health, killing long-running queries, and suggesting better tuning options.
    - Good at database programming and normalization techniques (all 3 normal forms).
    - Expert in Azure Synapse, PostgreSQL, MongoDB, DynamoDB, Google Data Studio, Tableau, Sisense, SSRS, SSIS, and more.
    - Domain knowledge in telecom, finance/banking, automobile, insurance, telemedicine, healthcare, and virtual clinical trials (CT).
    - Extensive DBA knowledge and work experience in SQL Server: login management, database backup and restore, monitoring database loads, and tuning methods.
    - Exceptionally proficient in Azure ML and regression models.
    Expertise:
    Database: Snowflake, Oracle SQL and PL/SQL (OCP certified), SQL Server, T-SQL, SAP HANA, Azure SQL Database, Azure Synapse Analytics, Teradata, MySQL, NoSQL, PostgreSQL, and MongoDB
    ETL: Azure Data Factory, dbt, SSIS, AWS Glue, Matillion CDC & ETL, Google BigQuery, Informatica PowerCenter and Cloud, ODI, DataStage, MSBI (SSIS, SSAS)
    Reporting/Visualization: Sisense, Qlik Sense, Sigma Computing, Metabase, QlikView, SSRS, Domo, Looker, Tableau, Google Data Studio, Amazon QuickSight, and Power BI
    Scripting languages: Unix, Python, and R
    Cloud services: Google Cloud Platform (BigQuery, Cloud Functions, Data Studio), MS Azure (Azure Blob Storage, Azure Function Apps, Logic Apps, Azure Data Lakehouse, Databricks, Purview, ADF, and microservices), Azure ML, AWS (RDS, EC2, S3, Amazon Redshift, Step Functions, Data Pipelines)
    Data virtualization: Denodo
    Apache NiFi
    C#
    Snowflake
    ETL
    Data Warehousing
    Business Intelligence
    Data Visualization
    Azure Machine Learning
    Qlik Sense
    Looker
    Sisense
    Microsoft Power BI
    SQL
    Tableau
  • US$40 hourly
    I am a highly experienced data science freelancer with more than 20 years of experience in the field. Throughout my career, I have developed a deep understanding of the principles and techniques of data science and have applied this knowledge to a wide range of projects. With a strong background in data analysis, machine learning, deep learning, and statistical modeling, I am able to quickly and accurately extract insights from large and complex datasets. My expertise in programming languages such as Python, R, and SQL enables me to implement these insights in a scalable and efficient manner. In addition to my technical skills, I have a passion for using data science to drive business results. I have worked with a wide range of organizations, from startups to large corporations, and have helped them use data to inform their decision-making, optimize their operations, and achieve their strategic goals. I am a self-starter with a strong work ethic. I can work independently or as part of a team, and I have excellent communication skills, which allow me to collaborate effectively with stakeholders at all levels of an organization. I have worked on engagements across multiple domains, solving numerous problem statements, including: Telecom: -Worked on a churn use case for the largest cellular company in the United States: analyzed their customer survey data, feedback, and social media data to determine customer experience and sentiment; identified and forecast demand based on customer service records and customer engagement across services; and recommended and planned marketing campaigns, offers, and new personalized packs based on customer history. Automotive: -Designed and deployed an automated application for a leading automotive testing company through which the client can see machine failure predictions ahead of time.
The application also visualizes the predicted parameter values over time with graphs and data tables, along with possible causes of and remedies for the errors. -Worked on paint shop defect analysis for the world's largest car manufacturer. BFSI: -Worked with asset management firms and investment management organizations, developing solutions for fraud detection, fraud prediction, credit risk analysis, stock prediction, investment planning, and investment portfolio analytics. -Developed an accurate cryptocurrency prediction model that can predict the rise and fall/crash of various coins based on historical data, social media data, and market research data. NLP/Chatbots: -Developed real-time chatbot applications. -Developed a custom NER model for entity recognition. Logistics: -Conceptualized and implemented route optimization algorithms for transportation and logistics companies, using a variety of customer data feeds and a complex optimization algorithm to compute and recommend the best route for fuel savings. Healthcare: -Developed and optimized novel deep-learning-based approaches to automate many aspects of medicine, including disease diagnosis and preventative medicine. -Created a deep learning neural network that processes MRI, PET-FDG, amyloid, and tau image data from the ADNI database, and developed a classification and prediction model to predict Alzheimer's disease.
    Apache NiFi
    Operations Research
    Artificial Intelligence
    Data Visualization
    Data Analysis
    Statistical Analysis
    Optimization Modeling
    AWS Application
    Large Language Model
    Generative Model
    Natural Language Processing
    Deep Learning
    Machine Learning
    Chatbot
    Python
    Data Science
  • US$25 hourly
    PROGRAMMING TECHNOLOGY EXPERTISE * Python, Django, FastAPI, Flask, Selenium, REST APIs * React.js, Next.js, Vue.js, Angular * React Native * Flutter DEVOPS & CLOUD & CYBER SECURITY EXPERTISE * AWS cloud solution design and development * OpenSearch, Elasticsearch, Kibana, and Logstash setup, configuration, and development integration * Ansible * Docker * Jenkins * GitLab-based CI/CD * Prometheus and Grafana * SIEM * Suricata/Snort * Bro (Zeek) * HashiCorp Vault * Cyber-security project development and consultation * Kong API gateway integration
    Apache NiFi
    Amazon Elastic Beanstalk
    Flutter
    React Native
    PostgreSQL Programming
    ELK Stack
    AWS CloudFront
    Amazon S3
    RESTful API
    AWS Lambda
    DevOps
    Next.js
    React
    Python
    Django
    AWS Amplify
  • US$40 hourly
    Seeking challenging tasks in the design and development of scalable backend infrastructure solutions. I work in the following domains: 1. ETL pipelines 2. Data engineering 3. DevOps and AWS (Amazon Web Services) / GCP (Google Cloud Platform) deployment 4. Machine learning I mainly design all solutions in Python and have 10+ years of experience with it. I have extensive experience with the following frameworks and libraries: Flask, Django, Pandas, NumPy, PyTorch, Scrapy, and many more. Regarding ETL pipelines, I mainly build end-to-end data pipelines using AWS, GCP, or custom frameworks, with more than 7 years of experience in this domain. I have a strong command of Scrapy and have built more than 300 crawlers to date. Regarding data warehousing, I have extensive experience in Google BigQuery and AWS Redshift, with hands-on experience handling millions of records and analyzing them using GCP and AWS data warehousing solutions. I have 5+ years of experience designing serverless applications using AWS and GCP. In addition, I am hands-on with a wide range of GCP and AWS services and provide efficient, cost-effective solutions there.
    Apache NiFi
    Data Analysis
    Apache Spark
    PySpark
    ChatGPT
    Generative AI
    AWS Glue
    Google Cloud Platform
    BigQuery
    Snowflake
    Kubernetes
    Django
    Docker
    Serverless Stack
    Python
    Scrapy
    Data Scraping
    ETL Pipeline
  • US$40 hourly
    Hi, I am Isha Taneja, highly skilled in data analytics, engineering, and cloud computing, based in Mohali, India. I am an expert in creating ETL data flows in Talend Studio, Databricks, and Python, using the best design patterns and practices to integrate data from multiple data sources. I have worked on multiple projects requiring data migration, data warehouse development, and API integration. Expertise: 1. Migration - Platform migration - Legacy ETL to modern data pipeline / Talend / ERP migration / CRM migration - Data migration - Salesforce migration / HubSpot migration / cloud migration / ERP migration 2. Data analytics - Data lake consulting - Data warehouse consulting - Data modelling / data integration / data governance / ETL - Data strategy - Data compliance / data deduplication / data reconciliation / customized data-processing framework / data streaming / API implementation / DataOps - Business intelligence - Digital marketing analysis / e-commerce analytics / ERP reporting capabilities - Big data - Lakehouse implementation 3. Software QA & testing 4. Custom application development - UI/UX - Frontend development - Backend development 5. Cloud - Cloud-native services / AWS consulting / cloud migration / Azure consulting / Databricks / Salesforce 6. Business process automation - Bi-directional sync between applications / RPA A data professional and ETL developer with 10+ years of experience working with enterprises and clients globally to define their implementation approach with the right data platform strategy, data analytics, and business intelligence solutions. My domain expertise lies in e-commerce, healthcare, HR, media & advertising, and digital marketing. You have the data? Great! I can help you analyze it using Python, including exploratory data analysis, hypothesis testing, and data visualization. You have big data? Even better!
I can help you clean, transform, store, and analyze it using big data technologies, and productionize it using cloud services such as AWS and Azure. You want to track business KPIs and metrics? No problem! I can develop reports using Tableau and Power BI to keep you ahead in your business. Specialities: Databases: Snowflake, Postgres, DynamoDB, graph DB (Neo4j), MongoDB, data warehouse concepts, MSSQL ETL tools: Talend Data Integration Suite, Matillion, Informatica, Databricks API integration: Salesforce / Google AdWords / Google Analytics / Marketo / Amazon MWS - Seller Central / Shopify / HubSpot / Freshdesk / Xero Programming: Java, SQL, HTML, Unix, Python, Node.js, React.js Reporting tools: Yellowfin BI, Tableau, Power BI, SAP BO, Sisense, Google Data Studio AWS platform: S3, AWS Lambda, AWS Batch, ECS, EC2, Athena, AWS Glue, AWS Step Functions; Azure cloud platform Other tools: Airflow Expect integrity, excellent communication in English, technical proficiency, and long-term support.
    Apache NiFi
    Databricks MLflow
    Databricks Platform
    Tableau
    Microsoft Power BI
    Data Extraction
    Talend Data Integration
    Data Analysis
    Microsoft Azure
    Continuous Integration
    AWS Lambda
    API
    Database
    Python
    SQL
    ETL
  • US$20 hourly
    Cloud data architect with 10+ years of experience * Programming: Scala, Java, Python * Web scraping: Selenium + BeautifulSoup * Big data technology: Hadoop, Spark, Hive, Impala, HBase * Streaming technology: Kafka, NiFi, Spark Streaming, Kafka Connect, Kafka Streams, Kafka SQL, Kafka REST Proxy, IBM MQ, Kafka monitoring * Reporting: Tableau, Kibana, Grafana * DB technologies: Teradata, Greenplum, SQL Server, MySQL, MongoDB, Elasticsearch (ELK) * Accomplishments: - Implemented a data warehousing solution that enables fast and accurate retrieval of data for business intelligence and analytics. - Developed and deployed data analytics and machine learning models in production. - Gold medalist
    Apache NiFi
    AWS Lambda
    ETL Pipeline
    Hive
    Apache Druid
    Data Engineering
    Amazon Redshift
    Kubernetes
    AWS Glue
    Apache Hadoop
    Elasticsearch
    SQL
    Apache Spark
    Apache Kafka
    Python
  • US$35 hourly
    I'm a senior software engineer with very good knowledge of Java and business process management. The following list shows my knowledge areas: 1. Business Process Management (BPM): Camunda, Activiti, jBPM, Bonita. 2. Business Process Model and Notation (BPMN). 3. Process modeling and workflow design. 4. Software engineering: OOP, design patterns, Agile, Scrum. 5. Java stack: Java, Java EE, JDBC, JPA, Hibernate, and the Spring Framework, including the IoC container, MVC, AOP, Security, Data, REST, Spring Boot, and JdbcTemplate. 6. Databases: Oracle (SQL, PL/SQL), SQL Server, MySQL.
    Apache NiFi
    Business Process Modeling
    jBPM
    Business Process Management
    Project Workflows
    Process Modeling
    Business Process Model & Notation
    Bonita
    Spring Boot
    SQL
    Hibernate
  • US$35 hourly
    I have over 12 years of experience in the information technology industry. I have worked in object-oriented programming using the Java and PHP programming languages, and I know relational (MySQL, PostgreSQL) and NoSQL (MongoDB, CouchDB) databases well. I can manage system administration tasks as needed on Linux-based platforms. For the last few years I've worked as a Java developer on J2EE client-server projects; before that, I worked for two years as a systems administrator at an internet service provider.
    Apache NiFi
    Core Java
    Apache Tomcat
    JDBC
    Ext JS
    Apache Struts
    Spring Framework
    jQuery
    AJAX
    HTML
    CSS
    Hibernate
    Jakarta Server Pages
    JavaScript
    SQL
    Java
  • US$35 hourly
    I have 18+ years of experience in software development in the telecom, banking, and healthcare domains. Primary skill sets include big data ecosystems (Apache Spark, Hive, MapReduce, Cassandra), Scala, core Java, Python, and C++. I am well versed in designing and implementing big data solutions, ETL and data pipelines, and serverless and event-driven architectures on Google Cloud Platform (GCP) and Cloudera Hadoop 5.5. I like to work with organizations to develop sustainable, scalable, and modern data-oriented software systems. - Keen eye for the scalability and sustainability of a solution - Can come up with maintainable, good object-oriented designs quickly - Highly experienced in working seamlessly and effectively with remote teams - Aptitude for recognizing business requirements and solving the root cause of the problem - Can quickly learn new technologies Sound experience in the following technology stacks: Big data: Apache Spark, Spark Streaming, HDFS, Hadoop MapReduce, Hive, Apache Kafka, Cassandra, Google Cloud Platform (Dataproc, Cloud Storage, Cloud Functions, Datastore, Pub/Sub), Cloudera Hadoop 5.x Languages: Scala, Python, Java, C++, C Build tools: sbt, Maven Databases: Postgres, Oracle Input and storage formats: CSV, XML, JSON, MongoDB, Parquet, ORC
    Apache NiFi
    C++
    Java
    Apache Spark
    Scala
    Apache Hadoop
    Python
    Apache Cassandra
    Oracle PLSQL
    Apache Hive
    Cloudera
    Google Cloud Platform
  • US$110 hourly
    Distributed Computing: Apache Spark, Flink, Beam, Hadoop, Dask Cloud Computing: GCP (BigQuery, DataProc, GFS, Dataflow, Pub/Sub), AWS EMR/EC2 Containerization Tools: Docker, Kubernetes Databases: MongoDB, Postgres-XL, PostgreSQL Languages: Java, Python, C/C++
    Apache NiFi
    MapReduce
    Apache Kafka
    Cloud Computing
    Apache Hadoop
    White Paper Writing
    Academic Writing
    Google Cloud Platform
    Dask
    Apache Spark
    Research Paper Writing
    Apache Flink
    Kubernetes
    Python
    Java
  • US$110 hourly
    Having an expert who is easy to talk to is the best way to save yourself headaches, time, and money. Instead of spending hours sorting through documentation and creating prototypes that are destined to fail, you can talk to me to resolve all your Elasticsearch issues at a fraction of the cost. That way you can focus on creating value rather than being stuck in the trenches. Need to organize, understand, and visualize your large amounts of streaming data on clean, easy-to-use dashboards for one or more users? I'm an Elasticsearch, Logstash, and Kibana expert here to solve your problem. My focus and expertise are in the ELK stack, and I've worked on a variety of projects implementing it, from small one-person teams to large enterprises. If you want a competent engineer who will first understand YOUR needs and second create a solution that directly addresses those needs, I'm the engineer for you. I ask a lot of questions to make sure I understand what your needs are, so you don't waste time looking at solutions that aren't what you need. I always spend time explaining things to clients clearly, so that you can fully understand what I'm doing and can take over and solve problems if you need to. I work efficiently, learn fast, and create things with integrity. Are you a social enterprise with a noble cause? Send me a message; I'd like to talk. I offer discounted rates for particular projects. Some bullet points on a few implementations I've worked on: Fully architected, designed, and implemented the ELK stack many times, organizing everything from healthcare data to solar panel grids to difficult-to-parse error logs. Created relevant and beautiful custom dashboards that teams use to visualize large data sets and monitor the health of apps and websites.
I've also developed custom plugins that interact with and manipulate the data in meaningful ways (third-party integrations with Jira, Twitter, etc.), and can install any custom Kibana plugins available on GitHub (including Vega). Managed and automated cluster installation and node health on multiple machines. Created and optimized Logstash config files with many different inputs, filters, and outputs, with email escalation for critical errors. I've played with it all: Filebeat, Metricbeat, Packetbeat, and the other Beats are all within my scope of work. I can create a search engine that does what you want it to. I can explain things in layman's terms without filling the explanation with technical jargon. I will be clear about my expectations, and I love to explain how things work so you can do it yourself. Let's work on your project together so that you can get what you want out of your data. Additional competencies: API integration with ELK: Facebook, Google Analytics, Twitter, etc. Automation of ELK: Bash and Ansible External database syncing with ELK: SQL, MongoDB Misc: ReadonlyREST, own_home, Search Guard, Graphite, alerting, SIEM, Watcher
    Apache NiFi
    Elasticsearch
    Data Ingestion
    Kibana
    Data Visualization
    Logstash
    CentOS
    Data Analysis
  • US$30 hourly
    I'm a tech data architect / senior data and systems engineer with 4+ years of experience, recognized as a top performer for four consecutive years. I am talented at solving problems, with a passion for data and extensive experience across a broad skill set: * Certified through the Splunk Admin and Data Admin certifications. * Expert in Splunk (architecture / admin and data admin / configuration / development / deployments / upgrades) for distributed Splunk environments (indexer clusters / SHCs / multisite). * Expert in developing custom alert actions and custom commands in Splunk. * Expert in developing advanced queries and custom dashboards in Splunk using JS, CSS, and advanced XML. * Expert in Apache NiFi as an admin and developer (Apache NiFi cluster installation, upgrade, and orchestration using Cloudbreak). * Expert in Cribl as an admin and developer. * Expert in GCP GCS, BigQuery, Cloud Functions, Data Fusion, and Dataflow. * Competent across the AWS and GCP cloud platforms. * Good at Splunk ITSI * Good at Splunk Google Drive APIs * Good at Splunk Google Sheets APIs * Good at Google Data Studio * Good at MongoDB Python development * Good at setting engineering standards and governance frameworks * Good knowledge of designing and building data pipelines * Good knowledge of automation techniques * Good knowledge of SQL (Oracle, Postgres, MySQL, SQL Server), PL/SQL and PL/pgSQL, data modelling, data warehousing, and query optimization * Good knowledge of Python, Bash, and regular expressions * Familiar with Azure DevOps Pipelines, Kubernetes, Docker, and orchestration tools (Airflow) * Also familiar with many SOAP/REST integrations with various modules and services.
I can help you with: ✔ Performance tuning for SQL databases ✔ MySQL / MariaDB / Postgres / Oracle / BigQuery / MongoDB / other SQL ✔ Python ✔ Linux and shell scripting ✔ Server configuration ✔ Cloud platforms (Google Cloud, Amazon AWS) ✔ Data analytics and pipelines ✔ Data pipeline monitoring framework for data and platform planes ✔ API pipeline ingestion ✔ Data cleaning and management ✔ ETL data pipeline creation ✔ High-speed data extraction and scraping ✔ Custom and interactive dashboards ✔ Data lake and data warehouse setup ✔ Data analysis I look forward to speaking with you.
    Apache NiFi
    Splunk
    Data Cleaning
    Google Cloud Platform
    PostgreSQL
    MySQL
    MongoDB
    Data Ingestion
    Looker Studio
    API Integration
    ETL
    BigQuery
    Data Extraction
    Data Analysis
    Python
  • Want to browse more freelancers?
    Sign up

How it works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How to Hire Top Apache NiFi Specialists

How to hire Apache NiFi experts

Knowledge is power, and the key to unlocking this power lies in your ability to track and manage the flow of data within your business or organization. An Apache NiFi expert can help you set up the data routing, transformation, and system mediation logic you need to master your data. 
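To make "routing, transformation, and system mediation logic" concrete, here is a toy Python sketch of the content-based routing pattern that a NiFi processor such as RouteOnAttribute applies to every flowfile. This is an illustration of the pattern, not NiFi code; the function and queue names are invented for this example.

```python
# Toy sketch of content-based routing, the pattern NiFi's
# RouteOnAttribute processor automates at scale. All names
# below are invented for illustration.

def route(record):
    """Return a destination queue name based on the record's 'source' attribute."""
    routes = {
        "sensor": "iot_queue",
        "clickstream": "analytics_queue",
    }
    # Records with no matching route fall through to a catch-all,
    # much like NiFi's "unmatched" relationship.
    return routes.get(record.get("source"), "unmatched_queue")

print(route({"source": "sensor", "temp": 21.5}))  # -> iot_queue
print(route({"source": "billing"}))               # -> unmatched_queue
```

In NiFi itself, this logic would be configured visually on the canvas rather than written by hand, which is exactly the work you would hire an expert to design and maintain.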

So how do you hire Apache NiFi experts? What follows are some tips for finding top Apache NiFi experts on Upwork. 

How to shortlist Apache NiFi consultants

As you’re browsing available Apache NiFi consultants, it can be helpful to develop a shortlist of the independent professionals you may want to interview. You can screen profiles on criteria such as:

  • Technology fit. You want an Apache NiFi expert who understands the technologies behind your products and services so they can design custom dataflow solutions for your business. 
  • Workflow. You want an Apache NiFi expert who can slide right into your existing developer workflow (e.g., Jira, Slack).
  • Feedback. Check reviews from past clients for glowing testimonials or red flags that can tell you what it’s like to work with a particular Apache NiFi expert.

How to write an effective Apache NiFi job post

With a clear picture of your ideal Apache NiFi expert in mind, it’s time to write that job post. Although you don’t need a full job description as you would when hiring an employee, aim to provide enough detail for a consultant to know if they’re the right fit for the project. 

An effective Apache NiFi job post should include: 

  • Scope of work: From tracking dataflows to creating loss-tolerant data delivery systems, list all the deliverables you’ll need. 
  • Project length: Your job post should indicate whether this is a smaller or larger project. 
  • Background: If you prefer experience working with certain industries, software, or technologies, mention this here. 
  • Budget: Set a budget and note your preference for hourly rates vs. fixed-price contracts.

Ready to automate data flow within your organization? Log in and post your Apache NiFi job on Upwork today.

APACHE NIFI SPECIALISTS FAQ

What is Apache NiFi?

Apache NiFi provides data scientists and engineers with a web-based user interface for designing and monitoring dataflows within an organization. Apache NiFi experts can automate many of the configuration and data processing tasks associated with moving data from one place to another.
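Beyond the web UI, NiFi also exposes a REST API that experts often script against for monitoring and automation. As a hedged sketch (assuming an unsecured NiFi instance at localhost:8080 and its `/nifi-api/flow/status` endpoint; field names follow NiFi's status payload but may vary by version), a basic health check might look like:

```python
import json
import urllib.request

NIFI_URL = "http://localhost:8080/nifi-api"  # assumed local, unsecured instance


def summarize_status(status_json):
    """Pull queue depth and active thread count out of a flow-status payload."""
    cs = status_json["controllerStatus"]
    return {"queued": cs["queued"], "active_threads": cs["activeThreadCount"]}


def fetch_status():
    """GET the flow status from a running NiFi instance (requires NiFi to be up)."""
    with urllib.request.urlopen(f"{NIFI_URL}/flow/status") as resp:
        return summarize_status(json.load(resp))


# The parser can be exercised offline against a sample payload,
# so no NiFi instance is needed to run this file:
sample = {"controllerStatus": {"queued": "12 (4.5 KB)", "activeThreadCount": 3}}
print(summarize_status(sample))  # {'queued': '12 (4.5 KB)', 'active_threads': 3}
```

Scripts like this are how NiFi professionals wire dataflow health into external monitoring rather than watching the canvas by hand.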

Here’s a quick overview of the skills you should look for in Apache NiFi professionals:

  • Apache NiFi
  • Data science and/or data engineering
  • Big data tech such as Hadoop, Spark, and AWS
  • Back-end languages such as Python and SQL

Why hire Apache NiFi experts?

The trick to finding top Apache NiFi experts is to identify your needs. Are you looking to connect raw marketing data logs to an Amazon Kinesis Data Firehose endpoint for real-time marketing analytics? Or do you need help directing data from a fleet of IoT devices to your SaaS platform? 
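For the Kinesis Data Firehose scenario, the delivery call itself is small; the hard part an expert brings is the pipeline around it. A minimal sketch using boto3's `put_record` (the stream name and event fields are made up for the example; newline-delimited JSON is the usual framing so downstream consumers can split batched records):

```python
import json


def encode_record(event):
    """Frame one event as newline-delimited JSON bytes, the common
    convention for records sent to Kinesis Data Firehose."""
    return (json.dumps(event, separators=(",", ":")) + "\n").encode()


def send_to_firehose(stream_name, event):
    """Deliver one event to a Firehose delivery stream.
    Requires boto3 and AWS credentials; the stream name is an example."""
    import boto3  # imported lazily so the encoder works without AWS installed

    firehose = boto3.client("firehose")
    firehose.put_record(
        DeliveryStreamName=stream_name,
        Record={"Data": encode_record(event)},
    )


print(encode_record({"campaign": "spring", "clicks": 42}))
# b'{"campaign":"spring","clicks":42}\n'
```

In a NiFi flow, the equivalent delivery would typically be a processor at the end of the canvas, with NiFi handling the batching, back pressure, and retries that this sketch omits.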

The cost of your project will depend largely on your scope of work and the specific skills needed to bring your project to life. 

How much does it cost to hire an Apache NiFi consultant?

Rates can vary due to many factors, including expertise and experience, location, and market conditions.

  • An experienced Apache NiFi consultant may command higher fees but also work faster, have more-specialized areas of expertise, and deliver higher-quality work.
  • A consultant who is still in the process of building a client base may price their Apache NiFi services more competitively. 

Which one is right for you will depend on the specifics of your project.

Schedule a call