Hire the best Apache NiFi Developers in India
Check out Apache NiFi Developers in India with the skills you need for your next job.
- $40 hourly
- 5.0/5
- (9 jobs)
Senior Big Data Engineer | Solution Architect | Hadoop | Spark | AWS | Azure | Python | Scala
🚀 Top-Rated Data Engineer with 7+ Years of Experience | 100% Job Success Rate
Specialized in Big Data Solutions, Data Pipelines, and Cloud Architecture
What I Deliver:
- Big Data Processing: Apache Spark, Hadoop, Kafka, NiFi data pipelines
- Cloud Solutions: AWS (EMR, S3, Glue, Lambda, Redshift), Azure (Data Factory, Databricks, Synapse, Pipelines, DevOps, Artifactory, Storage), GCP (Cloud Functions, Storage, Dataflow)
- Data Engineering: ETL/ELT pipelines, real-time streaming, data warehousing, data lakes
- Database Management: PostgreSQL, MySQL, MongoDB, Oracle
- DevOps & Automation: Terraform, Ansible, CI/CD pipelines
Recent Client Testimonials:
"Abha is a star; have successfully handled the project in a very professional manner. I will definitely be working with Abha again; I am very happy with the quality of the work. 🙏"
"One of the most talented programmers I have ever met on Upwork. Her communication was top-notch, she met all deadlines, a skilled developer and super fast on any task. Perfect work is done. Would re-hire and highly recommended!!"
Why Choose Me:
✅ Fortune 500 Experience: Delivered solutions for Marriott, P&G, Vodafone UK, eXate UK
✅ Industry Expertise: Energy, IoT, Hospitality, Retail, Ad-tech, Data Security
✅ Fast Delivery: Quick turnaround without compromising quality
✅ 24/7 Communication: Regular updates and transparent project management
✅ Agile Methodology: Scrum, sprint planning, daily standups
Core Technologies:
Big Data: Apache Spark (Scala/Python) | Hadoop | Kafka | NiFi | Airflow
Cloud Platforms: AWS | Azure | Google Cloud Platform
Databases: PostgreSQL | MySQL | MongoDB | Oracle | Elasticsearch
Visualization: Kibana | Grafana | Power BI
DevOps: Docker | Terraform | Ansible | Jenkins
Services Offered:
- Data Pipeline Development & Optimization
- Cloud Migration & Architecture
- Real-time Data Streaming Solutions
- Data Warehouse Design & Implementation
- Performance Tuning & Optimization
- Technical Documentation & Training
Ready to transform your data challenges into business opportunities. Let's discuss your project requirements today! Available for both short-term projects and long-term partnerships. Flexible with time zones and committed to delivering exceptional results.
Skills: Apache NiFi, PySpark, Databricks Platform, ETL Pipeline, Big Data, Grafana, Kibana, Apache Kafka, Apache Spark, PostgreSQL, Microsoft Azure, MongoDB, Scala, Python, Elasticsearch, Google Cloud Platform, Amazon Web Services
- $35 hourly
- 5.0/5
- (11 jobs)
Shaikh is an experienced Certified Cloud Data Engineer with over three years of expertise in designing end-to-end ETL pipelines. He is passionate about unlocking the value of data and believes in its power to drive business growth. His skills are rooted in his experience working with Google Cloud Platform (GCP). Shaikh can help you leverage GCP services such as BigQuery, Bigtable, Data Studio (now Looker Studio), Cloud Functions, Cloud Storage, Cloud Scheduler, Scheduled Queries, Cloud SQL, Dataflow, Datafusion, and more. His expertise can empower your organization to efficiently manage and analyze large datasets, improve data-driven decision-making, and derive valuable insights.
Skills: Apache NiFi, ETL Pipeline, Apache Beam, Google Analytics, Microsoft PowerPoint, BigQuery, Databricks Platform, Google Cloud Platform, Snowflake, Apache Spark, SQL, Google Sheets, Python
- $20 hourly
- 3.4/5
- (81 jobs)
✓ Technology executive specializing in architecting and implementing highly scalable solutions to drive brand awareness, increase revenues, optimize productivity, and improve margins.
✓ Overseeing the data, security, maintenance, and network for a company.
✓ Implementing the business's technical strategy and managing the overall technology roadmap of the business.
✓ Involved with talent acquisition and the onboarding, training, and management of project managers, product managers, developers, DevOps engineers, and designers.
✓ Setting the technical strategy for the company to enable it to achieve its goals.
✓ Seeking out the current and future technology that will drive the company's success.
✓ Focus on strategic alignment of technology goals to organizational vision.
✓ Passionately committed to technology team development, empowering people to accomplish their goals and coaching them to realize their individual potential.
✓ Proven track record of success in technology product development, cloud infrastructure, building data platforms, ETL pipelines, streaming pipelines, e-commerce, CRM, mobile strategy, and social media integration.
I have been working for the last 8 years with Apache Spark, Lucene, Elasticsearch/Kibana, Amazon EC2, RDBMSs (SQL, MySQL, Aurora, PostgreSQL, Oracle), NoSQL engines (Hadoop/HBase, Cassandra, DynamoDB, MongoDB), graph databases (Neo4j, Neptune), in-memory databases (Hazelcast, GridGain), Apache Spark/MLlib, Weka, Kafka, clustered file systems, and general-purpose computing on GPUs, as well as deploying ML/DL models on GPU instances (NVIDIA). I have great experience in query optimization, application profiling, and troubleshooting.
My areas of expertise include:
- Python scripting
- Jira, Trello, Azure DevOps
- Web scraping
- AWS (Redshift, Glue, ECS, EC2, EMR, Kinesis, S3, RDS, VPC, IAM, DMS)
- GCP (BigQuery, Dataflow, SnowFlow)
- Microsoft Azure
- Hadoop Big Data
- Elasticsearch/Kibana/Logstash (ELK)
- Hadoop setup on standalone, Cloudera, and Hortonworks
- SQL databases like MySQL and PostgreSQL
- NoSQL databases like HBase and MongoDB
- Machine learning
- Deep learning
- Spark with MLlib, GraphX
- Sphinx
- Memcache
- MS BI/Tableau/GDS
Skills: Apache NiFi, Big Data, Kibana, Apache Cassandra, AWS CodeDeploy, MongoDB, Golang, Elasticsearch, Apache Kafka, Apache Hive, Apache Pig, MapReduce, Machine Learning, Python, Apache Spark
- $35 hourly
- 5.0/5
- (32 jobs)
Seasoned data engineer with over 11 years of experience in building sophisticated and reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork.
🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years.
🏆 The main areas of expertise are:
📍 Big Data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, Trino, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS RedShift, AWS SQS, AWS RDS, AWS EMR
📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsights, Azure SQL
📍 Google Cloud Services - GCP DataProc
📍 Search Engine - Apache Solr
📍 NoSQL - HBase, Cassandra, MongoDB
📍 Platform - Data Warehousing, Data Lake
📍 Visualization - Power BI
📍 Distributions - Cloudera
📍 DevOps - Jenkins
📍 Accelerators - Data Quality, Data Curation, Data Catalog
Skills: Apache NiFi, SQL, AWS Glue, PySpark, Apache Cassandra, ETL Pipeline, Apache Hive, Apache Kafka, Big Data, Apache Hadoop, Scala, Apache Spark
- $50 hourly
- 5.0/5
- (6 jobs)
I am a qualified and professional software engineer. I have proactively taken the lead in the design, analysis, and implementation of critical large-scale projects, such as a search tool over petabytes of data, collaborating with multiple teams and leveraging multiple technologies to build the system. I have also built terabyte-scale data pipelines that ingest logs from sources such as AWS S3, SQS, Apache Kafka, and more via Apache NiFi into a monitoring tool (Splunk).
Skills: Apache NiFi, Amazon S3, Amazon DynamoDB, MATLAB, C, Apache Airflow, Terraform, Jira, MySQL, C++, Ansible, Python, JavaScript, Data Scraping
- $35 hourly
- 4.9/5
- (8 jobs)
Data Engineering & Expert Technology Consultant
✔️ Data Engineering - Airflow, PySpark, Redis, Jenkins, SQL, Kafka
✔️ Machine Learning - Data Analysis (Segmentation, Regression), Ensemble Learning, NLP (SpaCy, NLTK), Clustering
✔️ Data Scraping - Selenium, BS4
✔️ Languages & Databases - Python, Scala, SQL, MongoDB
✔️ Microservices - Flask, Django
✔️ AWS - SNS, S3, Athena, CloudWatch, Glue
I have a history of building scalable software systems, end-to-end Machine Learning solutions, and Data Analysis architectures, and of providing mentorship on various technologies. You are just a click away from a success story created for your business.
Skills: Apache NiFi, Redis, AWS Lambda, MongoDB, Software Development, PySpark, Natural Language Processing, Apache Kafka, Apache Airflow, ETL Pipeline, SQL, Python, Microsoft Excel
- $40 hourly
- 5.0/5
- (6 jobs)
I am Aliabbas Bhojani, a Data Engineer with profound knowledge and experience in the core areas of Data Engineering, Big Data Processing, and Cloud Data Architecture. I completed my Bachelor of Engineering with a specialisation in Computer Engineering, which has helped me tackle complex data problems, and I have proven my expertise by proposing high-performance cloud data architectures that help businesses scale. I'm very familiar with a wide variety of web platforms and infrastructure, so don't be afraid to run something by me for things like Apache Spark, Apache NiFi, Kafka, Apache Accumulo, Apache HBase, ZooKeeper, REST APIs, Java, Python, Scala, and JavaScript. I can work on your on-prem or cloud-deployed solution, whether that means setting up Kubernetes, Docker, or VMs on Azure, Amazon Web Services (AWS), or Google Cloud Platform (GCP).
Wide Spectrum of Offering:
- Data Engineering Core Values
- Data-Driven Business Intelligence
- Automated Real-Time Data Pipelines
- Advanced Machine Learning-based Data Analytics
- Relational and Non-Relational Data Modelling
- Cloud-Native Data Products
- Big Data Handling with Apache Spark and Apache NiFi
- Open Source Data Tools Usage and Mindset
- AWS Cloud Data Architecture and Engineering
- Azure Cloud Data Architecture and Engineering
- GCP Cloud Data Architecture and Engineering
- Scaling Data Pipelines with Kubernetes and Docker
- No-Downtime Data Pipelines Using a Cloud-Agnostic Approach
Feel free to reach out with any inquiries or to discuss a project.
Aliabbas Bhojani
Skills: Apache NiFi, Snowflake, Cloud Architecture, Data Lake, Apache Accumulo, ETL, DevOps, Machine Learning, PySpark, Apache Spark, Python, Java, SQL, Data Engineering, Apache Hadoop
- $15 hourly
- 5.0/5
- (3 jobs)
I am a Big Data Engineer with expertise in Hadoop, Cloudera, and Hortonworks distributions, along with proficiency in Azure Data Services. I have good experience with popular, trending tools and technologies: Azure (Azure Data Factory, Azure Logic Apps, Azure Function Apps, Azure Event Hub, Azure Service Bus, Azure SQL DB) and Apache (Apache Spark, Apache NiFi, Apache Kafka, Apache Hive). I have strong knowledge of programming languages such as Java, Scala, and Python, and good knowledge of SAP processes.
Skills: Apache NiFi, Microsoft Azure, ETL Pipeline, Apache Cassandra, Apache Hive, Apache Hadoop, Database Design, Apache Spark, Apache Kafka, Elasticsearch
- $50 hourly
- 0.0/5
- (0 jobs)
Experienced ML Engineer with expertise in MLOps, model deployment, and data analysis. Skilled in designing and deploying data pipelines (ETL) for efficient data processing. Proficient in leveraging machine learning algorithms to solve complex problems and optimize model performance. Strong background in model evaluation and hyperparameter tuning. Experienced in deploying ML models in production environments.
Skills: Apache NiFi, Apache Spark, Apache Airflow, Data Management, Data Lake, Data Warehousing & ETL Software, Data Engineering, Data Analysis, Model Deployment, MLOps
- $50 hourly
- 0.0/5
- (0 jobs)
Profile Summary
Experienced PySpark Developer with strong data warehousing expertise, specializing in ETL development using AWS Glue, Redshift, and Snowflake. Skilled in automating NiFi workflows with NIPYAPI and integrating diverse data sources using Python, Pandas, and REST APIs.
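For readers unfamiliar with NIPYAPI (a Python client for NiFi's REST API), here is a minimal, hedged sketch of the kind of workflow automation described above. The NiFi URL and the process-group name are hypothetical placeholders, and the nipyapi call names should be verified against the installed library version.

```python
# Minimal sketch (assumptions noted above): start a named NiFi process
# group through the nipyapi client instead of the NiFi UI.
import nipyapi

# Hypothetical NiFi endpoint; adjust host/port and security for your cluster.
nipyapi.config.nifi_config.host = "http://localhost:8080/nifi-api"

# Look up a process group by name ("daily_ingest" is a made-up example)
# and schedule it, i.e. start all of its processors.
pg = nipyapi.canvas.get_process_group("daily_ingest", identifier_type="name")
if pg is not None:
    nipyapi.canvas.schedule_process_group(pg.id, scheduled=True)
    print("Started process group:", pg.component.name)
else:
    print("Process group 'daily_ingest' not found")
```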
Skills: Apache NiFi, Machine Learning, Artificial Intelligence, ETL, Data Extraction, pandas, Amazon S3, PySpark, Django, FastAPI, Apache Airflow, Snowflake, MySQL Programming, AWS Glue, Python
- $200 hourly
- 0.0/5
- (0 jobs)
Summary
With 4 years of experience in the analysis, design, and development of flows, I have honed my expertise in crafting scalable and efficient data integration solutions. I've also worked as a support engineer, creating and maintaining robust API endpoints. My experience spans multiple projects, with a focus on agile methodologies, enabling me to deliver results quickly and iteratively. I bring a deep understanding of technology and a keen ability to translate business requirements into enterprise-level solutions. I take great pride in owning the process from start to finish, ensuring that every project is executed flawlessly. When I'm at the helm, you can rest assured that things will run smoothly!
* Good experience in the Banking and Financial and Life Science domains.
* Experience in identifying and fixing issues at the earliest stage.
* Design, develop, and maintain scalable and efficient data pipelines using Apache NiFi.
Skills: Apache NiFi, SQL
- $30 hourly
- 5.0/5
- (42 jobs)
⭐⭐⭐⭐⭐ 𝟓 𝐒𝐭𝐚𝐫 𝐟𝐞𝐞𝐝𝐛𝐚𝐜𝐤 | ✅ 𝟑𝟓 𝐂𝐨𝐦𝐩𝐥𝐞𝐭𝐞𝐝 𝐏𝐫𝐨𝐣𝐞𝐜𝐭
I am a Software Architect with over 10 years of professional experience in Web Application Development.
⭐𝐃𝐨𝐦𝐚𝐢𝐧 𝐄𝐱𝐩𝐞𝐫𝐭𝐢𝐬𝐞
✓ E-Commerce
✓ Fintech/Banking
✓ Fleet management
✓ Ride sharing
⭐𝐓𝐨𝐨𝐥𝐬 & 𝐓𝐞𝐜𝐡𝐧𝐨𝐥𝐨𝐠𝐢𝐞𝐬
𝐀𝐈:
✓ ChatGPT
✓ Llama 2
✓ Langchain
✓ LlamaIndex
✓ VectorDB (Milvus, Pinecone, Weaviate, Chroma, Faiss)
𝐁𝐚𝐜𝐤𝐞𝐧𝐝 𝐓𝐞𝐜𝐡𝐧𝐨𝐥𝐨𝐠𝐢𝐞𝐬:
✓ Spring Boot Microservices, Spring Boot Cloud, Spring Security
✓ AWS Integration (Elastic Beanstalk, EC2, RDS, Elastic Search, etc.)
✓ Elastic Search / Integration / Querying
✓ Netflix Eureka, Netflix Ribbon, Netflix Feign, Docker
✓ RabbitMQ (Spring-AMQP), Kafka (Spring-Kafka and Kafka Stream), Spring Batch
✓ REST APIs integration
𝐃𝐀𝐓𝐀 𝐄𝐧𝐠𝐢𝐧𝐞𝐞𝐫𝐢𝐧𝐠:
✓ Apache Kafka, Apache Flink, AWS Glue, AWS Athena
𝐃𝐚𝐭𝐚 𝐕𝐢𝐬𝐮𝐚𝐥𝐢𝐳𝐚𝐭𝐢𝐨𝐧:
✓ Grafana, Tableau, QuickSight
𝐈𝐧𝐭𝐞𝐠𝐫𝐚𝐭𝐢𝐨𝐧:
✓ Zapier | Hubspot | Tray.io | Getstream
𝐃𝐀𝐓𝐀𝐁𝐀𝐒𝐄:
✓ MySQL | Postgres | MongoDB | AWS DynamoDB | AWS Timestream | AWS Neptune | AWS Redshift | Cassandra
𝐕𝐞𝐫𝐬𝐢𝐨𝐧 𝐜𝐨𝐧𝐭𝐫𝐨𝐥:
✓ SVN | Git (GitHub, GitLab, Bitbucket) | Mercury
𝐅𝐫𝐨𝐧𝐭 𝐄𝐧𝐝 𝐓𝐞𝐜𝐡𝐧𝐨𝐥𝐨𝐠𝐢𝐞𝐬:
✓ React JS | GraphQL | RxJS | Redux
𝐂𝐈-𝐂𝐃:
✓ Jenkins | Bitbucket Pipelines | GitLab Pipelines
𝐏𝐚𝐲𝐦𝐞𝐧𝐭 𝐆𝐚𝐭𝐞𝐰𝐚𝐲:
✓ Stripe | Square | PayPal | Chargebee
I respect the time you have spent visiting my profile.
💬 You can DM me or invite me if you think I fit your requirements. Looking forward to connecting with you. Thank you.
Skills: Apache NiFi, Tableau, Grafana, Amazon Kinesis Video Streams, ClickHouse, Apache Druid, Python, Apache Flink, AWS Glue, Apache Kafka, Java, Spring Framework
- $20 hourly
- 4.1/5
- (25 jobs)
I have 12 years of experience in DevOps. I'm able to build data pipelines on any cloud platform or even on-premises. I am an experienced Python developer (also proficient in TypeScript, R, and SQL) with a data science background, experienced in building batch and streaming data pipelines and infrastructure, and in bringing machine learning models to production. Familiar with common data tech stacks, architecture design, DevOps, and ML (Data Science), including Airflow, Kafka, Hadoop, PySpark, Kubernetes, AWS, and Google Cloud Platform.
The different services I can provide are the following:
- Machine learning / deep learning algorithms
- Database migration (BigQuery, Redshift, Postgres, MySQL, etc.)
- Data warehouse architecture
- ETL development (Python, Scala, Spark)
- Data pipeline architecture and deployments (Airflow, Kubernetes)
- Application containerization (Docker, Kubernetes)
- Big data processing using Spark Scala
- Building large-scale ETL
- Cloud management
- Distributed platform development
- Machine learning
- Python programming
- Algorithm development
- Data conversion (Excel to CSV, PDF to Excel, CSV to Excel, Audio)
- Data mining
- CI/CD
- Data extraction
- ETL data transformation
- Data cleansing
- OCR (Optical Character Recognition w/ Tesseract)
- Linux server administration
- Anaconda Python / Conda / Miniconda administration
- LXC/LXD virtualization / Linux containers
- Website & data migrations
I am highly attentive to detail, organised, efficient, and responsive. Let's get to work! 💪 Since my profile is brand new, I offer my services at a low price while I build my reputation.
Skills: Apache NiFi, API, Docker, Video Stream, System Administration, Apache Airflow, Amazon Web Services, Amazon Redshift, Artificial Intelligence, Machine Learning, Python, Deep Learning, Natural Language Processing
- $15 hourly
- 0.0/5
- (1 job)
Experienced Data Engineer | Azure & Big Data Enthusiast 🚀
I am a Data Analytics Engineer with expertise in Big Data, Cloud Technologies, and Data Engineering. Over the past few years, I have worked on diverse data projects, optimizing pipelines, improving query performance, and architecting scalable solutions.
🔹 What I Can Do for You:
✅ Data Engineering & Big Data Solutions – Implement end-to-end ETL/ELT pipelines using Snowflake, DBT, Synapse, Spark, Kafka, Hadoop, Hive, Azure Data Factory, and Databricks.
✅ Cloud Data Solutions – Migrate and optimize databases from on-prem SQL Servers & Oracle to cloud platforms like Azure & Snowflake.
✅ Data Pipeline Optimization – Reduce execution time and enhance performance with optimized SQL, DBT, and Python-based transformations.
✅ Data Visualization & Reporting – Build insightful Power BI dashboards to provide meaningful business analytics.
✅ Generative AI & Automation – Deploy AI-driven frameworks and automation solutions using Azure OpenAI, prompt engineering, and Python.
✅ Streaming & IoT Data Processing – Design real-time streaming solutions using Apache Kafka, Spark Streaming, and Event Hub.
🔹 Why Work with Me?
✔ Microsoft Certified (DP-600, DP-203, DP-900, AZ-900)
✔ Hands-on experience with real-world enterprise projects
✔ Proven track record in performance optimization & automation
✔ Efficient problem solver with a keen eye for detail
Let's discuss how I can help you streamline your data workflows and optimize your business processes. Looking forward to collaborating! 📩 Contact Me Today!
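As a concrete illustration of the real-time streaming work mentioned above, here is a minimal, hedged PySpark Structured Streaming sketch that reads from a Kafka topic and prints records to the console. The broker address and topic name are hypothetical, and running it requires the matching spark-sql-kafka connector package for your Spark version.

```python
# Minimal sketch (assumptions noted above): consume a Kafka topic with
# Spark Structured Streaming and echo the raw key/value pairs.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "iot-events")                 # hypothetical topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Write each micro-batch to the console; replace with a real sink in practice.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```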
Skills: Apache NiFi, NoSQL Database, ETL Pipeline, Databricks Platform, Snowflake, Microsoft Azure, Microsoft Power BI, Azure OpenAI Service, Sqoop, Machine Learning, Apache Kafka, Apache Hadoop, SQL, Apache Hive, Python
- $40 hourly
- 2.5/5
- (4 jobs)
I am a highly experienced data science freelancer with over 6 years of experience in the field. Throughout my career, I have developed a deep understanding of the principles and techniques of data science and have applied this knowledge to a wide range of projects. With a strong background in data analysis, machine learning, deep learning, and statistical modeling, I am able to quickly and accurately extract insights from large and complex datasets. My expertise in programming languages such as Python and SQL enables me to implement these insights in a scalable and efficient manner.
In addition to my technical skills, I have a passion for using data science to drive business results. I have worked with a wide range of organizations, from startups to large corporations, and have helped them to use data to inform their decision-making, optimize their operations, and achieve their strategic goals. I am a self-starter with a strong work ethic. I am able to work independently or as part of a team, and I have excellent communication skills, which allow me to effectively collaborate with stakeholders at all levels of an organization.
I have worked on various engagements across multiple domains, solving numerous problem statements, including:
Telecom:
- Worked on a churn use case for the largest cellular company in the United States; analyzed their customer survey data, feedback, and social media data to determine customer experience and sentiment; demand identification and forecasting based on customer service records and customer engagement across services; recommendation and planning of marketing campaigns, offers, and new personalized packs for customers based on customer history.
Automotive:
- Designed and deployed an automated application for a leading automotive testing company through which the client can see machine failure predictions ahead of time. The application also visualizes the predicted parameter values over time with graphs and data tables, along with the possible causes of and remedies for the errors.
- Worked on paint shop defect analysis for the world's largest car manufacturer.
BFSI:
- Worked with asset management firms and investment management organizations, developing solutions for fraud detection, fraud prediction, credit risk analysis, stock prediction, investment planning, and investment portfolio analytics.
- Developed an accurate cryptocurrency prediction model that can predict the rise and drop/crash of various coins based on historical data, social media data, and market research data.
NLP/Chatbots:
- Developed real-time chatbot applications.
- Developed a custom NER model for entity recognition.
Logistics:
- Conceptualized and implemented route optimization algorithms for transportation and logistics companies, using a variety of customer data feeds and a complex optimization algorithm to compute and recommend the best route for fuel savings.
Healthcare:
- Developed and optimized novel deep learning-based approaches to automate many aspects of medicine, including disease diagnosis and preventative medicine.
- Created a deep learning neural network that processes MRI, PET-FDG, Amyloid, and Tau image data from the ADNI database, and developed a classification and prediction model to predict Alzheimer's disease.
Skills: Apache NiFi, Deep Learning Modeling, Docker, Plotly, Artificial Intelligence, Data Visualization, Matplotlib, Flask, Natural Language Processing, Python, PyTorch, NumPy, pandas, Computer Vision, Machine Learning Model
- $15 hourly
- 0.0/5
- (2 jobs)
Hi there, I'm a highly skilled and versatile cloud data engineer and DevOps engineer with over 6 years of experience in the field. My expertise lies in designing, building, and managing cloud infrastructure, data pipelines, and DevOps processes.
Throughout my career, I've gained extensive experience working with a wide range of cloud computing platforms, including AWS, GCP, Azure, and Oracle Cloud. I'm well-versed in a variety of DevOps tools and scripting languages, including Jenkins, AWS CodePipeline, Terraform, Ansible, Bash/shell scripting, and Python scripting, to name a few.
My areas of specialization include containerization and orchestration using Docker and Kubernetes, as well as designing and implementing CI/CD pipelines for cloud-based applications. I have worked with various databases, including MySQL, Postgres, MongoDB, and Redis, and have experience with a variety of monitoring tools, such as Prometheus, Grafana, Kibana, and CloudWatch.
DevOps tools & scripting languages that I've worked on:
Cloud Computing: AWS, Azure, Digital Ocean, GCP, Ali Cloud, Heroku, Oracle Cloud, OpenStack
CI/CD: Jenkins, AWS CodePipeline, AWS CodeDeploy, AWS CodeBuild, GCP Cloud Build, GitLab CI/CD, Bitbucket CI/CD, Bamboo, GitHub CI/CD, Heroku CI/CD
Infrastructure as Code: Terraform, CloudFormation, Ansible
Scripting/Automation: Bash/shell scripting, Python scripting
Configuration Management Tools: Ansible, Puppet
Monitoring: Check_MK, Nagios, Prometheus and Grafana, Elasticsearch, Kibana, Logstash, CloudWatch, CloudTrail, Zabbix, Cacti, Pingdom, Datadog, Azure Monitor
Containerization and Orchestration: Docker, Kubernetes, AWS EKS, AWS ECS, GCP GKE, Azure AKS, Azure ACS, OpenShift
Web Servers: Nginx, Apache, Apache Tomcat, HAProxy, IIS
Application Servers: PM2, Puma, Node.js, PHP, RoR, .NET, Python/Django, WildFly/JBoss, .NET Core, Java
Version Control Systems: Bitbucket, AWS CodeCommit, Azure Repos, GitLab, GitHub
Databases: MySQL, Postgres, MongoDB, MariaDB, Redis, Memcache, Memory Store, AWS RDS, MS SQL
In addition to my technical skills, I have excellent communication and collaboration abilities, which enable me to work effectively with cross-functional teams, including developers, data scientists, and product managers. I'm also a proactive problem-solver who is always looking for ways to optimize cloud resources, increase efficiency, and reduce costs.
I'm a results-driven professional who takes pride in delivering high-quality work and exceeding client expectations. I'm excited to bring my skills and expertise to your project and help you achieve your business goals. Thank you for considering my profile.
Skills: Apache NiFi, Grafana, Apache Spark, AWS CodePipeline, CI/CD, Apache Kafka, Serverless Stack, AWS Lambda, Amazon Redshift, Docker Compose, AWS Glue, Amazon RDS, Kubernetes, Python, Docker
- $50 hourly
- 4.4/5
- (1 job)
SUMMARY
9+ years of Data Engineering and Data Science experience in the Banking and Retail sectors, with expertise in conceptualizing and implementing large-scale data pipelines using Agile methods that have significantly impacted business revenues and user experience. Led several ETL data lake pipeline and Machine Learning initiatives involving the design, development, and deployment of advanced algorithms, including cloud infrastructure management.
Skills: Apache NiFi, Cloud Computing, Amazon Web Services, IaaS, Batch Processing Framework, Microservice, PySpark, Python, Apache Airflow
- $15 hourly
- 0.0/5
- (1 job)
An enthusiast who loves to work in a creative environment, aiming to broaden my horizons.
Skills: Apache NiFi, n8n, Airtable, FastAPI, ClickHouse, pandas, Real Time Stream Processing, dbt, Kubernetes, Python, Apache Airflow, SQL, Apache Kafka, Docker
- $25 hourly
- 0.0/5
- (1 job)
Data Architecture, Data Engineering, Hadoop Administration, Big Data, Data Security, Docker, Kubernetes
Skills: Apache NiFi, ETL Pipeline, ETL, Apache Hadoop, Big Data, Platform Migration, Data Cleaning, Data Extraction, Data Engineering
- $5 hourly
- 0.0/5
- (0 jobs)
Experienced Frontend Developer at Ainqa with proficiency in HTML, CSS, JavaScript, React.js, ArangoDB, Git, Node.js, IoT, drones, robotics, and Arduino.
Skills: Apache NiFi, ArangoDB, CSS 3, HTML, Compiler, Web UI, JavaScript, Git, Material UI, React
- $20 hourly
- 0.0/5
- (0 jobs)
Hi, I'm Lavesh, a Big Data enthusiast specializing in Apache NiFi, Kafka, and end-to-end data workflow orchestration. I work at Ksolves, a publicly listed and CMMI Level 3 certified IT firm, where I support data engineering teams by streamlining data pipelines and optimizing flow-based architectures.
What I Bring to Your Project:
- Expertise in Apache NiFi: Data ingestion, routing, transformation, and delivery
- Integration with Apache Kafka, Spark, and other Big Data tools
- Real-time and batch processing design
- Data security, provenance, and system reliability
- Functional support, flow optimization, and issue triaging
- Documentation and use-case mapping
Whether you're looking to implement NiFi from scratch or optimize an existing data flow system, I can help you architect clean, scalable, and maintainable pipelines for your business. Let's connect if you're looking to take control of your data workflows.
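For context on the kind of NiFi operational support described above, here is a minimal, hedged Python sketch that polls NiFi's REST API for overall flow health (queued FlowFiles and active threads). The base URL is a placeholder for an unsecured local instance, and the endpoint path and response fields should be checked against the REST API documentation for your NiFi version.

```python
# Minimal sketch (assumptions noted above): check overall NiFi flow status
# via the REST API, e.g. as a lightweight health probe for a data flow.
import requests

NIFI_API = "http://localhost:8080/nifi-api"  # hypothetical, unsecured instance

resp = requests.get(f"{NIFI_API}/flow/status", timeout=10)
resp.raise_for_status()
status = resp.json().get("controllerStatus", {})

print("Active threads   :", status.get("activeThreadCount"))
print("FlowFiles queued :", status.get("flowFilesQueued"))
print("Bytes queued     :", status.get("bytesQueued"))
```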
Skills: Apache NiFi, Real-Time Strategy, Stream Processing Framework, Data Flow Diagram, Data Engineering, Apache Spark, ETL Pipeline, Apache Kafka, Data Interpretation, Data Integration, Big Data
- $25 hourly
- 0.0/5
- (0 jobs)
My profile provides a comprehensive overview of my technical capabilities, but to summarize, I specialize in front-end development with a growing focus on DevOps methodologies. With extensive experience in crafting websites tailored for small and medium-sized businesses, I bring a unique blend of skills aimed at delivering efficient and user-centric digital solutions.
Skills: Apache NiFi, Apache Airflow, Material Design, Git, React, JavaScript, jQuery, Bootstrap, SCSS, CSS 3, CSS, HTML
- $20 hourly
- 0.0/5
- (0 jobs)
EXPERIENCE OVERVIEW
* 20 years of overall experience in the software industry, broken down as below.
* 15 years of development experience in Perl/Python on Linux/Unix platforms, with major exposure to projects dealing with web pages, data parsing, custom reporting, and ServiceNow integration.
* 4+ years of experience as lead architect in the integration domain.
* 2+ years of experience integrating ServiceNow with Apache NiFi.
* Very comfortable with Perl, Python, object-oriented Perl, CGI, and mod_perl on Linux, as well as database design.
* Experience in integration with ServiceNow; familiar with ServiceNow development.
* Familiar with AWS cloud for future projects. Experience in Python.
* Experience in mod_perl and CGI.
* Well versed in agile methodology.
* Familiar with shell scripting, Awk, ServiceNow, Ruby, RoR, Chef, Python (data science), HTML, and jQuery.
* Recently getting familiar with Golang.
Skills: Apache NiFi, Scrum, Git, Linux, Web Application, Jenkins, Data Extraction, ETL, ServiceNow, Python, Perl
- $20 hourly
- 0.0/5
- (0 jobs)
I'm a seasoned software developer with 15+ years of experience in building high-performance, scalable applications. My expertise lies in handling high-volume, high-speed financial data, particularly in the stock market domain across NSE, BSE, DGCX, and SGX.
Key Expertise:
* Financial & Stock Market Systems: Extensive experience in developing trading platforms and trading analysis tools.
* Enterprise-Grade Solutions: Architecting and implementing robust, real-time systems for high-frequency data processing.
* ETL Development: Proficient in designing and implementing ETL tools, with hands-on experience in multiple ETL projects.
Technology Stack:
Programming: Core Java, Spring Boot
Databases: RDBMS, NoSQL
Web Technologies: Frontend and backend web development
Big Data: Expertise in big data technologies for large-scale processing
With a deep understanding of financial markets and high-speed transactions, I specialize in delivering mission-critical solutions that drive efficiency and reliability.
Skills: Apache NiFi, Enterprise Software Development, Software Architecture, API Development, Integration Framework, Web Development, Desktop Application, Product Development, Web Application, Trading Strategy, Telecommunications, Stock Market, ETL, Spring Boot, Java
- $20 hourly
- 0.0/5
- (0 jobs)
I'm an AWS and Snowflake Certified Data Engineer with strong experience in building scalable data pipelines, designing databases, and modeling data for analytics and reporting. I work with tools like Snowflake, dbt, Looker, SQL, PostgreSQL, AWS Glue, and Redshift to create efficient data workflows. I specialize in data integration, ETL development, and designing clean, reliable data architectures that support business needs. Whether you need help setting up a modern data stack, optimizing existing processes, or modeling data for dashboards and reports, I can deliver fast, high-quality results.
Skills: Apache NiFi, Data Engineering, Data Warehousing, Data Extraction, PySpark, Data Migration, SQL, ETL, PostgreSQL, Apache Airflow, Amazon Redshift, AWS Glue, Looker, dbt, Snowflake
- $10 hourly
- 0.0/5
- (0 jobs)
I am a Data Engineer with a passion for developing data-driven solutions. With 4 years of experience in Big Data and Cloud, I specialize in data pipeline development, real-time data processing frameworks, and database migration to the cloud. Throughout my career, I've had the opportunity to take a central role in developing entire data ingestion and processing frameworks from the ground up. I'm skilled in GCP, Azure, Apache Beam, Apache Airflow, PySpark, and SQL. My proudest accomplishments include developing real-time data ingestion pipelines for an OMS application and migrating and redesigning a complex medallion architecture from Azure to GCP.
Skills: Apache NiFi, Apache Spark, Python, SQL, Apache Beam, Apache Airflow, Databricks Platform, Microsoft Azure, Google Cloud Platform Administration, Google Cloud Platform, ETL
- $20 hourly
- 0.0/5
- (0 jobs)
I'm a results-driven Full Stack Software Developer with deep experience in designing, developing, and deploying scalable applications across the entire tech stack. I specialize in backend development with Golang, Java, and Python, and deliver robust frontends using Angular, D3.js, HTML, CSS, Bootstrap, and JavaScript/TypeScript. My expertise extends into cloud platforms (primarily AWS), where I work extensively with services like S3, Athena, and Glue integrations, and manage complex ETL pipelines using Apache Airflow, Apache NiFi, and PySpark. I'm also well-versed in Snowflake for modern data warehousing and analytics. I bring a strong understanding of distributed systems, RESTful APIs, data processing, and cloud-native architectures, and I am passionate about building end-to-end solutions that are performant, maintainable, and business-aligned.
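To make the Airflow side of the ETL tooling mentioned above concrete, here is a minimal, hedged sketch of an Airflow 2.x DAG with two placeholder tasks (extract, then load). The DAG id, schedule, and task bodies are illustrative assumptions rather than a description of any specific pipeline.

```python
# Minimal sketch (assumptions noted above): a two-task Airflow DAG where
# "extract" runs before "load"; both tasks are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting source data (placeholder)")


def load():
    print("loading into the warehouse (placeholder)")


with DAG(
    dag_id="etl_sketch",             # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```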
Skills: Apache NiFi, Python, Docker, D3.js, Bootstrap, CSS, HTML, Apache Airflow, AWS Lambda, PySpark, AWS Glue, PostgreSQL, Angular, Java, Golang
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.