Hire the best Apache NiFi developers
Check out Apache NiFi developers with the skills you need for your next job.
- $95 hourly
- 5.0/5
- (24 jobs)
=> Let's Connect
Hello, I'm Dima, a seasoned cybersecurity specialist and turnkey infrastructure expert specializing in big data solutions and data analysis using a DevOps approach.
=> Expertise Overview
With a strong passion for building SOC, SOAR, and SIEM solutions, my primary focus is developing data ingestion, enrichment, and analysis pipelines that are highly available and fault-tolerant. My expertise extends to building central logging and real-time processing platforms from the ground up, optimizing them for performance, security, and reliability across multiple environments, whether in the cloud or on-premises.
=> Value Proposition
My commitment is to deliver solutions that not only centralize security and threat intelligence but also give you greater control over your data, ultimately contributing to infrastructure cost savings.
=> Technological Summary
- CyberSecurity: Wazuh, Suricata, pfSense
- BigData: Kafka, ElasticSearch, OpenSearch
- Data Processing: FluentD, Vector.dev, Apache NiFi
- Infra as Code: Terraform, cdktf, cdk8s
- Virtualization: Proxmox, VMware
- Containerization: Kubernetes
- Clouds: AWS, Hetzner, DigitalOcean, Linode
- Automation: Jenkins, GitHub Actions
- Monitoring: Zabbix, Grafana, Kibana, Prometheus, Thanos
- Mail: MailCow SMTP/IMAP, Postfix
- VPN: OpenVPN Server
- Programming: Bash, Python, TypeScript
- Operating Systems: CentOS, RHEL, Rocky Linux, Ubuntu, Debian
=> Personal Attributes
• Leadership: Leading by example with a team-first approach
• End-to-End Execution: Proficient from POC to enterprise-level implementation
• Resilience: Demonstrating high thoroughness and endurance
• Adaptability: A quick, can-do architect and experienced troubleshooter
• Optimization: Adept in process and performance optimization
• Documentation: Skilled technical documentation writer
• Vision: A visionary in technological implementation and solution provision
Skills: Elasticsearch, Linux System Administration, Apache Kafka, Apache Hadoop, Email Security, Machine Learning, ELK Stack, Cloudera, Zabbix, MySQL, Big Data, Apache NiFi, pfSense, Red Hat Administration, Proxmox VE, Amazon Web Services
- $35 hourly
- 5.0/5
- (31 jobs)
Seasoned data engineer with over 11 years of experience building sophisticated, reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years. 🏆 The main areas of expertise are:
📍 Big data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, Trino, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR
📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL
📍 Google Cloud Services - GCP Dataproc
📍 Search Engine - Apache Solr
📍 NoSQL - HBase, Cassandra, MongoDB
📍 Platform - Data Warehousing, Data Lake
📍 Visualization - Power BI
📍 Distributions - Cloudera
📍 DevOps - Jenkins
📍 Accelerators - Data Quality, Data Curation, Data Catalog
Skills: SQL, AWS Glue, PySpark, Apache Cassandra, ETL Pipeline, Apache Hive, Apache NiFi, Apache Kafka, Big Data, Apache Hadoop, Scala, Apache Spark
- $175 hourly
- 5.0/5
- (4 jobs)
Mr. Joshua B. Seagroves is a seasoned professional who has served as an Enterprise Architect/Senior Data Engineer for multiple Fortune 100 companies. With a successful track record as a startup founder and CTO, Mr. Seagroves brings a wealth of experience to his role, specializing in the strategic design, development, and implementation of advanced technology systems.
Throughout his career, Mr. Seagroves has demonstrated expertise in architecting and delivering cutting-edge solutions, particularly in data engineering and the data sciences, and has spearheaded the implementation of such systems and applications for a diverse range of clients. As part of his current responsibilities, he contributes to prototyping and research in data engineering and data science, specifically the development of operational systems for critical mission systems. Leveraging his extensive background in architecture and software modeling methodologies, he has consistently led and collaborated with multidisciplinary teams, successfully integrating various distributed computing technologies, including Hadoop, NiFi, HBase, Accumulo, and MongoDB.
Mr. Seagroves' professional achievements and extensive experience make him a highly sought-after expert in his field. His comprehensive knowledge and hands-on expertise in advanced technology systems and big data make him a valuable asset to any organization.
Skills: YARN, Apache Hadoop, Big Data, Apache Zookeeper, TensorFlow, Apache Spark, Apache NiFi, Apache Kafka, Artificial Neural Network, Artificial Intelligence
- $50 hourly
- 5.0/5
- (8 jobs)
"She is very good in coding. She is the best and the go-to person for any Hadoop or NiFi requirements."
"Abha is a star; she handled the project in a very professional manner. I will definitely be working with Abha again; I am very happy with the quality of the work. 🙏"
"Abha Kabra is one of the most talented programmers I have ever met on Upwork. Her communication was top-notch, she met all deadlines, a skilled developer and super fast on any task given to her. Perfect work. Would re-hire and highly recommend!!"
Highly skilled and experienced Big Data engineer with over 6 years in the field, with a strong background in analysis, data migration, design, and development of Big Data and Hadoop-based projects using technologies such as:
✅ Apache Spark with Scala & Python
✅ Apache NiFi
✅ Apache Kafka
✅ Apache Airflow
✅ ElasticSearch
✅ Logstash
✅ Kibana
✅ MongoDB
✅ Grafana
✅ Azure Data Factory
✅ Azure Pipelines
✅ Azure Databricks
✅ AWS EMR
✅ AWS S3
✅ AWS Glue
✅ AWS Lambda
✅ GCP Cloud Functions
✅ PostgreSQL
✅ MySQL
✅ Oracle
✅ Ansible
✅ Terraform
✅ Logo/Book Cover Design
✅ Technical Blog Writing
A proven track record of delivering high-quality work that meets or exceeds client expectations. Deep understanding of energy-related data, IoT devices, the hospitality industry, the retail market, ad-tech, and data-encryption projects, with a wide range of clients including Marriott, P&G, Vodafone UK, and eXate UK. Able to quickly understand client requirements and develop tailored solutions that address their unique needs. Very communicative and responsive, keeping clients informed every step of the way. A quick learner, always eager to explore new technologies and techniques to better serve clients. Familiar with Agile methodology, with active participation in daily Scrum, sprint, and retrospective meetings, and experienced in all phases of the project life cycle.
A strong team player and leader with good interpersonal and communication skills, ready to take on independent challenges.
Skills: Apache NiFi, PySpark, Databricks Platform, ETL Pipeline, Big Data, Grafana, Kibana, Apache Kafka, Apache Spark, PostgreSQL, Microsoft Azure, MongoDB, Scala, Python, Elasticsearch, Google Cloud Platform, Amazon Web Services
- $85 hourly
- 5.0/5
- (73 jobs)
FHIR, HL7v2, HL7v3, C-CDA, Carequality HIE, Mirth Connect, Apache NiFi, MU, EDI, EHR/EMR Functional Model. Back-end: JavaScript, Java, SQL, SmileCDR, Aidbox.
• Certified HL7 FHIR R4 Proficiency
• Certified HL7v2 Control Specialist
• Certified HL7 CDA Specialist
• Certified HL7v3 RIM Specialist
• IHE Certified Professional - Foundations
• SNOMED CT Terminology Services certified
--- FHIR ---
* Mirth Connect based FHIR integration, FHIR Server on Mirth (HAPI FHIR library).
* FHIR (RESTful, event-based messaging and documents paradigms, profiling with Forge).
* HL7v2 to/from FHIR mapping (e.g., ADT, ORU, OML message types).
* C-CDA Level 1, Level 3 to/from FHIR mapping.
* FHIR tools (Touchstone, Simplifier, Smile CDR, Forge).
* Canadian Core/Baseline FHIR profiles editor.
* IG publishing (IG Publisher, FSH - FHIR Shorthand, SUSHI).
* Apache NiFi custom FHIR processors.
--- CMS Compliance ---
* US Core profiles / IPS profiles / CA Baseline profiles
* CARIN Blue Button / CMS Blue Button 2.0
* Da Vinci PDEX Plan Net
* Da Vinci PDEX US Drug Formulary
* Da Vinci Payer Data Exchange (ePDx)
--- HL7 ---
* Mirth Connect based HL7v2/HL7v3 integration.
* Apache NiFi custom HL7v2 processors.
* HL7v2 conformance profiles (documentation quality level 3).
* Refined and constrained versions of HL7v3 interactions based on Visio models.
* HL7v3 messaging (Batch wrapper, Query Infrastructure; Claims, Lab, Patient Administration, Personnel Management domains).
* Conformance testing of HL7v2.x/HL7v3 interaction implementations.
* Development of HL7v2.x and HL7v3 specifications and Implementation Guides using Messaging Workbench, RMIM Designer, V3Generator.
* Canadian HIAL, OLIS, HRM interfaces.
--- C-CDA (Consolidated CDA) ---
* CDA parsing library for Mirth Connect.
* Document templates (e.g., CCD, NHSN CDA).
* Mirth Connect based C-CDA document template implementation and transformation.
* Development of CDA template specifications (Level 2, Level 3 section and entry templates).
* CDA document template modeling (MDHT Modeling or ART-DECOR).
* Conformance testing of C-CDA documents.
--- EHR / EMR / PHR ---
* Software development of EMR solutions (using Mirth Connect, Java, JavaScript, XML Schema, XSLT, Schematron).
* HL7 EHR System Functional Model and Profiles (e.g., Meaningful Use Functional Profile for ONC/NIST Test Procedures, HL7 PHR System FM).
--- IHE ITI Profiles ---
* Carequality HIE (XCPD, XCA)
* OpenHIE
* IHE profile specifications and development: XDS, XDS.b, XDS-I.b, XCA, XCPD, MPQ, DSUB, XDM.
* IHE HL7v3 profiles: PIXv3, PDQv3.
* IHE FHIR profiles: MHD, PIXm, NPFSm, PDQm, mRFD.
* Audit and security domains: ATNA, BPPC, IUA, XUA.
Experience with: SmileCDR, Carequality, Quest, LabCorp, AllScripts, eClinicalWorks (eCW), CRISP, MUSE, OpenHIE, etc.
Skills: API Development, Electronic Data Interchange, FHIR, Java, Mirth Connect, Health Level 7, Apache NiFi, Electronic Medical Record, HIPAA, ECMAScript for XML, XSLT, XML, API Integration, JavaScript
- $18 hourly
- 4.7/5
- (18 jobs)
Who am I: More than 20 years of experience and universal knowledge give me the ability to combine a wide range of technologies, build complicated heterogeneous systems, and automate business processes. For the last 5 years I have worked with Alfresco Content Services. My greatest value is universality: I deeply understand technology, from electrons crossing the Fermi level in semiconductors to business process automation in the organizational structure of large companies. That's why I can work out almost any integration solution.
What we can do: We can deploy Alfresco Content Services in test, development, and production environments; upgrade and migrate it from previous versions; and create backup and disaster recovery plans. We can integrate it into the user environment, synchronise users with a centralised authentication management system, set up SSO login, and choose document access, editing, and OCR technologies, etc. We can integrate Alfresco into your corporate ecosystem, applications, API gateways, and databases. We can create custom data models, add document classifications and additional metadata, and build business process automation for document management. We can create a production environment for any application with Docker/Kubernetes, and a development environment with version control and automated CI/CD pipelines using, for example, on-premises GitLab or GCP.
Short list of base technologies:
- Docker, Docker Compose, Kubernetes...
- Linux, Debian, Bash, Python...
- Git, GitLab, CI/CD, DevOps...
- Nginx, proxy, DNS, SMTP...
- SSL/TLS, Kerberos, SSO, SAML...
- Java, JavaScript, SQL, PostgreSQL, CMIS...
- Apache, Tomcat, NiFi, Elasticsearch/Kibana, WSO2, QGIS...
- Google Cloud Platform, any cloud and hosting solutions...
Skills: Docker, Kubernetes, JavaScript, Docker Compose, Elasticsearch, Apache Tomcat, Alfresco Content Services, Kerberos, SSL, Java, Apache NiFi, Linux System Administration, Linux, Google Cloud Platform
- $225 hourly
- 5.0/5
- (10 jobs)
As a skilled data engineer, developer, and biomedical engineer, I have spent the last several years honing my expertise to offer top-tier services to clients. My specialty is data engineering, including data warehousing, ETL optimization, data modeling, data governance, and data visualization. I also have a strong background in machine learning and MLOps. This has allowed me to develop a command of a wide range of technologies, including Hadoop, Databricks, Docker, Terraform, Apache Spark, Collibra, big data, business intelligence tools, cloud computing (AWS, GCP, and Azure), TensorFlow, Keras, and Kubernetes. My preferred languages for data projects are Python, R, SQL, Rust, and Java. I'm also proficient in several other programming languages for full-stack development, including JavaScript, TypeScript, HTML, and CSS, among others. I've included examples of recent projects in my portfolio to showcase my capabilities.
Recently, I completed two competitive fellowships: the DS4 Data Engineering Fellowship and the Aspen Tech Hub Tech Policy Fellowship. The Aspen Tech Hub Tech Policy Fellowship focused on training expert tech leaders in public policy and regulations in San Francisco. During this fellowship, I learned about outputs like memos, briefs, whitepapers, and action plans, and presented real technical solutions to government and institutional stakeholders. During the DS4 Data Engineering Fellowship, I worked with over 32 million collective data records from the Centers for Medicare & Medicaid Services Open Payments, Merit-Based Incentive Payment System (MIPS) Final Scores, and World Health Organization Global Healthcare Expenditures datasets. This experience led to my team being awarded the distinguished project award and receiving an honors certificate. To optimize data storage and processing efficiency, I implemented a strategic approach by converting raw files into Parquet format, a columnar storage file format.
By leveraging the Snappy compression algorithm, I reduced the memory needed to store the data by more than 83%. This transformation not only streamlined the data retrieval process but also enhanced the overall performance of the data processing pipelines. My role involved developing sophisticated data models, designing efficient pipelines, and leveraging various tools such as PySpark, PyArrow, Pandas, Python, Airflow, and an AWS PostgreSQL instance.
As a medical device specialist, I have a wealth of experience in regulatory affairs and quality control, as well as all areas of the product life cycle. I have worked on Class 1, Class 2, and Class 3 devices in several areas, including software applications, dermatological, diagnostic biotech, diagnostic radiology, orthopedic, cardiac, urology, and physical therapy devices. I am skilled in completing regulatory documents such as design history files, risk assessments, 510(k) submissions, DIOVV, quality plans, and verification/validation testing, and have included examples of these documents and public policy papers I have written in my portfolio.
I offer services in data engineering, machine learning, medical devices, and overall software development, and I am confident that my skills and experience can deliver results that exceed your expectations. Please feel free to contact me to discuss your project and how I can help you achieve your goals. Let's connect and make your project a success.
Skills: Dashboard, Data Visualization, Data Mining, ETL, API Development, Compliance Consultation, Rust, HIPAA, Solidity, Medical Device, Blockchain, Data Engineering, Machine Learning, Data Science, SQL
- $35 hourly
- 5.0/5
- (7 jobs)
I am an expert data engineer with over 5 years of experience in data ingestion, integration, and manipulation. To date, I have completed many projects in data engineering and big data. I have worked on business analytics and telco analytics, using multiple data platforms and frameworks such as Cloudera Data Platform, NiFi, RStudio, Spark, Hadoop, and Kafka. If this is what you need, get in touch with me.
Skills: Cloud Engineering, Cloudera, Apache Hadoop, Data Warehousing, Apache NiFi, Linux, Apache Spark, Data Lake, Data Analysis, SQL, Big Data, Business Intelligence, Scala, Apache Hive, Python
- $70 hourly
- 5.0/5
- (42 jobs)
🎓 𝗖𝗲𝗿𝘁𝗶𝗳𝗶𝗲𝗱 𝗗𝗮𝘁𝗮 𝗣𝗿𝗼𝗳𝗲𝘀𝘀𝗶𝗼𝗻𝗮𝗹 with 𝟲+ 𝘆𝗲𝗮𝗿𝘀 of experience and hands-on expertise in Designing and Implementing Data Solutions.
🔥 4+ Startup Tech Partnerships
⭐️ 100% Job Success Score
🏆 In the top 3% of all Upwork freelancers with Top Rated Plus 🏆
✅ Excellent communication skills and fluent English
If you're reading my profile, you've got a challenge you need to solve and you are looking for someone with a broad skill set, minimal oversight, and an ownership mentality. I'm your go-to expert.
📞 Connect with me today and let's discuss how we can turn your ideas into reality with a creative and strategic partnership. 📞
⚡️ Invite me to your job on Upwork to schedule a complimentary consultation call to discuss in detail the value and strength I can bring to your business, and how we can create a tailored solution for your exact needs.
𝙄 𝙝𝙖𝙫𝙚 𝙚𝙭𝙥𝙚𝙧𝙞𝙚𝙣𝙘𝙚 𝙞𝙣 𝙩𝙝𝙚 𝙛𝙤𝙡𝙡𝙤𝙬𝙞𝙣𝙜 𝙖𝙧𝙚𝙖𝙨, 𝙩𝙤𝙤𝙡𝙨 𝙖𝙣𝙙 𝙩𝙚𝙘𝙝𝙣𝙤𝙡𝙤𝙜𝙞𝙚𝙨:
► BIG DATA & DATA ENGINEERING: Apache Spark, Hadoop, MapReduce, YARN, Pig, Hive, Kudu, HBase, Impala, Delta Lake, Oozie, NiFi, Kafka, Airflow, Kylin, Druid, Flink, Presto, Drill, Phoenix, Ambari, Ranger, Cloudera Manager, Zookeeper, Spark Streaming, StreamSets, Snowflake
► CLOUD: AWS -- EC2, S3, RDS, EMR, Redshift, Lambda, VPC, DynamoDB, Athena, Kinesis, Glue; GCP -- BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Data Fusion; Azure -- Data Factory, Synapse, HDInsight
► ANALYTICS, BI & DATA VISUALIZATION: Tableau, Power BI, SSAS, SSMS, Superset, Grafana, Looker
► DATABASE: SQL, NoSQL, Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, PL/SQL, HBase, Cassandra
► OTHER SKILLS & TOOLS: Docker, Kubernetes, Ansible, Pentaho, Python, Scala, Java, C, C++, C#
𝙒𝙝𝙚𝙣 𝙮𝙤𝙪 𝙝𝙞𝙧𝙚 𝙢𝙚, 𝙮𝙤𝙪 𝙘𝙖𝙣 𝙚𝙭𝙥𝙚𝙘𝙩:
🔸 Outstanding results and service
🔸 High-quality output on time, every time
🔸 Strong communication
🔸 Regular & ongoing updates
Your complete satisfaction is what I aim for, so the job is not complete until you are satisfied!
Whether you are a 𝗦𝘁𝗮𝗿𝘁𝘂𝗽, 𝗘𝘀𝘁𝗮𝗯𝗹𝗶𝘀𝗵𝗲𝗱 𝗕𝘂𝘀𝗶𝗻𝗲𝘀𝘀 𝗼𝗿 𝗹𝗼𝗼𝗸𝗶𝗻𝗴 𝗳𝗼𝗿 your next 𝗠𝗩𝗣, you will get 𝗛𝗶𝗴𝗵-𝗤𝘂𝗮𝗹𝗶𝘁𝘆 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀 at an 𝗔𝗳𝗳𝗼𝗿𝗱𝗮𝗯𝗹𝗲 𝗖𝗼𝘀𝘁, 𝗚𝘂𝗮𝗿𝗮𝗻𝘁𝗲𝗲𝗱. I hope you become one of my many happy clients. Reach out by inviting me to your project. I look forward to it! All the best, Anas
⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad is really great with AWS services and knows how to optimize each so that it runs at peak performance while also minimizing costs. Highly recommended! ❞
⭐️⭐️⭐️⭐️⭐️ 🗣❝ You would be silly not to hire Anas, he is fantastic at data visualizations and data transformation. ❞
🗣❝ Incredibly talented data architect, the results thus far have exceeded our expectations and we will continue to use Anas for our data projects. ❞
⭐️⭐️⭐️⭐️⭐️ 🗣❝ The skills and expertise of Anas exceeded my expectations. The job was delivered ahead of schedule. He was enthusiastic and professional and went the extra mile to make sure the job was completed to our liking with the tech that we were already using. I enjoyed working with him and will be reaching out for any additional help in the future. I would definitely recommend Anas as an expert resource. ❞
⭐️⭐️⭐️⭐️⭐️ 🗣❝ Muhammad was a great resource and did more than expected! I loved his communication skills and always kept me up to date. I would definitely rehire again. ❞
⭐️⭐️⭐️⭐️⭐️ 🗣❝ Anas is simply the best person I have ever come across. Apart from being an exceptional tech genius, he is a man of utmost stature. We blasted off with our startup, high on dreams and code. We were mere steps from the MVP. Then, pandemic crash. Team bailed, funding dried up. My partner and I were stranded and dread gnawed at us. A hefty chunk of cash, Anas and his team's livelihood, hung in the balance. It felt like a betrayal. We scheduled a meeting with Anas to let him know we were quitting and to ask to repay him gradually over a year; he heard us out. Then, something magical happened. A smile. "Forget it," he said, not a flicker of doubt in his voice. "The project matters. Let's make it happen!" We were floored. This guy, owed a small fortune, just waved it away? Not only that, he offered to keep building, and even pulled his team in to replace our vanished crew. As he spoke, his passion was a spark that reignited us. He believed. In us. In our dream. In what he had developed so far. That's the day Anas became our partner. Not just a contractor, but a brother in arms. Our success story owes its spark not to our own leap of faith, but to the guy who had every reason to walk away. Thanks, Anas, for believing when we couldn't. ❞
Skills: Solution Architecture Consultation, AWS Lambda, ETL Pipeline, Data Management, Data Warehousing, AWS Glue, Apache Spark, Amazon Redshift, ETL, Python, SQL, Marketing Analytics, Big Data, Data Visualization, Artificial Intelligence
- $75 hourly
- 5.0/5
- (49 jobs)
As a freelancer and data engineer, I concentrate on database and ETL projects, query performance, and database structure optimization. I have worked with many types of databases for more than 15 years, primarily PostgreSQL, MySQL, MS SQL, and Redshift, and on projects with many others, such as Snowflake, Cloudera, DB2, and Oracle. Large portfolio of ETL projects with Talend Data Integration and NiFi.
Certified Talend Developer (Data Integration, Big Data)
Microsoft Certified Professional
I continuously extend my expertise and knowledge, and am open to new challenges.
Skills: Snowflake, SQL Programming, Amazon Redshift, Data Analysis, Apache NiFi, Microsoft SQL Server Programming, SQL, Database Design, Data Migration, PostgreSQL, Database Administration, ETL, MySQL, Talend Open Studio
- $72 hourly
- 3.8/5
- (14 jobs)
Hands-on expertise and project-based experience with the following skills:
1) Java, Go, and Python based cloud (AWS/GCP/Azure) API and services development
2) Databases: PostgreSQL, MariaDB, MongoDB, Aerospike, Couchbase, ScyllaDB, Redis
3) REST/GraphQL: Go (gorilla, gin-gonic, graphql-go), Python (Flask/Django), Java (Spring Boot)
4) Cloud: AWS, GCP, Azure
- $100 hourly
- 5.0/5
- (138 jobs)
— TOP RATED PLUS Freelancer on UPWORK
— EXPERT VETTED Freelancer (Among the Top 1% of Upwork Freelancers)
— Full Stack Engineer
— Data Engineer
✅ AWS Infrastructure, DevOps, AWS Architect, AWS Services (EC2, ECS, Fargate, S3, Lambda, DynamoDB, RDS, Elastic Beanstalk, AWS CDK, AWS CloudFormation, etc.), serverless application development, AWS Glue, AWS EMR
Frontend Development:
✅ HTML, CSS, Bootstrap, JavaScript, React, Angular
Backend Development:
✅ Java, Spring Boot, Hibernate, JPA, Microservices, Express.js, Node.js
Content Management:
✅ WordPress, Wix, Squarespace
Big Data:
✅ Apache Spark, ETL, Big Data, MapReduce, Scala, HDFS, Hive, Apache NiFi
Database:
✅ MySQL, Oracle, SQL Server, DynamoDB
Build/Deploy:
✅ Maven, Gradle, Git, SVN, Jenkins, QuickBuild, Ansible, AWS CodePipeline, CircleCI
As a highly skilled and experienced lead software engineer, I bring a wealth of knowledge and expertise in Java, Spring, Spring Boot, Big Data, MapReduce, Spark, React, graphic design, logo design, email signatures, flyers, web development (HTML, CSS, Bootstrap, JavaScript & frameworks, PHP, Laravel), responsive web page development, WordPress design, and testing. With over 11 years of experience in the field, I have a deep understanding of Java, Spring Boot, and microservices, as well as Java EE technologies such as JSP, JSF, Servlet, EJB, JMS, JDBC, and JPA. I am also well-versed in Spring technologies including MVC, IoC, Security, Boot, Data, and transactions. I possess expertise in web services, including REST and SOAP, and am proficient in various web development frameworks such as WordPress, PHP, Laravel, and CodeIgniter. Additionally, I am highly skilled in JavaScript, jQuery, ReactJS, AngularJS, Vue.js, Node.js, C#, and ASP.NET MVC. In the field of big data, I have experience working with MapReduce, Spark, Scala, HDFS, Hive, and Apache NiFi. I am also well-versed in cloud technologies such as PCF, Azure, and Docker.
Furthermore, I am proficient in various databases including MySQL, SQL Server, and Oracle, and familiar with build tools such as Maven, Gradle, Git, SVN, Jenkins, QuickBuild, and Ansible.
Skills: Apache Spark, Database, WordPress, Cloud Computing, Spring Framework, Data Engineering, NoSQL Database, React, Serverless Stack, Solution Architecture Consultation, Spring Boot, DevOps, Microservice, AWS Fargate, AWS CloudFormation, Java, CI/CD, Amazon ECS, Containerization
- $25 hourly
- 5.0/5
- (19 jobs)
Certification in Big Data/Hadoop Ecosystem
Big Data Environments: Google Cloud Platform, Cloudera, Hortonworks, AWS, Snowflake, Databricks, DC/OS
Big Data Tools: Apache Hadoop, Apache Spark, Apache Kafka, Apache NiFi, Apache Cassandra, YARN/Mesos, Oozie, Sqoop, Airflow, Glue, Athena, S3 Buckets, Lambda, Redshift, DynamoDB, Delta Lake, Docker, Git, Bash scripts, Jenkins, Postgres, MongoDB, Elasticsearch, Kibana, Ignite, TiDB
Certifications in SQL Server, Database Development, and Crystal Reports
SQL Server Tools: SQL Management Studio, BIDS, SSIS, SSAS, and SSRS
BI/Dashboarding Tools: Power BI, Tableau, Kibana
Big Data Development Programming Languages: Scala and Python
************ Big Data Engineer ************
- Hands-on experience with Google Cloud Platform, BigQuery, Google Data Studio, and Flow.
- Developed ETL pipelines for SQL Server using SSIS; reporting and analysis using SSIS, SSRS, and SSAS cubes.
- Extensive experience with Big Data frameworks and open-source technologies (Apache NiFi, Kafka, Spark, Cassandra, HDFS, Hive, Docker, Postgres, Git, Bash scripts, Jenkins, MongoDB, Elasticsearch, Ignite, TiDB).
- Managed data warehouse Big Data cluster services and development of data flows.
- Wrote Big Data/Spark ETL applications for different sources (SQL, Oracle, CSV, XML, JSON) to support multiple departments' analytics.
- Extensive work with Hive, Hadoop, Spark, Docker, and Apache NiFi, supporting different departments' big data analytics.
- Built multiple end-to-end fraud-monitoring alert systems.
- Preferred languages are Scala and Python.
************ Big Data Engineer – Fraud Management at VEON ************
- Developed an ETL pipeline from Kafka to Cassandra using Spark in Scala.
- Used Big Data tools with Hortonworks and AWS (Apache NiFi, Kafka, Spark, Cassandra, Elasticsearch).
- Dashboard development with Tableau and Kibana.
- Wrote complex SQL Server queries, procedures, and functions.
- Developed ETL pipelines for SQL Server using SSIS; reporting and analysis using SSIS, SSRS, and SSAS cubes.
- Developed and designed automated email reports.
- Offline data analytics for fraud detection and setting up prevention controls.
- SQL database development and system support of fraud management.
Skills: Google Cloud Platform, SQL Programming, Data Warehousing, Database, AWS Glue, PySpark, MongoDB, Python Script, Docker, Apache Hadoop, Apache Spark, Databricks Platform, Apache Kafka, Apache Hive
- $35 hourly
- 5.0/5
- (28 jobs)
I'm a senior software engineer with very good knowledge of Java and business process management. The following list shows my knowledge areas:
1- Business Process Management (BPM): Camunda, Activiti, jBPM, Bonita
2- Business Process Model and Notation (BPMN)
3- Process modeling and workflow design
4- Software engineering: OOP, design patterns, Agile, Scrum
5- Java stack: Java, Java EE, JDBC, JPA, Hibernate; Spring Framework including the IoC container, MVC, AOP, Security, Data, REST, Spring Boot, and JdbcTemplate
6- Databases: Oracle (SQL, PL/SQL), SQL Server, MySQL
Skills: Business Process Modeling, jBPM, Business Process Management, Project Workflows, Process Modeling, Business Process Model & Notation, Bonita, Spring Boot, SQL, Hibernate
- $85 hourly
- 4.6/5
- (15 jobs)
- 17 years of experience in data science, data warehousing, business intelligence, advanced analytics, ETL/ELT, data visualization, virtualization, database programming, and data engineering.
- Experience in machine learning, especially customer 360, linear regression, and decision trees.
- Specialized in end-to-end Business Intelligence and analytics implementations.
- ER (Entity Relationship) modeling for OLTP and dimensional modeling (conceptual, logical, physical) for OLAP.
- Experience running startup companies and building SaaS products, including CRM (Customer Relationship Management) and low-code data orchestration tools.
- Experience working in Agile Scrum methodologies (2- and 3-week sprints).
- Excellent communication skills and a good understanding of business and client requirements.
- Good at technical documentation and POCs (Proofs of Concept).
- Good at discussions with stakeholders for requirements and demos.
- Convert business requirements into technical design documents with pseudo-code.
- Dedicated; work with minimal supervision.
- Eager to learn new technologies; can explore, learn, and develop quickly on client-owned applications.
- Expert in SQL, T-SQL, and PL/SQL, including advanced functions and features; good at database programming.
- Good at performance tuning, clustering, indexing, partitioning, and other DBA activities.
- DBA activities such as database backup/recovery, monitoring database health, killing long-running queries, and suggesting better tuning options.
- Good at database programming and normalization techniques (all 3 normal forms).
- Expert in Azure Synapse, PostgreSQL, MongoDB, DynamoDB, Google Data Studio, Tableau, Sisense, SSRS, SSIS, and more.
- Domain knowledge in telecom, finance/banking, automobile, insurance, telemedicine, healthcare, and virtual clinical trials (CT).
- Extensive DBA knowledge and work experience in SQL Server: login management, database backup and restore, monitoring database loads, and tuning methods.
- Exceptionally strong in Azure ML and regression models.
Expertise:
Databases: Snowflake, Oracle, SQL Server, Azure SQL Database, MS Access, Azure Synapse Analytics, Teradata, MySQL, NoSQL, PostgreSQL, and MongoDB
ETL: Azure Data Factory, dbt, SSIS, AWS Glue, Matillion CDC & ETL, Google BigQuery, Informatica PowerCenter and Cloud, ODI, DataStage, MSBI (SSIS, SSAS)
Reporting/Visualization: Sisense, Qlik Sense, Sigma Computing, Metabase, QlikView, SSRS, Domo, Looker, Tableau, Google Data Studio, Amazon QuickSight, and Power BI
Scripting Languages: Unix, Python, VBA, and R
Cloud Services: Google Cloud Platform (BigQuery, Cloud Functions, Data Studio); MS Azure (Azure Blob Storage, Azure Function Apps, Logic Apps, Azure Data Lakehouse, Databricks, Purview, ADF, and microservices); Azure ML; AWS (RDS, EC2, S3, Amazon Redshift, Step Functions, Data Pipelines)
Data Virtualization: Denodo
Skills: C#, Snowflake, ETL, Data Warehousing, Business Intelligence, Data Visualization, Azure Machine Learning, Qlik Sense, Looker, Sisense, Microsoft Power BI, SQL, Tableau
- $90 hourly
- 5.0/5
- (8 jobs)
Accomplished data engineer with multiple years of experience designing, implementing, and optimizing data analytics solutions for cloud-based big data environments. Proficient in building robust data pipelines, ETL/ELT processes, data warehousing, and business intelligence platforms. Expertise in AWS services such as Glue, Athena, EMR, and Kinesis, as well as Snowflake's cloud data platform. Strong background in Python, SQL, Apache Spark, and Kafka. AWS certifications in Data Engineering, Big Data, and Data Analytics. Snowflake SnowPro Core certified.
Expert Data Engineer | AWS Certified Data Analytics Specialist | Snowflake SnowPro Core Certified | Big Data Architecture | Cloud Data Warehousing | Data Lakehouse | Data Lake | ETL/ELT Pipelines | Data Modeling | Business Intelligence | Machine Learning | Python | SQL | Apache Spark | Kafka | Kinesis | AWS Glue | AWS Athena | AWS EMR | Data Quality | Data Security | Apache Iceberg
Skills: pandas, Apache Airflow, Solution Architecture, Data Modeling, AWS Glue, ETL, Apache Kafka, Amazon Athena, Data Management, Python
- $35 hourly
- 5.0/5
- (11 jobs)
I have 18+ years of experience in software development in the Telecom, Banking, and Healthcare domains. Primary skillsets include Big Data ecosystems (Apache Spark, Hive, MapReduce, Cassandra), Scala, Core Java, Python, and C++. I am well versed in designing and implementing Big Data solutions, ETL and data pipelines, and serverless and event-driven architectures on Google Cloud Platform (GCP) and Cloudera Hadoop 5.5. I like to work with organizations to develop sustainable, scalable, and modern data-oriented software systems.
- Keen eye for the scalability and sustainability of a solution
- Can come up with maintainable, good object-oriented designs quickly
- Highly experienced in working effectively with remote teams
- Aptitude for recognizing business requirements and solving the root cause of a problem
- Can quickly learn new technologies
Sound experience in the following technology stacks:
Big Data: Apache Spark, Spark Streaming, HDFS, Hadoop MR, Hive, Apache Kafka, Cassandra, Google Cloud Platform (Dataproc, Cloud Storage, Cloud Functions, Datastore, Pub/Sub), Cloudera Hadoop 5.x
Languages: Scala, Python, Java, C++, C
Build Tools: sbt, Maven
Databases: Postgres, Oracle
Worked with different input and storage formats: CSV, XML, JSON, MongoDB, Parquet, ORC
Skills: C++, Java, Apache Spark, Scala, Apache Hadoop, Python, Apache Cassandra, Oracle PLSQL, Apache Hive, Cloudera, Google Cloud Platform
- $50 hourly
- 4.7/5
- (48 jobs)
🗲 Available now 🥇 Top rated ⏲️ 12,000+ Upwork hours ⭐ 100%+ rating ✅ Full-time and long term 🌎 24x7 support 😝 Fun loving
10 years of experience as a DevOps Engineer, with extensive experience across all major cloud providers including AWS, GCP, and Azure.
My blog: roshannagekar.blogspot.com
I have vast experience in the following tools and technologies:
⚙⚙⚙⚙⚙ 🄳🄴🅅🄾🄿🅂 ⚙⚙⚙⚙⚙
★ Configuration Management: Ansible, Chef, Puppet
★ AWS Administration: EC2, RDS, S3, CloudFront
★ CI Tools: Jenkins, Bamboo
★ Deployment: Fabric, Capistrano, Ansible
★ Provisioning: Terraform, CloudFormation
★ Cloud Providers: AWS, GCP, Azure, DigitalOcean
★ Containerization: Docker Compose, Kubernetes
★ Source Code: Git, SVN, GitLab, GitHub
★ Monitoring (Free): Nagios, Zabbix, Prometheus
★ Monitoring (Paid): New Relic, Datadog, Pingdom
★ On-Call Support: VictorOps, PagerDuty
★ High Availability: HAProxy, F5, ELB, Envoy
★ Scripting: Bash, Python, Ruby
★ Documentation: Confluence, TWiki
★ Infra. Testing: Test Kitchen, Apache Benchmark
★ Databases: MySQL, PostgreSQL, Oracle
★ Servers: Apache, Nginx, IIS, vsftpd
★ Operating Systems: Unix, RedHat/Ubuntu, Windows
★ Virtualization: VirtualBox, VMware, Vagrant
★ Build Tools: Makefile, Ant, Maven
★ Ticketing: JIRA, HPQC, Bugzilla
★ QA Tools: Selenium, Sikuli, SoapUI
★ File Transfer: FileZilla, WinSCP, s3cmd
🛡🛡🛡🛡🛡 🄸🄽🄵🄾🅂🄴🄲 🛡🛡🛡🛡🛡
⮕ Kali Linux ⭐⭐⭐⭐
⮕ Metasploit ⭐⭐⭐
⮕ Maltego ⭐⭐⭐
⮕ Burp Suite ⭐⭐⭐
⮕ Fiddler ⭐⭐⭐⭐
⮕ Kismet ⭐⭐⭐
⮕ Wireshark ⭐⭐⭐⭐
⮕ Zaproxy ⭐⭐⭐⭐
⮕ Wazuh ⭐⭐⭐⭐⭐
📅 DevOps Event Organizer: meetup.com/DevOps-Pune/members/?op=leaders
Skills: Software Testing, DevOps, WordPress, MySQL, Customer Service, Unix, Technical Writing, Google App Engine, LAMP Administration, Amazon Web Services
- $65 hourly
- 4.8/5
- (14 jobs)
I am the Co-Founder and CEO of ITcare, as well as a seasoned Network Architect with over a decade of experience in the telecommunications industry. Throughout my career, I have worked with two of the largest ISPs in my country and a Tier 1 ISP as part of the Central Network Support Team (Level 3 Support). My expertise spans network design, configuration, implementation, and network automation.
Key Skills:
• Extensive experience with Juniper routers and switches (MX, EX, SRX)
• Proficient with Cisco IOS routers and switches (2900, 6500, 7200 series) and Cisco IOS-XR routers (ASR9k)
• Skilled in Huawei and Raisecom PON network devices, Foundry/Brocade/EdgeCore/Arista switches, and Ericsson SmartEdge series routers
• Expertise in network management systems like U2000, Zabbix, Cacti, and The Dude
• Proficient in network automation using Ansible and Python scripting
• Advanced knowledge of Linux systems (CentOS, Ubuntu)
• Deep understanding of networking protocols and technologies, including xSTP, PIM, IS-IS, OSPF, BGP, MPLS, LDP/RSVP, and MPLS VPNs (L2/L3/VPLS)
Certifications:
• Juniper Networks Certified Internet Expert - Service Provider (JNCIE-SP #2975)
• Juniper Networks Certified Internet Specialist - Security (JNCIS-SEC)
• Juniper Networks Certified Internet Specialist - DevOps (JNCIS-DevOps)
• Cisco Certified Network Associate - Routing & Switching (CCNA R&S)
• Linux Professional Institute Certification - Level 1 (LPI-1)
I bring a robust combination of technical expertise, leadership, and a commitment to continuous innovation.
My focus is on leading ITcare to deliver efficient, secure, and scalable network solutions that ensure top-tier performance and reliability for our clients.
Skills: Extreme Networks, Junos OS, Cisco Certified Internetwork Expert, Network Analysis, Cisco Certified Network Professional, Python, Cisco Router, OSPF, Network Engineering, Multiprotocol Label Switching, Cisco IOS, Cisco Certified Network Associate, Multiprotocol BGP, Juniper
- $40 hourly
- 4.7/5
- (64 jobs)
Hi, I am Isha Taneja, highly skilled in data analytics, engineering, and cloud computing, based in Mohali, India. I am an expert in creating ETL data flows in Talend Studio, Databricks, and Python, using best design patterns and practices to integrate data from multiple data sources. I have worked on multiple projects involving data migration, data warehousing development, and API integration.
Expertise:
1. Migration
- Platform Migration - Legacy ETL to Modern Data Pipeline / Talend / ERP Migration / CRM Migration
- Data Migration - Salesforce Migration / HubSpot Migration / Cloud Migration / ERP Migration
2. Data Analytics
- Data Lake Consulting
- Data Warehouse Consulting
- Data Modelling / Data Integration / Data Governance / ETL
- Data Strategy
- Data Compliance / Data Deduplication / Data Reconciliation / Customized Data Processing Framework / Data Streaming / API Implementation / DataOps
- Business Intelligence
- Digital Marketing Analysis / E-commerce Analytics / ERP Reporting Capabilities
- Big Data - Lakehouse Implementation
3. Software QA & Testing
4. Custom Application Development
- UI/UX
- Frontend Development
- Backend Development
5. Cloud
- Cloud-native Services / AWS Consulting / Cloud Migration / Azure Consulting / Databricks / Salesforce
6. Business Process Automation
- Bi-directional sync between applications / RPA
A data professional and ETL developer with 10+ years of experience working with enterprises and clients globally to define their implementation approach with the right data platform strategy, data analytics, and business intelligence solutions. My domain expertise lies in e-commerce, healthcare, HR, media & advertising, and digital marketing.
You have the data? Great! I can help you analyze it using Python, covering exploratory data analysis, hypothesis testing, and data visualization.
You have Big Data? Even better! I can help you clean, transform, store, and analyze it using big data technologies, and productionize it using cloud services like AWS and Azure.
You want to track business KPIs and metrics? No problem! I can help you develop reports using Tableau and Power BI, which will always keep you ahead in your business.
Specialities:
Databases: Snowflake, Postgres, DynamoDB, Graph DB - Neo4j, MongoDB, Data Warehouse concepts, MSSQL
ETL Tools: Talend Data Integration Suite, Matillion, Informatica, Databricks
API Integration: Salesforce / Google AdWords / Google Analytics / Marketo / Amazon MWS - Seller Central / Shopify / HubSpot / FreshDesk / Xero
Programming: Java, SQL, HTML, Unix, Python, Node.js, React.js
Reporting Tools: Yellowfin BI, Tableau, Power BI, SAP BO, Sisense, Google Data Studio
AWS Platform: S3, AWS Lambda, AWS Batch, ECS, EC2, Athena, AWS Glue, AWS Step Functions
Azure Cloud Platform
Other Tools: Airflow
Expect integrity, excellent communication in English, technical proficiency, and long-term support.
Skills: Databricks MLflow, Databricks Platform, Tableau, Microsoft Power BI, Data Extraction, Talend Data Integration, Data Analysis, Microsoft Azure, Continuous Integration, AWS Lambda, API, Database, Python, SQL, ETL
- $20 hourly
- 5.0/5
- (11 jobs)
BEng Computing Systems & Network (Hons) graduate from the University of Greenwich, London. I have been working in the industry for the past 10 years in development, network security, and VoIP. Cisco TAC Engineer, currently working as a Java Developer specialising in Java.
Skills: Angular, Spring Boot, VoIP, Java, Kerberos, Network Security, Internet Protocol Security
- $35 hourly
- 5.0/5
- (10 jobs)
I have over 12 years of experience in the information technology industry. I have worked in object-oriented programming using the Java and PHP programming languages, and I know relational (MySQL, PostgreSQL) and NoSQL (MongoDB, CouchDB) databases well. I can manage system administrator tasks as needed on Linux-based platforms. For the last few years I've worked as a Java developer on J2EE client-server projects. Before that, I worked for two years as a systems administrator at an internet service provider.
Skills: Core Java, Apache Tomcat, JDBC, Ext JS, Apache Struts, Spring Framework, jQuery, AJAX, HTML, CSS, Hibernate, Jakarta Server Pages, JavaScript, SQL, Java
- $40 hourly
- 4.6/5
- (29 jobs)
I am a highly experienced data science freelancer with over 20 years of experience in the field. Throughout my career, I have developed a deep understanding of the principles and techniques of data science and have applied this knowledge to a wide range of projects. With a strong background in data analysis, machine learning, deep learning, and statistical modeling, I am able to quickly and accurately extract insights from large and complex datasets. My expertise in programming languages such as Python, R, and SQL enables me to implement these insights in a scalable and efficient manner.
In addition to my technical skills, I have a passion for using data science to drive business results. I have worked with a wide range of organizations, from startups to large corporations, and have helped them to use data to inform their decision-making, optimize their operations, and achieve their strategic goals.
I am a self-starter with a strong work ethic. I am able to work independently or as part of a team, and I have excellent communication skills, which allow me to effectively collaborate with stakeholders at all levels of an organization.
I have worked on engagements across multiple domains, solving numerous problem statements, including:
Telecom:
- Worked on a churn use case for the largest cellular company in the United States; analyzed customer survey data, feedback, and social media data to determine customer experience and sentiment; performed demand identification and forecasting based on customer service records and customer engagement across services; recommended and planned marketing campaigns, offers, and new personalized packs based on customer history.
Automotive:
- Designed and deployed an automated application for a leading automotive testing company through which the client can see machine failure predictions ahead of time. The application also visualizes the predicted parameter values over time with graphs and data tables, along with the possible causes of and remedies for the errors.
- Worked on paint shop defect analysis for the world's largest car manufacturer.
BFSI:
- Worked with asset management firms and investment management organizations; developed solutions for fraud detection, fraud prediction, credit risk analysis, stock prediction, investment planning, and investment portfolio analytics.
- Developed an accurate cryptocurrency prediction model that can predict the rise and drop/crash of various coins based on historical data, social media data, and market research data.
NLP/Chatbots:
- Developed real-time chatbot applications.
- Developed a custom NER model for entity recognition.
Logistics:
- Conceptualized and implemented route optimization algorithms for transportation and logistics companies, identifying the best route; used a variety of customer data feeds and a complex optimization algorithm to compute and recommend the best route for fuel savings.
Healthcare:
- Developed and optimized novel deep learning-based approaches to automate many aspects of medicine, including disease diagnosis and preventative medicine.
- Created a deep learning neural network that processes MRI, PET-FDG, Amyloid, and Tau image data from the ADNI database, and developed a classification and prediction model to predict Alzheimer's disease.
Skills: Operations Research, Artificial Intelligence, Data Visualization, Data Analysis, Statistical Analysis, Optimization Modeling, AWS Application, Large Language Model, Generative Model, Natural Language Processing, Deep Learning, Machine Learning, Chatbot, Python, Data Science
- $20 hourly
- 5.0/5
- (17 jobs)
Experienced Software Engineer with a demonstrated history of working in the information technology and services industry. Skilled in Java, Spring Boot, DevOps, Jenkins, Ansible, Eureka, React, and Groovy.
Skills: Apache NiFi, Docker, Linux, Apache Spark MLlib, DevOps, Ansible, Apache Hadoop, Big Data, Apache Spark, Elasticsearch, Python, Cloud Computing, JavaScript, Java
- $55 hourly
- 4.8/5
- (46 jobs)
I'm looking for challenging and creative projects which allow me to focus on my true passion: coding! I'm an expert in Java development with over 10 years' experience, and I've worked for some of the best software development houses in Australia. I also have several years of Scala development experience. I'm looking for additional challenges outside the usual demands of a typical 9-5 job.
Skills: jQuery, J2EE, JavaScript, Hibernate, SQL, Spring Framework, Java, Scala, ASP.NET MVC, Spring Security
- $25 hourly
- 4.9/5
- (79 jobs)
PROGRAMMING TECHNOLOGY EXPERTISE
* Python, Django, FastAPI, Flask, Selenium, REST API
* React.js, Next.js, Vue.js, Angular
* React Native
* Flutter
DEVOPS & CLOUD & CYBER SECURITY EXPERTISE
* AWS cloud solution design and development
* OpenSearch, Elasticsearch, Kibana, and Logstash setup, configuration, and development integration
* Ansible
* Docker
* Jenkins
* GitLab-based CI/CD
* Prometheus and Grafana
* SIEM
* Suricata/Snort
* Bro (Zeek)
* HashiCorp Vault
* Cybersecurity project development and consultation
* Kong API gateway integration
Skills: Amazon Elastic Beanstalk, Flutter, React Native, PostgreSQL Programming, ELK Stack, AWS CloudFront, Amazon S3, RESTful API, AWS Lambda, DevOps, Next.js, React, Python, Django, AWS Amplify
- $40 hourly
- 5.0/5
- (141 jobs)
Seeking challenging tasks in the design and development of scalable backend infrastructure solutions. I work in the following domains:
1. ETL Pipelines
2. Data Engineering
3. DevOps and deployment on AWS (Amazon Web Services) and GCP (Google Cloud Platform)
4. Machine Learning
I mainly design all solutions in Python, with 10+ years of experience in the language. I have extensive experience with the following frameworks/libraries: Flask, Django, Pandas, NumPy, PyTorch, Scrapy, and many more.
Regarding ETL pipelines, I provide end-to-end data pipelines using AWS, GCP, or custom frameworks, with more than 7 years of experience in this domain. I have a strong command of Scrapy and have built more than 300 crawlers to date.
Regarding data warehousing, I have extensive experience with Google BigQuery and AWS Redshift, including hands-on experience handling and analyzing millions of records using GCP and AWS data warehousing solutions.
I have 5+ years of experience designing serverless applications using AWS and GCP. In addition, I am hands-on with a wide range of GCP and AWS services and deliver efficient, cost-effective solutions on both platforms.
Skills: Data Analysis, Apache Spark, PySpark, ChatGPT, Generative AI, AWS Glue, Google Cloud Platform, BigQuery, Snowflake, Kubernetes, Django, Docker, Serverless Stack, Python, Scrapy, Data Scraping, ETL Pipeline
Want to browse more freelancers?
Sign up
How it works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.
How do I hire an Apache NiFi Developer on Upwork?
You can hire an Apache NiFi Developer on Upwork in four simple steps:
- Create a job post tailored to your Apache NiFi Developer project scope. We’ll walk you through the process step by step.
- Browse top Apache NiFi Developer talent on Upwork and invite them to your project.
- Once the proposals start flowing in, create a shortlist of top Apache NiFi Developer profiles and interview them.
- Hire the right Apache NiFi Developer for your project from Upwork, the world’s largest work marketplace.
At Upwork, we believe talent staffing should be easy.
How much does it cost to hire an Apache NiFi Developer?
Rates charged by Apache NiFi Developers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.
Why hire an Apache NiFi Developer on Upwork?
As the world’s work marketplace, we connect highly skilled freelance Apache NiFi Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache NiFi Developer team you need to succeed.
Can I hire an Apache NiFi Developer within 24 hours on Upwork?
Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache NiFi Developer proposals within 24 hours of posting a job description.