Hire the best Apache Tapestry developers

Check out Apache Tapestry developers with the skills you need for your next job.
Clients rate Apache Tapestry developers 4.8/5, based on 775 client reviews.
  • $130 hourly
    AWS RDS | MySQL | MariaDB | Percona | Semarchy xDM | AWS Glue | PySpark | dbt | SQL Development | Disaster Recovery | Business Continuity | ETL Development | Data Governance / Master Data Management | Data Quality Assessments | AppSheet | Looker Studio | Percona PMM *** Please see my portfolio below. *** I have over two decades of experience in a variety of data-systems roles on both cloud-based and on-premise platforms. Throughout my career, I have served in senior-level roles as Data Architect, Data Engineer, Database Administrator, and Director of IT. My technology and platform specialties are diverse, including but not limited to AWS RDS, MySQL, MariaDB, Redshift, Percona XtraDB Cluster, PostgreSQL, Semarchy xDM, Apache Spark/PySpark, AWS Glue, Airflow, dbt, Amazon AWS, Hadoop/HDFS, and Linux (Ubuntu, Red Hat). My services include:
    - Business Continuity, High Availability, Disaster Recovery: ensuring minimal downtime for mission-critical databases through database replication, clustering, and backup testing and validation.
    - Performance Tuning: analyzing database configuration, errors and events, physical resources, physical table design, and SQL queries to address performance issues.
    - Infrastructure Engineering: in the AWS environment I use a combination of Ansible, Python with the boto3 SDK, and the command-line interface (CLI) to create and manage a variety of AWS services, including EC2, RDS, S3, and more.
    - System Monitoring: maintaining historical performance metrics for proactive capacity planning, immediate outage detection, alerting, and optimization analysis, using tools including Percona Monitoring & Management (PMM) and AWS tools such as Performance Insights and CloudWatch.
    - ETL Development: developing data processing pipelines using Python, Apache Spark/PySpark, and dbt, with AWS Glue or Airflow for process orchestration. I am experienced in integrating a variety of sources including AWS S3, REST APIs, and all major relational databases.
    - Data Governance / Master Data Management: experienced in all phases of development and administration on the Semarchy xDM Master Data Management platform, including building the infrastructure and installing the software in AWS, entity design, developing the UI components used by data stewards to view and manage master data, creating the internal procedures for data enrichment, validation, and duplicate consolidation, data ingestion (ETL), and dashboard creation.
    Skills:
    Database Management
    Looker Studio
    Data Lake
    Apache Airflow
    AWS Glue
    PySpark
    Amazon RDS
    dbt
    System Monitoring
    Master Data Management
    High Availability and Disaster Recovery
    MySQL
    MariaDB
    Database Administration
    SQL Programming
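For illustration, the extract-transform-load pattern this profile describes (Python pipelines feeding a relational target) can be sketched in a few lines. This is a generic, standard-library-only example, not this freelancer's code; the table and field names are invented:

```python
import sqlite3

def run_etl(rows, conn):
    """Load cleaned (name, amount) rows into a hypothetical `orders` table."""
    # Transform: trim names, coerce amounts, drop rows with empty names.
    cleaned = [
        (name.strip().title(), round(float(amount), 2))
        for name, amount in rows
        if name and name.strip()
    ]
    # Load: commit or roll back atomically.
    with conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (customer TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    return len(cleaned)

conn = sqlite3.connect(":memory:")
loaded = run_etl([("  alice ", "19.99"), ("", "5"), ("bob", "42.5")], conn)
print(loaded)  # 2 rows survive validation
```

Real pipelines of the kind described would swap sqlite3 for Spark/dbt and add orchestration (Glue or Airflow), but the extract-validate-load shape is the same.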
  • $50 hourly
    A backend software engineer with more than 6 years of experience, having worked with large-scale backend/distributed systems and big data systems. Also a DevOps engineer with 4 years of experience, both on-premises and on AWS, experienced with K8s, Terraform, Ansible, and CI/CD. Currently working in a Principal Engineer / Solution Architect role.
    Skills:
    Architectural Design
    GraphQL
    Serverless Computing
    Amazon Web Services
    DevOps
    API Development
    Elasticsearch
    Apache Kafka
    Scala
    Apache Spark
    Docker
    Apache Hadoop
    Kubernetes
  • $40 hourly
    Hi, I am Isha Taneja from Mohali, India, highly skilled in Data Analytics, Engineering & Cloud Computing. I am an expert in creating ETL data flows in Talend Studio, Databricks & Python, using best design patterns and practices to integrate data from multiple data sources. I have worked on multiple projects requiring data migration, data warehousing development, and API integration.
    Expertise:
    1. Migration - Platform Migration (Legacy ETL to Modern Data Pipeline / Talend / ERP Migration / CRM Migration) - Data Migration (Salesforce / HubSpot / Cloud / ERP)
    2. Data Analytics - Data Lake Consulting - Data Warehouse Consulting (Data Modelling / Data Integration / Data Governance / ETL) - Data Strategy (Data Compliance / Data Deduplication / Data Reconciliation / Customized Data Processing Framework / Data Streaming / API Implementation / DataOps) - Business Intelligence (Digital Marketing Analysis / E-commerce Analytics / ERP Reporting Capabilities) - Big Data (Lakehouse Implementation)
    3. Software QA & Testing
    4. Custom Application Development - UI/UX - Frontend Development - Backend Development
    5. Cloud - Cloud-native Services / AWS Consulting / Cloud Migration / Azure Consulting / Databricks / Salesforce
    6. Business Process Automation - Bi-directional sync between applications / RPA
    A data professional and ETL developer with 10+ years of experience working with enterprises and clients globally to define their implementation approach with the right data platform strategy, data analytics, and business intelligence solutions. My domain expertise lies in e-commerce, healthcare, HR, media & advertising, and digital marketing.
    You have the data? Great! I can help you analyze it using Python, performing exploratory data analysis, hypothesis testing, and data visualization. You have big data? Even better! I can help you clean, transform, store, and analyze it using big data technologies, and productionize it using cloud services like AWS and Azure. You want to track business KPIs and metrics? No problem! I can help you develop reports using Tableau and Power BI to keep you ahead in your business.
    Specialities:
    Databases: Snowflake, Postgres, DynamoDB, Graph DB (Neo4j), MongoDB, data warehouse concepts, MSSQL
    ETL tools: Talend Data Integration Suite, Matillion, Informatica, Databricks
    API integration: Salesforce / Google AdWords / Google Analytics / Marketo / Amazon MWS - Seller Central / Shopify / HubSpot / Freshdesk / Xero
    Programming: Java, SQL, HTML, Unix, Python, Node.js, React.js
    Reporting tools: Yellowfin BI, Tableau, Power BI, SAP BO, Sisense, Google Data Studio
    AWS platform: S3, AWS Lambda, AWS Batch, ECS, EC2, Athena, AWS Glue, AWS Step Functions
    Azure Cloud Platform
    Other tools: Airflow
    Expect integrity, excellent communication in English, technical proficiency, and long-term support.
    Skills:
    Databricks MLflow
    Databricks Platform
    Tableau
    Microsoft Power BI
    Data Extraction
    Talend Data Integration
    Data Analysis
    Microsoft Azure
    Continuous Integration
    AWS Lambda
    API
    Database
    Python
    SQL
    ETL
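One of the services this profile lists is data deduplication. As a generic illustration of that step (not this freelancer's code; the record fields here are invented), a common approach is to collapse records sharing a normalized key and keep the most recent:

```python
def dedupe(records):
    """Keep the latest record per normalized email key (hypothetical schema)."""
    latest = {}
    for rec in records:
        key = rec["email"].strip().lower()   # normalize before comparing
        if key not in latest or rec["updated"] > latest[key]["updated"]:
            latest[key] = rec                # newer record wins
    return list(latest.values())

records = [
    {"email": "A@x.com ", "updated": 1, "name": "A v1"},
    {"email": "a@x.com", "updated": 2, "name": "A v2"},
    {"email": "b@x.com", "updated": 1, "name": "B"},
]
deduped = dedupe(records)
print(len(deduped))  # 2: the two A records collapse, keeping "A v2"
```

In practice the normalization and survivorship rules are the hard part; tools like Talend or Databricks apply the same logic at scale.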
  • $25 hourly
    Muaaz is a Senior Software Developer with 7 years of experience. He offers market-leading AI-based automation solutions tailored to meet the unique needs of businesses. With expertise in web development, mobile development, data science, NLP, data analytics, desktop app development, and IoT, he has the skills to deliver comprehensive solutions that streamline operations and drive growth. He has strong expertise in web development using the modern MERN tech stack, with hands-on experience developing web-based systems for clients, ranging from static websites to complex web applications. He has worked on both front-end and back-end development, using technologies such as Angular, React, Next.js, jQuery, HTML, CSS, JavaScript, Node.js, Express.js, NestJS, Laravel, Flask, FastAPI, Django, .NET Core, ASP.NET, Firebase, and AWS. He has also integrated various APIs and libraries, such as Socket.io, WebSockets, Auth0, Passport, Stripe, email, VoIP, Swagger, D3.js, bcrypt, RxJS, Sentry, and Twilio/FreePBX. His expertise in web development includes serverless architecture, database architecture, and deployment to cloud services like AWS, GCP, and Azure. In addition to his expertise in machine learning, including classification, computer vision, natural language processing, prediction/forecasting, clustering, and optimization, he also has experience in ML infrastructure engineering and DevOps. His proficiency in languages like Python, C++, C, and Java, as well as his experience with tools like TensorFlow, PyTorch, OpenCV, and Power BI, makes him a valuable addition to any team. With a comprehensive skill set and a focus on delivering tailored solutions that meet the unique needs of each client, he is confident that he can provide the support your business needs to succeed.
    Skills:
    Embedded System
    Deep Learning
    Machine Learning
    Microcontroller Programming
    Microsoft Power BI
    Data Visualization
    Shopify
    Electronic Circuit Design
    Computer Vision
    Mobile App Development
    Embedded C
    WordPress
    Node.js
    React
    Angular
    WordPress Plugin
    Web Application
    Laravel
    Python
    .NET Framework
  • $45 hourly
    Core expertise: Python, SQL, PySpark, AWS, Azure, data modelling, data engineering, API development, Power BI. AWS: S3, Lambda, Step Functions, EMR, Managed Kinesis, Kinesis Data Streams/Firehose, Redshift, DynamoDB, API Gateway. Azure: Fabric, OneLake, Kusto tables, Event Hubs, Function Apps, Logic Apps, Databricks, Delta tables, Data Factory, Synapse. API development: FastAPI, Django, DRF. Profile: data engineer and cloud developer with a proven ability to deliver end-to-end scalable, robust, optimised, and maintainable data platform solutions. Experienced in big data ETL/ELT pipeline development and optimization, with hands-on experience in the AWS and Azure data stacks and CI/CD. Certified Azure Data Engineer. Excellent analytical and system design skills; I love working with people. Clean and tested code fanatic.
    Skills:
    Microsoft Azure
    CI/CD
    NoSQL Database
    Data Warehousing & ETL Software
    AWS Lambda
    Big Data
    API Development
    Amazon Redshift
    Data Modeling
    Amazon Web Services
    Apache Kafka
    Apache Spark
    Docker
    SQL
    Python
  • $75 hourly
    I am a passionate programmer with good communication skills, looking for a project to join as Tech Lead / Senior Software Engineer. I value developing and maintaining strong relationships. Though I have a good Java background, my current tech stack is Scala, Slick, scala-cats, cats-effect, ZIO, http4s, Play Framework, and sbt. I am a fan of functional programming. I've worked as a senior software developer, team leader, and CTO for both small startups and large enterprises. I have a computer science background (algorithms, data structures, networking, design patterns), experience with Unix operating systems, basic administration skills, knowledge of SQL, etc. For the last 12 years, I have been working as a full-stack web developer. I am a skilled JVM programmer (using languages such as Java, Scala, and Groovy for backend development). The Spring stack and Java EE stack are my favorites for writing Java web apps, though lately I work mainly with the Typelevel stack (Scala, Slick, scala-cats, cats-effect, Play Framework). I also use JavaScript and have experience with AngularJS, jQuery, HTML, CSS, and more. I am interested in the TDD/BDD approach and think it's the right way to develop apps. I prefer JUnit, Hamcrest, and JMockit for writing unit & integration tests.
    Frontend libraries: AngularJS, Vue.js, React, Redux, redux-saga, Next.js, Material UI, Bootstrap, Buefy, Bulma, Remirror, unified.js
    Backend libraries and frameworks: Spring Framework, cats, shapeless, Slick, Spring Boot, Spring Security, JPA, Spring Data JPA, Hibernate, OpenJPA, GWT, JDBC, JUnit, Hamcrest, JMockit, Play Framework, Silhouette, ScalaTest, Akka, Less, Sass, ANTLR, Jasmine, scala-cats
    Languages: Java, Scala, Elm, Groovy, PHP, JavaScript, HTML, SQL
    IDEs and tools: Nginx, Apache, sbt, Maven, Gradle, Git, Jira, IntelliJ IDEA, Jenkins, Jetty, Tomcat, DataTables, Heroku, Slick
    Platforms: Shopify, Stripe, HubSpot
    DevOps: Terraform, Kubernetes (k8s), Docker, Skaffold, Telepresence, AWS, Heroku, Bash, Patroni, postgres-operator, GitHub Actions
    Skills:
    AWS CloudFormation
    Google App Engine
    Haskell
    Functional Programming
    Database Design
    Redis
    PostgreSQL
    SQL
    Scala
    Java
    JavaScript
    AngularJS
  • $100 hourly
    With expertise in both big data technologies and blockchain development, I bring a unique blend of skills to tackle complex projects efficiently and effectively. With a solid background in Computer Vision, Big Data, Data Science, and Smart Contract Development, I have consistently delivered high-quality solutions that drive innovation and operational excellence. Key Skills: - Big Data & Data Science: Proficient in Hadoop, Spark, Kafka, and various data-processing tools. Skilled in implementing and optimizing data ingestion pipelines, real-time streaming applications, and data analysis platforms. - Blockchain Development: Expertise in Solidity for Ethereum, Rust for NEAR and Solana, and smart contract integration using Web3.js and Thirdweb. Successfully ported blockchain SDKs to Unity3D, enabling blockchain functionalities in gaming environments. - Cloud Computing: Extensive experience with Google Cloud, AWS, and Vercel, ensuring seamless deployment and scalability of applications. - Programming Languages: Strong command of Python, Scala, C++, Rust, C#, and JavaScript, enabling me to choose the right tool for any task. - Web Development: Skilled in React, NodeJS, and PHP for developing responsive and user-friendly web interfaces.
    Skills:
    Cryptocurrency
    Rust
    Node.js
    Google Cloud Platform
    Big Data
    Machine Learning
    Data Science
    Data Science Consultation
    Deep Learning Modeling
    Python
    Computer Vision
    TensorFlow
    Scala
    C++
  • $500 hourly
    I'm a Web Scraping Expert, Python Developer, and the Founder of Scrapeak.com (providing professional web scraping/crawling services). I have rich experience and have provided the best solutions to numerous startups, individuals, and companies around the world. To date, I've developed web scrapers for over 2,000 websites/resources across various industries (social media platforms, retail stores, e-commerce marketplaces, real estate, mobile apps, and more). I have expertise in IP rotation, proxy networks, JavaScript rendering, private APIs, etc. I can INTEGRATE, AUTOMATE, MANAGE, and CUSTOMIZE as you wish. I mostly use Python, Requests, Scrapy, BeautifulSoup, Selenium, and lxml. ✅ Customer success ✅ Good communication ✅ Best price & best quality
    Skills:
    Selenium
    Scrapy
    Data Extraction
    Python
    Python-Requests
    Beautiful Soup
    Selenium WebDriver
    Automation
    JSON API
    SQLite
    API
    Scripting
    Web Crawling
    Web Development
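The extraction step at the heart of the scraping work this profile describes can be sketched briefly. The profile's stack is Requests + BeautifulSoup; this illustration uses only Python's standard-library `html.parser` so it runs with no dependencies, and the HTML and class names are invented for the example:

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect the text of every <span class="price"> element."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "span":
            self.in_price = False

# In a real scraper this HTML would come from an HTTP response.
html = '<div><span class="price">$19.99</span><span class="price">$5.00</span></div>'
parser = PriceParser()
parser.feed(html)
print(parser.prices)  # ['$19.99', '$5.00']
```

BeautifulSoup would reduce this to a one-liner (`soup.select("span.price")`), and production scrapers add the fetching, proxy rotation, and JavaScript rendering the profile mentions.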
  • $40 hourly
    I am a responsible, skilled, and results-oriented software engineer.
    Areas of expertise:
    * Python & Django
    * C, C++
    * JS (AngularJS, jQuery, Ajax)
    * Hybrid mobile applications (Android, iOS (iPhone, iPad), Cordova, Ionic)
    * Databases (MySQL, PostgreSQL)
    * AWS services
    Cross-platform apps have been my focus for the past few years. Building a cross-platform app is the best way for startups and lean companies to get their app running: the app gets built faster, requires only a single developer (or a small team), and is exactly as powerful as a native app (except for games). I've been building apps exclusively on the Cordova platform for years, as can be seen in my portfolio overview. I have even more experience building scalable backend services (Django, MySQL) to power those apps so they can grow quickly without the work having to be redone once the app reaches a significant user mass. I have experience in deploying Cordova apps to the App Store and Play Store, deploying Django applications to servers, and building front-end apps using AngularJS and React. With those skills, I'm able to offer an all-inclusive package (app, backend platforms, admin tools) to get your idea up and running while you focus on marketing and new features! I stay flexible across every field of technology and its trends. Thanks a million. Regards, Naveen
    Skills:
    ActionScript
    Apache Cordova
    C
    Kubernetes
    Ansible
    Ionic Framework
    React
    CSS
    JavaScript
    C++
    Docker
    Python
    AngularJS
    Django
  • $25 hourly
    Hi, I am the Founder and CEO of a mobile app design and development agency named App-Knit. We specialise in providing trustworthy and reliable service for your mobile app needs. We have worked on more than 150 Android and iOS mobile app projects in the last 4 years. The mobile apps I have delivered have an average customer rating of 4.5 and above (on a scale of 5); User Experience (UX) and design are at the heart of all the work we do. Our clients love that I always ask questions to fully understand their mobile app needs, ensuring that we create better apps. Quality is my top priority, and so is meeting deadlines. I am always honest, so don't expect me to lie to you or to your customers. I believe successful collaboration depends on truth and effective communication. I don't offer one-size-fits-all solutions. Once you contact me, based on the details you've provided, I'll get back to you with some questions or a price quote. Get in touch today to discuss how I can help you!
    Skills:
    User Experience Design
    iPad App Development
    Mobile UI Design
    User Interface Design
    Mobile App Development
    Android App Development
    iOS Development
    Flutter
    React Native
    React
  • $35 hourly
    I'm Fusion Zhu, with over 10 years of experience in Java development, including 5 years focusing on big data processing and visualisation using Java, Scala, JavaScript, HTML5, Apache Spark, Apache Hadoop, Apache Hive, Apache Flume, Apache HBase, Storm, Kafka, DataX, and ECharts. Throughout my career: I've assisted employers with data ingestion from various sources such as RDBMSs, NoSQL databases, and files by developing utilities on OSS platforms like DataXServer (open source on GitHub) and a Realtime Page Click Statistical System (refer to the Portfolio section) as a Big Data Developer. I've played a key role in building big data platforms using technologies like Hadoop, Spark, Hive, HBase, Flink, Kafka, and Elasticsearch as a Big Data Architect. I've designed and developed web applications, including e-commerce and report systems, using Java, Scala, HTML5, JavaScript, CSS, Spring, Akka, MyBatis, D3.js, Ext JS, jQuery, React, ECharts, and Bootstrap CSS as a Java & front-end developer. I've managed full-stack teams (Java, front-end, QA, and operations) effectively as a Team Leader. Furthermore, I possess extensive skills and experience in microservice design & architecture, container clouds (Docker, Kubernetes), Rust, and Linux. If you're seeking a reputable and reliable professional who consistently delivers results, I'm the one you're looking for. Thank you for visiting my profile, and I look forward to hearing from you!
    Skills:
    React
    Java
    JavaScript
    Scala
    Elasticsearch
    Web Development
    Docker
    OpenLayers
    D3.js
    Rust
    Spring Boot
    Apache Flink
    Apache Kafka
    Apache Spark
    Apache Hadoop
  • $70 hourly
    ⭐️⭐️⭐️⭐️⭐️ "Vikas was very responsive, understood, and respected the limited PoC scope for GPT+LangChain integration. He delivered viable v.1 of PoC within budget. He took time to explain the code in a way I could understand what work he completed. I will use Vikas again." "Would hire Vikas over and over for our work. Great worker! Highly, HIGHLY recommend." ✅ HIGHEST-RATED GPT-4o, RAG, OpenAI, LangChain, LlamaIndex, Machine Learning, Computer Vision, Deep Learning, NLP, and MLOps freelancer on Upwork, specializing in building and deploying robust, low-latency, highly available models ✅ EXPERIENCED PYTHON scripting engineer specializing in cloud services such as AWS, GCP, Salesforce, and Azure. I have solved challenging problems for some of the most innovative startups in the AI/ML space, including: ✅ Deci AI ✅ Aporia ✅ PyImageSearch 🌟 WHY CHOOSE ME OVER OTHER FREELANCERS? 🌟 ✅ Client reviews: I focus on offering value to all of my clients and earning their trust. ✅ Responsiveness: I am extremely responsive and keep all lines of communication promptly open with my clients. ✅ On-time delivery: I deliver quality work that meets the client's expectations, on time and without hiccups. 🏆 The client review below shows the quality of work and value you can expect from working with me: "Despite the huge timezone difference, Vikas managed always to be available. He was easily reachable throughout the process and delivered what we expected on time. Great communication skills. I enjoyed working with Vikas; we recommend him and will surely reach out to him again for projects in his skillset."
    Skills and tools I am experienced in: - Python, JavaScript, C++ - MongoDB, Express.js, React, Node.js - OpenAI, Cohere, Gemini, AssemblyAI - KNN, Decision Trees, Random Forest, SVM, PCA, ensemble modeling - TensorFlow, Keras, PyTorch, fastai, OpenCV, Hugging Face Transformers, BERT, GPT-4o, Whisper, LLaMA 3 - MLOps, DevOps - REST APIs, Flask, FastAPI, Docker, Kubernetes - AWS EC2, SQS, Lambda, API Gateway, Fargate, CloudFormation, Datorama
    Skills:
    Midjourney AI
    Stable Diffusion
    DevOps
    Amazon Web Services
    AWS Lambda
    Amazon EC2
    Deep Learning
    Computer Vision
    Amazon SageMaker
    Python
    PyTorch
    Chatbot Development
    GPT-4o
    Azure OpenAI Service
    OpenAI API
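The retrieval step in the RAG pipelines this profile mentions can be shown with a toy sketch. Real systems use embedding models and a vector store (e.g. via LangChain or LlamaIndex); here, cosine similarity over simple word counts stands in for embeddings so the example stays dependency-free, and the documents are invented:

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for _, d in sorted(scored, reverse=True)[:k]]

docs = [
    "Invoices are due within 30 days of receipt.",
    "The office is closed on public holidays.",
]
top = retrieve("when are invoices due", docs)
print(top[0])
```

In a production RAG system the retrieved passages would then be inserted into the LLM prompt; the ranking idea is the same, just with learned embeddings instead of word counts.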
  • $500 hourly
    I excel at analyzing and manipulating data, from megabytes to petabytes, to help you complete your task or gain a competitive edge. My first and only language is English. My favorite tools: Tableau, Alteryx, Spark (EMR & Databricks), Presto, Nginx/OpenResty, Snowflake, and any Amazon Web Services tool/service (S3, Athena, Glue, RDS/Aurora, Redshift Spectrum). I hold these third-party certifications: - Alteryx Advanced Certified - Amazon Web Services (AWS) Certified Solutions Architect - Professional - Amazon Web Services (AWS) Certified Big Data - Specialty - Amazon Web Services (AWS) Certified Advanced Networking - Specialty - Amazon Web Services (AWS) Certified Machine Learning - Specialty - Databricks Certified Developer: Apache Spark™ 2.X - Tableau Desktop Qualified Associate
    I'm looking for one-time and ongoing projects. I especially enjoy working with large datasets in the finance, healthcare, ad tech, and business operations industries. I possess a combination of analytic, machine learning, data mining, and statistical skills, plus experience with algorithms and software development/authoring code. Perhaps the most important skill I possess is the ability to explain the significance of data in a way that others can easily understand. Types of work I do: - Consulting: how to solve a problem without actually solving it. - Doing: solving your problem based on your existing understanding of how to solve it. - Concept: exploring how to get the result you are interested in. - Research: finding out what is possible, given a limited scope (time, money) and your resources. - Validation: guiding your existing or new team as it solves your problem. My development environment: I generally use a dual-computer, quad-monitor setup to access my various virtualized environments over my office fiber connection. This allows me to use any OS needed (macOS/Windows/*nix) and to rent any AWS hardware needed for faster project execution and to simulate clients' production environments as needed. I also have all tools installed in the environments where they make the most sense. I'm authorized to work in the USA. I can provide signed nondisclosure, noncompete, and invention assignment agreements above and beyond the Upwork terms if needed. However, I prefer to use the pre-written Optional Service Contract Terms: www [dot] upwork [dot] com/legal#optional-service-contract-terms.
    Skills:
    CI/CD
    Systems Engineering
    Google Cloud Platform
    DevOps
    BigQuery
    Amazon Web Services
    Web Service
    Amazon Redshift
    ETL
    Docker
    Predictive Analytics
    Data Science
    Apache Spark
    SQL
    Tableau
  • $40 hourly
    I am a developer focused on providing highly efficient software solutions. - Full Stack Developer - Data Scientist
    Skills:
    Apache Spark
    Cloudera
    CakePHP
    Apache HBase
    Apache Hadoop
    Laravel
    Python
    PHP
    MongoDB
    JavaScript
  • $100 hourly
    I have over 4 years of experience in data engineering, especially using Spark and PySpark to extract value from massive amounts of data. I have worked with analysts and data scientists, conducting workshops on working in Hadoop/Spark and resolving their issues with the big data ecosystem. I also have experience in Hadoop maintenance and building ETL, especially between Hadoop and Kafka. You can find my profile on Stack Overflow (link in the Portfolio section), where I help mostly with spark- and pyspark-tagged questions.
    Skills:
    MongoDB
    Data Warehousing
    Data Scraping
    ETL
    Data Visualization
    PySpark
    Python
    Data Migration
    Apache Airflow
    Apache Spark
    Apache Kafka
    Apache Hadoop
  • $40 hourly
    Successful delivery of 10+ complex client-facing projects and exposure to the telecom, retail, automobile, and banking industries, with a focus on data, analytics, and development of the right analytical and consulting skills to deliver in any challenging environment. Strong track record in data engineering, with hands-on experience successfully delivering challenging implementations. I offer data services and implementation to set up data warehouses and data solutions for analytics and development in retail, telecom, fintech, automobile, etc. I am a software and data developer. I earned a Bachelor's degree in computer science and have 10+ years of experience in data engineering and cloud infrastructure. Tech stack: * Snowflake * Azure Data Factory * Microsoft Fabric * Teradata (certified) * Informatica (certified) * WhereScape RED * Airflow * AWS Athena and EC2 * Python, Pandas & NumPy * Data Warehousing (certified) * Data Scraping, Data Mining * Data Modeling * Netezza, DB2 * Oracle PL/SQL * C# .NET * Automation * SQL & NoSQL databases
    Skills:
    PDF Conversion
    Web Crawling
    Data Integration
    Data Vault
    Python
    Informatica
    API
    Snowflake
    Data Warehousing
    Database Management
    ETL Pipeline
    Apache Airflow
    MySQL
  • $55 hourly
    I focus on data engineering, software engineering, ETL/ELT, SQL reporting, high-volume data flows, and development of robust APIs using Java and Scala. I prioritize three key elements: reliability, efficiency, and simplicity. I hold a Bachelor's degree in Information Systems from Pontifícia Universidade Católica do Rio Grande do Sul, as well as graduate degrees in Software Engineering from Infnet/FGV and Data Science (Big Data) from IGTI. In addition to my academic qualifications, I have acquired a set of certifications: - Databricks Certified Data Engineer Professional - AWS Certified Solutions Architect – Associate - Databricks Certified Associate Developer for Apache Spark 3.0 - AWS Certified Cloud Practitioner - Databricks Certified Data Engineer Associate - Academy Accreditation - Databricks Lakehouse Fundamentals - Microsoft Certified: Azure Data Engineer Associate - Microsoft Certified: DP-200 Implementing an Azure Data Solution - Microsoft Certified: DP-201 Designing an Azure Data Solution - Microsoft Certified: Azure Data Fundamentals - Microsoft Certified: Azure Fundamentals - Cloudera CCA Spark and Hadoop Developer - Oracle Certified Professional, Java SE 6 Programmer. My professional journey has been marked by deep involvement in the world of big data solutions. I've fine-tuned my skills with Apache Spark, Apache Flink, Hadoop, and a range of associated technologies such as HBase, Cassandra, MongoDB, Ignite, MapReduce, Apache Pig, Apache Crunch, and RHadoop. Initially I worked extensively with on-premise environments, but over the past five years my focus has shifted predominantly to cloud-based platforms. I've dedicated over two years to mastering Azure and I'm currently immersed in AWS. I have extensive experience with Linux environments, as well as strong knowledge of programming languages like Scala (8+ years) and Java (15+ years). In my earlier career phases, I worked with Java web applications and Java EE applications, primarily leveraging the WebLogic application server and databases like SQL Server, MySQL, and Oracle.
    Skills:
    Scala
    Apache Solr
    Apache Kafka
    Apache Spark
    Bash Programming
    Elasticsearch
    Java
    Progress Chef
    Apache Flink
    Apache HBase
    Apache Hadoop
    MapReduce
    MongoDB
    Docker
  • $30 hourly
    Welcome! I’m Akhtar, an AWS Solution Architect and full-stack dev with over 7 years of experience. My expertise lies in full-stack development (MERN), Python, and advanced AWS cloud solutions. Why collaborate with me? ✅ Technical expertise: advanced proficiency in Python (Django, Flask, FastAPI), JavaScript, the MERN (MongoDB, Express.js, React, Node.js) stack, and AWS cloud services ✅ Full-stack development: proven record of delivering dynamic, responsive web applications from concept to deployment ✅ Cloud mastery & architectural prowess: expert in building scalable, cost-effective serverless architectures and containerized solutions ✅ Security and DevOps: I integrate security best practices and CI/CD pipelines to enhance development efficiency and safety 🥇 Value proposition: ➤ Full-stack development: with me you get a master of both backend and frontend technologies, which enables me to deliver complete web applications from conception to deployment, ensuring consistency and high performance across the MERN and Python stacks, from conceptualizing an idea in Python to integrating frontend intricacies using JavaScript and full-stack capabilities with MERN ➤ AWS expertise: leveraging the power of AWS to provide scalable and cost-effective solutions; because of my AWS Certified background, I always ensure optimized architectures for swift response times and efficient operations ➤ Quality assurance: being detail-oriented, rigorous testing protocols are a standard part of my workflow, guaranteeing high-quality outcomes 🤝 Effective collaboration: I firmly believe that open communication and mutual respect form the bedrock of successful projects. Understanding your vision and goals while maintaining transparency is my utmost priority. 🕛 Proven track record: with 10,000+ hours logged on Upwork and several successful projects, my experience is evidenced by my results.
    💡 Your vision, my blueprint: whether you’re migrating to the cloud, crafting a new digital solution, or optimizing existing architectures/code, I’m here to translate your aspirations into tangible digital solutions. Let’s connect now for a dynamic and efficient digital solution tailored to your needs!
    Tailwind CSS
    Django
    Web Development
    Amazon Web Services
    Next.js
    TypeScript
    Python
    JavaScript
    AWS Lambda
    API Integration
    API Development
    NestJS
    MongoDB
    Node.js
    React
  • $35 hourly
    Seasoned data engineer with over 11 years of experience building sophisticated, reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork. 🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years. 🏆 Main areas of expertise:
📍 Big Data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS Redshift, AWS SQS, AWS RDS, AWS EMR
📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsight, Azure SQL
📍 Google Cloud Services - GCP Dataproc
📍 Search Engine - Apache Solr
📍 NoSQL - HBase, Cassandra, MongoDB
📍 Platform - Data Warehousing, Data Lake
📍 Visualization - Power BI
📍 Distributions - Cloudera
📍 DevOps - Jenkins
📍 Accelerators - Data Quality, Data Curation, Data Catalog
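The ETL pipelines described above all share the same extract-transform-load shape. As an illustration only (with made-up data, standing in for the Spark, Glue, or Airflow jobs a real engagement would involve), a miniature version can be sketched in plain Python:

```python
import csv
import io

# Illustrative raw input; a real pipeline would read from S3, Kafka, or a database.
RAW = """id,amount,country
1,10.50,us
2,,de
3,7.25,us
"""

def extract(text):
    # Extract: parse CSV rows into dictionaries.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: drop rows missing an amount, cast types, normalise country codes.
    return [
        {"id": int(r["id"]), "amount": float(r["amount"]), "country": r["country"].upper()}
        for r in rows
        if r["amount"]
    ]

def load(rows, sink):
    # Load: append to an in-memory list standing in for a warehouse table.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)  # loads the 2 valid rows
```

The same three stages map directly onto a Spark job (read → DataFrame transforms → write) or a Glue/Airflow task graph; only the engines and connectors change.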
    SQL
    AWS Glue
    PySpark
    Apache Cassandra
    ETL Pipeline
    Apache Hive
    Apache NiFi
    Apache Kafka
    Big Data
    Apache Hadoop
    Scala
    Apache Spark
  • $200 hourly
    A full-stack engineer with a background that lends itself to helping companies stay lean and connected while scaling up their customers and services. With 7 years' experience providing DevOps solutions and services to Finance, Web3.0, and Data Analytics, I have been heavily involved in building scalable platforms for microservices on Kubernetes, migrating infrastructure and services to the cloud, and creating build environments for growing teams of developers.

Skills:
Cloud migrations - AWS, Azure, GCP, DigitalOcean, on-premise, Hetzner
Container orchestration - Kubernetes (k8s), Rancher, Docker Swarm, OpenShift
Infra-as-code - Pulumi, Terraform, CloudFormation, Sceptre
Continuous delivery/integration - Jenkins, DroneIO, Helm, Kubernetes, GoCD, Tilt, Earthly
Databases - Elasticsearch, MongoDB, MySQL, MSSQL, Postgres
Applications - Docker, Nginx, LAMP, CoreOS, Terraform, Tableau, MS Exchange, Nutanix, VMware Horizon/vCenter, Kafka, Atlassian Jira/Confluence, Microsoft SQL Server, Hugo
Networking - DNS, DHCP, VLANs, NAT, Cisco Switch/Firewall
Languages: Strong - PowerShell, Bash, Python, YAML, JSON; Intermediate - GoLang, NodeJS, JavaScript, HTML, CSS, C#, T-SQL; Basic - Haskell, OCaml, Rust

Achievements (in the last 2 years):
- Re-engineered SaaS architecture, migrating all production to microservices on Kubernetes and reducing the company's total software expenditure by 40%
- Developed Terraform templates for an automated multi-cloud disaster recovery solution
- Implemented build pipelines that let developers work with isolated, identical versions of dev, test, and prod
- Product Owner and Scrum Master of an Agile software development project for an iPhone app
- Advocated for a transparent business vision by employing OKRs and helped align cascading team OKRs down through the organisation
- Automated the provisioning of on-premise Kubernetes clusters and build pipelines using Matchbox, Bash, and Helm templating
Qualifications and education:
2020 - Kubernetes Certified Application Developer
2020 - Kubernetes Certified Administrator
2018 - AWS Certified Developer Associate
2018 - Agile Certified Practitioner
2008 - First-class degree in Electronic Engineering and Cybernetics

Please get in touch if you think my background can be helpful to you.
    Apache Spark
    Grafana
    Amazon ECS
    Kubernetes
    Docker Compose
    Amazon ECS for Kubernetes
    Continuous Integration
    Docker
    Jenkins
    Amazon Web Services
    DevOps
    Terraform
    Microsoft Azure
  • $40 hourly
    🔍🚀 Welcome to a world of data-driven excellence! 🌐📊 Greetings, fellow professionals! I am thrilled to introduce myself as a dedicated Data Consultant / Engineer, leveraging years of honed expertise across a diverse spectrum of data stacks 🌍. My journey has been enriched by a wealth of experience, empowering me with a comprehensive skill set that spans Warehousing📦, ETL⚙, Analytics📈, and Cloud Services☁. Having earned the esteemed title of GCP Certified Professional Data Engineer 🛠, I am your partner in navigating the complex data landscape. My mission is to unearth actionable insights from raw data, shaping it into a strategic asset that fuels growth and innovation. With a deep-rooted passion for transforming data into valuable solutions, I am committed to crafting intelligent strategies that empower businesses to flourish. Let's embark on a collaborative journey to unlock the full potential of your data. Whether it's architecting robust data pipelines ⛓, optimizing storage solutions 🗃, or designing analytics frameworks 📊, I am dedicated to delivering excellence that transcends expectations. Reach out to me, and together, let's sculpt a future where data powers success. Thanks!
    PySpark
    Machine Learning
    Natural Language Processing
    Informatica
    Data Science
    Data Warehousing
    Snowflake
    Data Analysis
    Big Data
    BigQuery
    ETL
    Apache Airflow
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Python
    Apache Hive
  • $25 hourly
    Around 5 years of experience in Data Engineering with diversified tools and technologies.
- Experienced in transforming raw data into meaningful insights, ensuring data quality and integrity, and optimizing data processes for efficient analysis.
- Knowledge and hands-on experience of working in cloud stacks such as Azure, AWS, and GCP, and cloud-agnostic layers like Snowflake and Databricks.
- Experience in the design and development of ETL jobs using SSIS, Airflow, Prefect, Nomad, and Informatica.
- Worked on the Microsoft BI product family, namely SSIS (SQL Server Integration Services) and SSRS (SQL Server Reporting Services).
- Excellent problem-solving skills and a strong technical background, with the ability to meet deadlines, work under pressure, and quickly master new technologies and skills.
- Working experience with Agile development models and CI/CD pipelines.
- Proficient in coordinating and communicating effectively with project teams, with the ability to work both independently and collaboratively.

I am dedicated to providing analytics solutions to companies and helping them grow their business by extracting meaningful information from their data. I firmly believe that applying machine learning and data science techniques to a business can be very beneficial for its growth in today's competitive market.
    Data Analysis
    Google Cloud Platform
    Nomad
    Apache Airflow
    Data Management
    Apache NiFi
    Apache Impala
    Apache Hive
    Snowflake
    Big Data
    Cloudera
    Machine Learning
    Python
    SQL
    Informatica
    Apache Spark
  • $139 hourly
    AI and Machine Learning Expert: Holding a master's degree from the prestigious Moscow Institute of Physics and Technology (MIPT), one of Russia's premier institutions, I bring a robust academic foundation to the practical application of AI. As a passionate innovator in AI, I leverage the might of Deep Learning and Machine Learning models to tackle intricate challenges in Natural Language Processing (NLP) and Computer Vision. With a track record of driving projects that have increased operational efficiency by up to 40% and resulted in cost savings in excess of $1M, I empower businesses to make data-driven decisions. The rigors of my advanced education at MIPT, renowned for its rigorous academic program and innovative approach, have equipped me with a strong conceptual understanding and the ability to apply these concepts to real-world problems. This has been instrumental in my successful professional journey. My Computer Vision proficiency includes navigating complex landscapes such as Generative Adversarial Networks, object detection in images, real-time video stream analytics, object tracking, face recognition, and human activity recognition. I am also adept at segmentation, style transfer, and image stitching. In the domain of NLP, my specialties include text sentiment classification, language modeling, Q/A models, multiword expressions, neural machine translation, text summarization, and analytics. These skills allow me to extract value from unstructured data, transforming it into actionable insights for your business, and have been instrumental in driving over $2M in new revenue opportunities. Of particular note is my expertise with the GPT models, including the revolutionary ChatGPT. I have extensively utilized this AI-powered chatbot in various applications, from customer service to content generation.
By harnessing the power of GPT's advanced language understanding, I can bring a new level of automation and precision to your projects. Implementing ChatGPT in customer service roles has improved customer satisfaction rates by 35% and delivered savings of up to $1.5M annually. I employ Bayesian modeling approaches and Mixture Density Networks for structured tabular data, ensuring the most efficient use of your data resources. My interests lie in Computer Vision, Natural Language Processing, Deep Learning, Bayesian Modelling, Graph Theory, Algorithms and Data Structures, and Machine Learning. I have extensive experience with technologies such as TensorFlow, Keras, PyTorch, CNNs, GANs, RNNs, bidirectional LSTMs, GRUs, Bayesian modelling, XGBoost, gradient boosting, decision trees, logistic regression, and random forests. One of my works with transformer models is demonstrated by a co-authored paper available on ResearchGate: "Identification of Multiword Expressions using transformers". Let's join forces to elevate your business through a data science, machine learning, or deep learning project. Together, we can navigate complex problems and uncover solutions that yield tangible results. Best Regards, Sritanu Chakraborty
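The text sentiment classification mentioned above can be illustrated with a toy Naive Bayes model. This is a deliberately tiny sketch with made-up training data, not this freelancer's method; production systems would use the transformer models named in the profile:

```python
import math
from collections import Counter

# Hypothetical training examples, purely for illustration.
TRAIN = [
    ("great product loved it", "pos"),
    ("excellent service very happy", "pos"),
    ("terrible experience hated it", "neg"),
    ("awful quality very disappointed", "neg"),
]

def fit(examples):
    # Count word occurrences per class.
    counts = {"pos": Counter(), "neg": Counter()}
    for text, label in examples:
        counts[label].update(text.split())
    return counts

def predict(counts, text):
    vocab = set(counts["pos"]) | set(counts["neg"])
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        # Sum log-probabilities with add-one (Laplace) smoothing.
        scores[label] = sum(
            math.log((c[w] + 1) / (total + len(vocab))) for w in text.split()
        )
    return max(scores, key=scores.get)

model = fit(TRAIN)
label = predict(model, "loved the excellent service")  # classified as "pos"
```

Transformer-based classifiers replace the word counts with learned contextual embeddings, but the classification step, scoring each label and taking the argmax, has the same shape.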
    Data Visualization
    Data Mining
    Deep Learning Modeling
    Computer Vision
    TensorFlow
    Anomaly Detection
    PyTorch
    Bayesian Statistics
    deeplearn.js
    Deep Learning
    Python Scikit-Learn
    NumPy
    Python
    Data Science
    Keras
    Machine Learning
    Natural Language Processing
  • $30 hourly
    🏆 Google Certified TensorFlow Developer 🏆 AWS Certified Machine Learning - Specialty 🏆 AWS Certified Data Analytics - Specialty 🏆 GCP Certified Professional Machine Learning Engineer

7+ years of comprehensive industry experience in computer vision, Natural Language Processing (NLP), predictive modelling, and forecasting.

➤ Generative AI Models
📍 OpenAI (GPT-3/4, ChatGPT, Embeddings)
📍 GCP - Gemini, Gemma, PaLM, Embeddings, Imagen 2, MedLM
📍 Stable Diffusion - LoRA, DreamBooth
📍 Large Language Models (LLMs) - BLOOM, Llama 3, Llama 2, Falcon, Claude, Mistral

➤ Generative AI Frameworks
📍 RAG frameworks - LangChain, Langfuse, LangSmith, LlamaIndex
📍 LLM serving frameworks - vLLM, Ray, TorchServe, Hugging Face TGI, TensorRT-LLM
📍 Vector databases - Pinecone, Qdrant, ChromaDB, Milvus, Weaviate, Redis, pgvector, Elasticsearch
📍 Prompt optimization - DSPy, TextGrad
📍 Speech and voice - Deepgram, ElevenLabs, vapi.ai

➤ ML Frameworks
📍 TensorFlow 📍 PyTorch 📍 Hugging Face 📍 Keras 📍 Scikit-learn 📍 Spark ML 📍 NVIDIA DeepStream SDK Development

➤ DevOps
📍 CI/CD 📍 Git, GitHub Actions 📍 AWS - CodeCommit, CodeBuild, CodeDeploy, CodePipeline, CodeStar

➤ Cloud Skills
📍 AWS - SageMaker, Comprehend, Translate, Textract, Polly, Forecast, Personalize, Rekognition, Transcribe, IoT Core, IoT Greengrass, Bedrock
📍 GCP - Vertex AI, AutoML, Text-to-Speech, Speech-to-Text, Natural Language AI, Translation AI, Vision AI, Video AI, Document AI, Dialogflow, Contact Center AI, Timeseries Insights API, Recommendations AI, Dialogflow CX, GKE, Vertex AI Agent Builder, Cloud Run, Cloud Functions
📍 Azure - Azure ML

➤ Sample work
Applications include but are not limited to:
📍 Sales forecasting 📍 Recommendation engines 📍 Image classification 📍 Object segmentation 📍 Face recognition 📍 Object detection & object tracking 📍 Stable Diffusion generative AI 📍 Augmented Reality 📍 Emotion analysis 📍 Video analytics and surveillance 📍 Text analysis and chatbot development 📍 Image caption generation 📍 Similar-image search engine 📍 Fine-tuning large language models (LLMs) 📍 ChatGPT API
    Artificial Intelligence
    Amazon Redshift
    AWS Glue
    Google Cloud Platform
    Amazon Web Services
    Image Processing
    Python
    Amazon SageMaker
    Computer Vision
    TensorFlow
    Machine Learning
    Google AutoML
    PyTorch
    Natural Language Processing
    Deep Learning
  • $50 hourly
    Data Engineer specializing in Big Data, and Data Analyst, with more than 4 years of experience in data processing, including BI, ETL development, reporting, analysis, and batch processing. I have been involved in projects with SQL (Oracle and MS SQL Server), MS SSIS, SSAS, Pentaho, Hadoop, and more recently Python. My ambition is to stay up to date and learn widely used technologies such as Hive and Spark, along with data processing in the cloud. My passion for data engineering has grown since university, starting with a role in Business Intelligence and developing challenging projects involving data processing, data wrangling, extracting information from different sources, and developing ETLs in order to give accurate information to business analysts and stakeholders.
    SQL Server Integration Services
    AWS Lambda
    ETL
    Data Extraction
    Data Cleaning
    PySpark
    SQL
    Python
    AWS Glue
    Pentaho
  • $60 hourly
    Senior Software Engineer with 7 years of experience in functional programming, machine learning, AI, and Big Data. I also have front-end experience building websites and tools.
    Functional Programming
    React
    Big Data
    Apache Kafka
    Akka
    Apache Cassandra
    Amazon DynamoDB
    Databricks Platform
    Machine Learning
    Apache Spark
    Python
    Scala
    JavaScript
  • $55 hourly
    I have more than seven years of hands-on experience in data engineering. My specialities are building data platforms and data pipelines from different sources. I'm keen to work on end-to-end data pipeline builds on AWS or GCP, and I can fix your time- and resource-killing data pipeline issues. Share your gig. Feel the difference. I also have expertise in:
- Full-stack web application development / database design / API development & integration
- DevOps / Linux server administration / deployment / migrations / hosting
- Web automation / scraping / crawlers / bots
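The scraping and crawler work listed above centres on parsing HTML and extracting structured data. Purely as an illustration (with an inline sample page, not a live site), a minimal link extractor can be built with nothing but the Python standard library; real crawlers would use Scrapy with politeness controls such as robots.txt handling and rate limiting:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every anchor tag it encounters."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Inline sample HTML standing in for a fetched page.
page = '<html><body><a href="/jobs">Jobs</a> <a href="/hire">Hire</a></body></html>'
parser = LinkExtractor()
parser.feed(page)  # parser.links now holds the two hrefs
```

Feeding fetched pages through such a parser and enqueueing the discovered links is the core loop of any crawler; frameworks add scheduling, deduplication, and retry logic around it.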
    PySpark
    API
    AWS Lambda
    Amazon Web Services
    ETL Pipeline
    Apache Spark
    Python
    Scrapy
    Amazon S3
    Data Mining
    AWS Glue
    Apache Airflow
    DevOps
    Docker
    Data Migration
  • Want to browse more freelancers?
    Sign up

How it works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.


How do I hire an Apache Tapestry Developer on Upwork?

You can hire an Apache Tapestry Developer on Upwork in four simple steps:

  • Create a job post tailored to your Apache Tapestry Developer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Tapestry Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Tapestry Developer profiles and interview your favorites.
  • Hire the right Apache Tapestry Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Tapestry Developer?

Rates charged by Apache Tapestry Developers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Tapestry Developer on Upwork?

As the world’s work marketplace, we connect highly skilled freelance Apache Tapestry Developers with businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Tapestry Developer team you need to succeed.

Can I hire an Apache Tapestry Developer within 24 hours on Upwork?

Depending on availability and the quality of your job post, it’s entirely possible to sign up for Upwork and receive Apache Tapestry Developer proposals within 24 hours of posting a job description.

Schedule a call