Hire the best Apache Impala developers

Check out Apache Impala developers with the skills you need for your next job.
  • $50 hourly
    Note: Please scroll down to the end to see my employment history.

    ⦿ Work: I help small to mid-sized enterprises with Data Engineering, Business Intelligence & Visualization/Reporting. I also work with education and non-profit organizations looking to convey stories in a visually and factually engaging manner. I don't believe in a one-size-fits-all solution for data management: some companies prioritize performance, while others look for a balance of performance and cost. Similarly, some companies prefer interactive dashboards, whereas others prefer tabular reports delivered to their inboxes daily. I help my clients make the right choice, and I consider myself fortunate to have worked with many different types of clients to arrive at the optimal BI and data visualization solution.

    If you are looking to set up your Business Intelligence from the ground up, this is how I can help:
    1. I will conduct a data maturity assessment of the data landscape in your firm. The deliverable is a detailed current-state report with recommendations.
    2. I will then propose a detailed plan that lays out the ideal solution for your data and reporting landscape.
    3. I will then work on the implementation.

    If you are looking to build dashboards for smaller initiatives, I can build professional dashboards with the right combination of KPIs and metrics. If it is non-profit, research-based storytelling, I can develop infographics and dashboards that effectively convey the story. After completing a project, I provide detailed documentation so you can continue to maintain your dashboards and/or data products (including scripts, code, etc.). Lastly, if you are a product owner in the data and analytics space looking to promote your product, I can build proofs of concept to help you showcase your data tool.

    ⦿ My Skills:
    Data Visualization/Reporting Tools: Tableau, SSRS, Cognos 10.2, Cognos 11.1 Analytics, Google Data Studio, Grafana
    Databases: SQL Server, MySQL, PostgreSQL, AWS Redshift
    Cloud Technologies: AWS AppFlow, AWS S3, AWS Lambda
    Data Engineering (ETL Development) Tools: SSIS, T-SQL, Cognos Framework Manager, Cognos Transformers
    Coding Languages: Python, R
    Agile Project Management Platform: Atlassian (Jira)
    Source Control: Git
    Survey Tools: Qualtrics

    ⦿ Work Highlights:
    1. Custom-built call center analytics performance tool to reduce call volume using Python & Tableau.
    - Built a tool in Tableau comprising multiple dashboard views where users could choose the type of comparisons they wanted to see.
    - Built Python scripts to periodically pull raw data from multiple Google Analytics properties, combine it, and feed it to Tableau on a schedule.
    - Used AWS AppFlow and AWS S3 to export data from Salesforce on a schedule. Built Python scripts to clean this data and run multiple machine learning algorithms (ensembles) to perform sentiment analysis. The scripts were set up to allow continuous training on verified data, and the results were fed to Tableau on a schedule.
    2. Developed dashboards in Tableau to gauge chatbot effectiveness.
    - Used AWS AppFlow to gather chatbot data and store it in S3 for historical tracking. This data was then visualized for metrics such as retention rate, chatbot conversation length, etc.
    - Performed hypothesis testing (two-sample T-test) to determine the chatbot's launch effectiveness, i.e., whether the introduction of the chatbot had a significant effect on call volume reduction.
    3. Developed a chatbot in Python.
    - Used Python NLP algorithms and libraries to build a chatbot that could answer queries related to ticket booking and retrieve booking details.
    4. Developed dashboards in Tableau for non-profit, research-based storytelling.
    - Mined survey data from Google Sheets and developed dashboards in Tableau to relay statistics related to a social cause.
    5. Data warehousing and reporting for a large beauty company.
    - Developed stored procedures and SQL views using SQL Server, SSIS, and T-SQL. The revenue-recognition logic was incorporated into report development so that revenue was realized only after being invoiced. Developed the processes needed to deliver the reports to top executives by email. Developed reports using Cognos Analytics and SSRS.
    6. Developed predictive analytics algorithms in R and Python across domains.
    - Worked on a number of predictive analytics use cases in R and Python using supervised and unsupervised models.

    ⦿ Availability: I am in Stamford, CT and work in any US time zone. If you have a requirement for a different time zone, we can always discuss it.
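    As an illustration of the two-sample T-test mentioned in the work highlights above, here is a minimal sketch of how such a launch-effectiveness check might look in Python with SciPy. The call-volume numbers, sample sizes, and significance threshold are hypothetical placeholders, not figures from this freelancer's project.

    ```python
    # Minimal sketch (hypothetical data): did daily call volume drop after the chatbot launch?
    import numpy as np
    from scipy import stats

    # Daily call counts before and after launch -- placeholder numbers for illustration only.
    calls_before = np.array([512, 498, 530, 521, 505, 517, 509, 526, 515, 500])
    calls_after = np.array([471, 463, 480, 455, 468, 474, 460, 466, 459, 472])

    # Welch's two-sample T-test (does not assume equal variances).
    t_stat, p_value = stats.ttest_ind(calls_before, calls_after, equal_var=False)

    alpha = 0.05  # conventional significance level, chosen here for illustration
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < alpha:
        print("Call volume changed significantly after the chatbot launch.")
    else:
        print("No statistically significant change in call volume detected.")
    ```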
    Apache Impala
    Amazon Redshift
    Amazon CloudWatch
    AWS Lambda
    Microsoft Azure SQL Database
    Data Visualization
    API Integration
    Data Scraping
    Google Analytics
    SQL Server Integration Services
    Python
    Tableau
    Looker Studio
    Microsoft Excel
    SQL
    ETL Pipeline
  • $70 hourly
    Certified Data Professional with 6+ years of experience and hands-on expertise in Big Data, Data Engineering, Data Warehousing, and Data Analytics. If you are looking for someone with a broad skill set, minimal oversight, and an ownership mentality, contact me to discuss in detail the value and strength I can bring to your company.

    I have experience in the following areas, tools, and technologies:
    ► BIG DATA & DATA ENGINEERING: Apache Spark, Hadoop, MapReduce, YARN, Pig, Hive, Kudu, HBase, Impala, Delta Lake, Oozie, NiFi, Kafka, Airflow, Kylin, Druid, Flink, Presto, Drill, Phoenix, Ambari, Ranger, Cloudera Manager, Zookeeper, Spark Streaming, StreamSets, Snowflake
    ► CLOUD: AWS -- EC2, S3, RDS, EMR, Redshift, Lambda, VPC, DynamoDB, Athena, Kinesis, Glue; GCP -- BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Data Fusion; Azure -- Data Factory, Synapse, HDInsight
    ► ANALYTICS, BI & DATA VISUALIZATION: Tableau, Power BI, SSAS, SSMS, Superset, Grafana, Looker
    ► DATABASE: SQL, NoSQL, Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, PL/SQL, HBase, Cassandra
    ► OTHER SKILLS & TOOLS: Docker, Kubernetes, Ansible, Pentaho, Python, Scala, Java, C, C++, C#

    Some of my major projects include:
    - Designing Big Data architectures for the financial and telecom sectors to power their data-driven digital transformation.
    - Implementing Data Lake and Data Warehousing solutions using Big Data tools.
    - Developing ETL workflows using Apache Spark, Apache NiFi, StreamSets, Apache Airflow, etc.
    - Hands-on implementation and architectural design of Data Lakes and Data Warehouses with Big Data and Cloud technologies.
    - Working with Cloudera, Hortonworks, AWS, GCP, and other Big Data and Cloud platforms.

    When you hire me, you can expect:
    - Outstanding results and service
    - High-quality output on time, every time
    - Strong communication
    - Regular and ongoing updates

    Your complete satisfaction is what I aim for, so the job is not complete until you are satisfied!

    Warm Regards,
    Anas
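    To make the ETL-workflow experience described above more concrete, here is a minimal PySpark batch-pipeline sketch of the general read-transform-write pattern such profiles refer to. The bucket paths, column names, and partitioning scheme are illustrative assumptions, not details from this freelancer's projects.

    ```python
    # Minimal PySpark batch ETL sketch -- file paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read raw CSV landed in a data lake path (placeholder location).
    raw = spark.read.csv("s3a://example-bucket/raw/orders/", header=True, inferSchema=True)

    # Transform: basic cleansing and derivation of a partition column.
    clean = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0)
           .withColumn("order_date", F.to_date("order_ts"))
    )

    # Load: write partitioned Parquet for downstream query engines (e.g., Impala or Athena).
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3a://example-bucket/curated/orders/"
    )

    spark.stop()
    ```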
    Apache Impala
    Solution Architecture Consultation
    AWS Lambda
    Apache NiFi
    ETL Pipeline
    Data Management
    Data Warehousing
    AWS Glue
    Apache Spark
    Amazon Redshift
    Apache Hadoop
    ETL
    Python
    SQL
    Marketing Analytics
    Big Data
    Data Visualization
  • $50 hourly
    Hey, this is Akhil, and I'm well-versed in the latest and most innovative uses of the ELK stack. My expertise spans a wide range of technologies, including Elasticsearch, Logstash, Kibana, Beats, Elastic Agents, AWS, Python, and more. I bring a comprehensive skill set to the table, allowing me to tackle complex challenges. My ELK technical expertise covers the following use cases:
    ● Implemented Seamless Website Search: Leveraging Elasticsearch search algorithms, I've implemented seamless website search solutions for clients across various industries. These solutions have boosted user engagement and significantly increased content discoverability, resulting in improved website performance and user satisfaction.
    ● Centralized Monitoring Setup: I've helped organizations gain real-time insights into their applications and infrastructure. My solutions enable proactive issue identification and resolution through real-time dashboards, visualizations, and graphs, reducing downtime and ensuring seamless operations.
    ● Created 24/7 Alerting Mechanisms: My expertise extends to sophisticated 24/7 alerting mechanisms that not only monitor system health but also anticipate potential issues. This proactive approach has enabled my clients to maintain uninterrupted operations and minimize costly downtime. These alerts have been integrated with applications such as MS Teams, Slack, email, Jira, and Outlook.
    ● Designed Cross-Cluster Replication: By setting up cross-cluster replication configurations, I've helped organizations ensure data redundancy and high availability. These configurations are designed to withstand unexpected failures and provide seamless data access.
    ● Applied Anomaly Detection and ML/AI: I've applied machine learning techniques for anomaly detection, enhancing security and forecasting in Elasticsearch. My solutions have identified irregularities in data patterns, allowing for swift and effective responses. I have also integrated packages such as spaCy and Hugging Face for advanced natural language processing into licensed Elasticsearch.
    ● Produced Business Intelligence Visuals: Using Kibana's capabilities, I've created eye-catching visualizations, maps, and dashboards. These visuals enable real-time analytics, simplifying complex data and helping businesses make quick, intelligent decisions that foster growth.
    ● Enhanced SIEM Security: I've played an important role in strengthening security for organizations through SIEM (Security Information and Event Management) solutions, including the creation of detection rules, proactive threat hunting, and comprehensive audit logging, ensuring robust security measures are in place to protect sensitive data and systems.
    ● Provided Search Solutions for Mail Backups: I've developed search solutions tailored specifically for mail backups, allowing clients to efficiently retrieve and analyze critical old emails and attachments. These solutions have proven invaluable in legal and compliance scenarios.
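    As a rough illustration of the website-search use case above, here is a minimal sketch using the official elasticsearch Python client (the 8.x-style API is assumed). The endpoint, index name, and document fields are placeholders, not details from any actual engagement.

    ```python
    # Minimal website-search sketch with the elasticsearch Python client (8.x API assumed).
    # Endpoint, index name, and fields below are placeholders for illustration.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    # Index a couple of example documents.
    docs = [
        {"title": "Apache Impala overview", "body": "Impala is a distributed SQL query engine for Hadoop."},
        {"title": "Elasticsearch basics", "body": "Elasticsearch provides full-text search and analytics."},
    ]
    for i, doc in enumerate(docs):
        es.index(index="site-pages", id=i, document=doc)

    es.indices.refresh(index="site-pages")  # make the documents searchable immediately

    # Full-text search across title and body, boosting title matches.
    resp = es.search(
        index="site-pages",
        query={"multi_match": {"query": "impala sql engine", "fields": ["title^2", "body"]}},
    )
    for hit in resp["hits"]["hits"]:
        print(hit["_score"], hit["_source"]["title"])
    ```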
    Apache Impala
    ELK Stack
    Elasticsearch
    Kibana
    Logstash
    Map Illustration
    SQL
    Dashboard
    Visualization
    Python
    Anomaly Detection
  • $50 hourly
    - One of the rare engineers who can not only write code but also prepare its documentation.
    - More than 5 years of experience in the tech sector, with more than 3 years as a full-stack engineer.
    - Can take an application from conception to completion, with extensive experience building backends in TypeScript and developing front ends with JavaScript frameworks like Next.js and React.js.
    Apache Impala
    Web Development
    API
    Robot Operating System
    C++
    Systems Development
    AWS Amplify
    Python
    Data Science
    React
    JavaScript
  • $20 hourly
    Thank you for visiting my profile. I am an experienced Data Engineer/Software Engineer. I have been working as a Data Engineer at Sterlite Technologies Ltd since April 2021 and have 3.5 years of professional experience in Data Engineering, along with 3+ years in machine learning and data science. I have worked at several large technology companies in India over the past few years. I mainly specialize in data warehousing, ETL pipelines, data modeling, ML models, and general engineering of apps and APIs. I am a highly effective data engineer with expertise in big data projects, well versed in technologies such as Hadoop, Apache Spark, Hive, Linux, Python, Scala, Java, and Spark applications.
    Apache Impala
    Spring Boot
    Back-End Development Framework
    Microservice
    Data Analysis
    Google Cloud Platform
    BigQuery
    MySQL
    Big Data
    Apache Spark
    Apache Kafka
    ETL Pipeline
    Java
    Machine Learning
    Apache Hadoop
    Python
  • $20 hourly
    I'm a Data Engineer with expertise in Cloudera, Hortonworks, Power BI, Tableau, Microsoft Azure, Oracle Database, SQL Server, PostgreSQL, etc. My rates are low because I want to build long-term relationships with clients for future projects on the basis of my performance.

    My services include:
    - Big Data problems
    - Business Intelligence (Power BI, Tableau, etc.)
    - Cloudera cluster management (breadcrumbs, instances, Sentry setup, load balancer, adding services, troubleshooting and fixing errors, etc.)
    - Ambari cluster management (YARN, Sqoop, Hive, HDFS, HDP, HDF, etc.)
    - Power BI reporting (Azure DB, Oracle, on-prem, MySQL, Hive, etc.)
    - Digital marketing
    - Shopify & WordPress web development
    - Data analysis
    - Database schema design
    - Web development and design

    If you hire me, I'll complete your project on time. Quality of work is my first priority and 100% client satisfaction is my guarantee. I believe client appreciation matters more than $$, so please don't hesitate to contact me for any custom project.

    Regards,
    Arslan Shahid
    arslanshahid.com
    Apache Impala
    CentOS
    Linux System Administration
    Tableau
    PowerBI
    Elasticsearch
    Cloudera
    ETL Pipeline
    Apache Cassandra
    Apache Hive
    Apache Spark
    Apache Hadoop
    Big Data
  • $35 hourly
    🔍🚀 Welcome to a world of data-driven excellence! 🌐📊 Greetings, fellow professionals! I am thrilled to introduce myself as a dedicated Data Consultant / Engineer, leveraging years of honed expertise across a diverse spectrum of data stacks 🌐. My journey has been enriched by a wealth of experience, empowering me with a comprehensive skill set that spans Warehousing 📦, ETL ⚙, Analytics 📈, and Cloud Services ☁.

    Having earned the esteemed title of GCP Certified Professional Data Engineer 🛠, I am your partner in navigating the complex data landscape. My mission is to unearth actionable insights from raw data, shaping it into a strategic asset that fuels growth and innovation. With a deep-rooted passion for transforming data into valuable solutions, I am committed to crafting intelligent strategies that empower businesses to flourish.

    Let's embark on a collaborative journey to unlock the full potential of your data. Whether it's architecting robust data pipelines ⛓, optimizing storage solutions 🗃, or designing analytics frameworks 📊, I am dedicated to delivering excellence that transcends expectations. Reach out to me, and together, let's sculpt a future where data powers success. Thanks!
    Apache Impala
    PySpark
    Machine Learning
    Natural Language Processing
    Informatica
    Data Science
    Data Warehousing
    Snowflake
    Data Analysis
    Big Data
    BigQuery
    ETL
    Apache Airflow
    Apache Hadoop
    Apache Spark
    Databricks Platform
    Python
    Apache Hive
  • $30 hourly
    I have 10+ years of IT experience focusing on Business Intelligence (Power BI, Pentaho, Tableau, Talend, Kettle), ETL processes, data warehousing, data modeling, data integration, and data migration, as well as big data and analytics, machine learning, data mining, and software engineering.

    Cloud platforms: AWS, GCP, Azure
    Programming languages: Python, R
    Databases: MongoDB, Cassandra, HBase, and SQL
    Big Data tech: MapReduce, Spark, Kafka, MLlib, Hive, Pig
    Miscellaneous: NumPy, scikit-learn, AWS, Keras, NLTK, Flask, Selenium, Pandas, bs4, SharePoint 2010/2013/2016/2019/Online, .NET Framework, MVC, C#, AngularJS
    Business Intelligence skills: Power BI, Pentaho BI Suite, Tableau, JasperSoft, Crystal Reports, SSAS
    ETL skills: Talend Open Studio, Pentaho Kettle (PDI), SSIS
    Databases: Oracle 10g/9i, MS SQL Server 2005/2012, HP Vertica, MongoDB, PostgreSQL, InfiniDB, Amazon Redshift, Spark Ignite, MS Access, SQL, PL/SQL, SQL*Plus, SQL*Loader, PSP, TOAD

    I have excellent knowledge of Spark, with 4+ years of hands-on experience, and can write Spark applications in Java, Python, and Scala. I have a strong understanding of data modeling (relational, dimensional, star and snowflake schemas), ER diagrams, entities, attributes, cardinality, and data analysis, and have implemented data warehousing solutions on Windows and UNIX. I have developed and provided consulting solutions for more than 20 clients from the retail, telecom, e-commerce, and healthcare domains. Some of the key areas I have mastered are Cloudera Hadoop, ETL framework setup, scripting, error logging, email notifications, exception handling, SCDs, clustering PDI, partitioning ELT jobs, and performance tuning of jobs.

    • Proficiency in designing, developing, and deploying end-to-end Big Data solutions using the entire Hadoop ecosystem.
    • Hands-on experience in Hadoop administration, Hadoop cluster design/setup, HDFS, and MapReduce programming using Java.
    • Hands-on experience developing Spark applications using Scala.
    • Amazon EC2 administration.
    • Experience developing analytics and Business Intelligence solutions using Pentaho, JasperSoft, JSP, and Servlets.
    • Experience in shell script development and Linux commands.
    • Hands-on experience with Maven, SBT, and Jenkins.

    I have completed more than 400 hours of online work with a 97% job success score, including 15+ satisfied clients and 8 SharePoint projects. I focus on Microsoft technologies including SharePoint, Office 365, Power BI, migration, intranets, and website design and development:
    - More than 10 projects with Office 365.
    - Interactive dashboards and charts in Power BI for 4 different projects.
    - 3 complex projects for document and content management.
    - 8 intranet portals.
    - 2 migration projects, covering lower-to-higher version upgrades and on-premises to Office 365 migration.
    - 1 project for project management in SharePoint.
    - 1 project in Project Server Online.

    AWS Certified Solutions Architect with extensive experience developing strategies and implementing solutions using AWS cloud services, Docker containerization, and deployment automation. Experience building and maintaining cloud environments for hosting security tools and maintaining the cloud security tools used to secure production clouds. Expert in integrating security at every phase of the software development lifecycle, from initial design through integration, testing, deployment, and software delivery. Good understanding of cloud costs, time to market, and potential user bases, as well as the ability to build scalable and future-proof application architectures. Experience setting up infrastructure monitoring tools such as Grafana and Prometheus, building and deploying microservices through Jenkins pipelines to a Docker registry and managing them with Kubernetes, and optimizing continuous integration and troubleshooting deployment build issues using triggered logs.
    Apache Impala
    Microsoft Power Automate
    .NET Framework
    MongoDB
    Pentaho
    Ionic Framework
    Microsoft PowerApps
    Talend Data Integration
    Microsoft Power BI
    Business Intelligence
    SQL Programming
    Microsoft SharePoint Development
    Database Administration
    Amazon Redshift
    ETL
    Tableau
  • $25 hourly
    Hello, my name is Mustajab, and I am a highly skilled and experienced professional specializing in automation, web scraping, data analysis, and application development. With expertise in Selenium and RPA tools such as UiPath and Automation Anywhere, I excel in developing efficient and reliable automated solutions. My proficiency in Python, JavaScript, and Apps Script enables me to create robust scripts and automate complex tasks.

    I have extensive experience in web automation, extracting data from various sources, and performing ETL (Extract, Transform, Load) operations to ensure seamless data integration. Additionally, I possess advanced skills in data analysis and visualization using tools like Power BI. I can help you derive meaningful insights from your data and create visually appealing dashboards and reports.

    Having worked with WordPress extensively, I am capable of building custom solutions, plugins, and themes to meet your specific requirements. I can also assist in integrating various automation components to streamline your WordPress workflows. Moreover, I have expertise in developing chatbots and automation bots for platforms like Discord. I can create intelligent bots that automate repetitive tasks, provide real-time information, and enhance user experiences.

    In terms of web development, I am proficient in the Angular, Flask, and React frameworks, allowing me to build dynamic and responsive web applications. Whether you need a simple website or a complex web-based automation solution, I can deliver high-quality results. I am passionate about delivering efficient and reliable automation solutions tailored to your needs. Let's discuss your project requirements, and together we can streamline your processes, increase productivity, and achieve remarkable results. Contact me now to get started!
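    As a small illustration of the Selenium-based web automation described above, here is a minimal Python sketch using the Selenium 4 API: it loads a page headlessly and extracts text from elements matched by a CSS selector. The URL and selector are placeholders, not taken from any client project.

    ```python
    # Minimal Selenium 4 scraping sketch -- URL and CSS selector are placeholders.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.chrome.options import Options

    options = Options()
    options.add_argument("--headless=new")  # run without opening a browser window

    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://example.com/")
        # Collect the text of every heading on the page (selector is illustrative).
        headings = driver.find_elements(By.CSS_SELECTOR, "h1, h2")
        for h in headings:
            print(h.text)
    finally:
        driver.quit()
    ```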
    Apache Impala
    Web Development
    Business Intelligence
    Microsoft Power BI Data Visualization
    Microsoft Power BI
    Web Crawling
    UiPath
    Selenium WebDriver
    Data Scraping
    Data Extraction
    Software QA
    .NET Core
    Selenium
    Python
    React
    Node.js
  • $30 hourly
    I'm a full-stack developer with expertise in Java and the Spring Framework, built up over the past 7 years.

    # Areas of expertise
    - Single-page applications
    - REST API development
    - Web application development

    SERVICES:
    • Backend: Spring, Spring Boot
    • Databases: MongoDB, MySQL, PostgreSQL
    • AWS: Lambda, DynamoDB, DMS, API Gateway, S3, SNS, etc.
    • Tools: Git, GitHub, bug trackers, Jira

    7+ years of professional experience. Many complex systems developed from scratch.
    Apache Impala
    PostgreSQL
    RESTful Architecture
    SOAP
    REST
    CSS 3
    HTML5
    Hibernate
    JavaScript
    Spring Framework
    Spring Security
    Core Java
    Java
  • $60 hourly
    Certified Data Professional and big data engineer with 3+ years of experience. I have been working extensively on building ETL pipelines, designing and synchronizing elegant dashboards, and developing and maintaining end-to-end big data solutions for customers based on various big data tools and technologies such as Apache Spark, Hadoop, Kudu, Hive, Apache NiFi, and more.

    I have experience in the following areas, tools, and technologies:
    ► BIG DATA & DATA ENGINEERING: Apache Spark, Hadoop, MapReduce, YARN, Pig, Hive, Kudu, HBase, Impala, Delta Lake, Oozie, NiFi, Kafka, Airflow, Kylin, Druid, Flink, Presto, Drill, Phoenix, Ambari, Ranger, Cloudera Manager, Zookeeper, Spark Streaming, StreamSets, Snowflake
    ► CLOUD: AWS -- EC2, S3, RDS, EMR, Redshift, Lambda, VPC, DynamoDB, Athena, Kinesis, Glue; GCP -- BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Data Fusion; Azure -- Data Factory, Synapse, HDInsight
    ► ANALYTICS, BI & DATA VISUALIZATION: Tableau, Power BI, QuickSight, SSAS, SSMS, Superset, Grafana, Looker
    ► DATABASE: SQL, NoSQL, Oracle, SQL Server, MySQL, PostgreSQL, MongoDB, PL/SQL, HBase, Cassandra
    ► OTHER SKILLS & TOOLS: Docker, Kubernetes, Ansible, Pentaho, Python, Scala, Java, C, C++, C#
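    Since this profile (like others on the page) lists Apache Airflow for ETL orchestration, here is a minimal, hypothetical Airflow 2.x DAG sketch showing how a daily extract-transform-load job might be wired together. The dag_id, schedule, and task bodies are illustrative assumptions only.

    ```python
    # Minimal Airflow 2.x DAG sketch -- dag_id, schedule, and task logic are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder: pull raw data from a source system.
        print("extracting raw data")


    def transform():
        # Placeholder: clean and reshape the extracted data.
        print("transforming data")


    def load():
        # Placeholder: write curated data to the warehouse.
        print("loading data into the warehouse")


    with DAG(
        dag_id="daily_etl_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load
    ```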
    Apache Impala
    PostgreSQL
    Apache Hive
    Apache Hadoop
    Amazon Athena
    Amazon Redshift
    Amazon S3
    BigQuery
    AWS Lambda
    AWS Glue
    ETL Pipeline
    Python
    Apache Spark
    Apache NiFi
    Big Data
  • $65 hourly
    ⭐ || TOP RATED PLUS || ⭐

    Experienced Data Engineering and Analytics leader with expertise in building and growing amazing teams. Expertise includes data strategy and planning, data engineering, reporting, and insights. Experience with "Big Data" cloud technologies (Spark/Hadoop/Hive/Pig/Presto/Redshift/Snowflake), traditional databases (Oracle, Teradata), traditional BI platforms (MS BI) and visualization tools (Tableau), data pipeline/ETL architecture, and overall business alignment for optimal data impact.

    Specialties: data warehousing, data engineering, data architecture, data modeling, big data batch processing, stream processing, data tools, data analytics, data operations and data lifecycle management, and enabling experimentation and machine learning.

    Some of the technologies that we work with are:
    - Power BI
    - Azure cloud services
    - SQL Server
    - Snowflake
    - EDA
    - Tableau
    - Python
    - BigQuery
    - Microsoft Excel (Macros, VBA)
    - Web development using PHP as well as Node.js
    - PySpark
    - Looker Studio
    - And many more...
    Apache Impala
    Data Analytics & Visualization Software
    Microsoft Power BI
    Data Warehousing & ETL Software
    Data Modeling
    Data Scraping
    Data Mining
    Microsoft Power BI Data Visualization
    ETL
    SQL Programming
    Microsoft SQL SSAS
    Data Engineering
    ETL Pipeline
    Tableau
    Microsoft Excel
    Python
  • $25 hourly
    Dear Client, thank you for having a look at my profile; it would be my pleasure to work with you! Please find below my skill set to match against your requirements.

    Top skills: Power BI, Data Visualization, Tableau, Google Data Studio, Python, SQL, SQL Server, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), Talend, Alteryx, Informatica, Apache NiFi, Apache Hive, Apache Impala, Apache Spark

    Experience: I have worked with a range of clients and can share relevant examples when we discuss your project. New opportunities and challenges excite me.
    Apache Impala
    Amazon Web Services
    Amazon Redshift
    Microsoft Power BI Data Visualization
    Apache Hive
    Data Engineering
    Data Warehousing & ETL Software
    Apache NiFi
    Informatica
    Apache Spark
    Microsoft SQL Server
    Microsoft Power BI
    SQL
    Data Visualization
    Python
  • $30 hourly
    Big Data Consultant, ETL developer, and Data Engineer providing complete Big Data solutions, including data acquisition, storage, transformation, and analysis. I design, implement, and deploy ETL pipelines. I have a degree in Computer System Engineering and 3 years of experience in data-driven work, specifically Big Data solutions, along with business exposure to the banking sector and e-commerce platforms.

    ------------- Data Engineering Skills -------------
    Expertise: Hadoop, HDFS, Unix and Linux, Data Warehousing, Data Integration, Data Reconciliation, Data Consolidation, Dimensional Modeling, Shell Scripting, Web Scraping
    Tools & Libraries: Advanced SQL, Spark, Scala, Python, Hadoop, Hive, SSIS, Sqoop, Impala, AWS, Advanced Excel, Redshift
    Databases: Oracle, MS SQL Server, SQLite
    Apache Impala
    Automation
    Database Programming
    SQL Server Integration Services
    MySQL
    Amazon Web Services
    Big Data
    ETL
    Amazon Redshift
    Data Analysis
    Amazon S3
    SQL
    Python
    ETL Pipeline
    Data Integration
    Apache Spark
  • $25 hourly
    As a Data Engineering professional with a Bachelor's degree in Computer Science, I am an expert in data extraction, modeling, reporting, and database backup and restoration processes. My technical skills include:
    - Experienced in PySpark and Python for data manipulation and analysis
    - Proficient in Python libraries such as NumPy, Pandas, BeautifulSoup4, requests, PyMongo, and Plotly
    - Experienced in Hadoop Impala for writing complex SQL queries
    - Skilled in SQL databases such as PostgreSQL, MariaDB, Spark SQL, and MySQL Server, with experience in MariaDB replication and server installations
    - Experienced with NoSQL databases such as MongoDB, including MongoDB aggregation pipelines, replication, and server installations
    - Proficient in MS Excel
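    To make the "complex SQL queries against Impala" experience above concrete, here is a minimal sketch using the impyla Python client. The host, database, and table names are placeholder assumptions; 21050 is Impala's usual HiveServer2-protocol port, but your cluster may differ.

    ```python
    # Minimal sketch of querying Apache Impala from Python via impyla.
    # Host, database, and table names are placeholders for illustration.
    from impala.dbapi import connect

    conn = connect(host="impala-coordinator.example.com", port=21050, database="sales")
    cur = conn.cursor()

    # A typical analytical query: daily order totals over the last 30 days (hypothetical table).
    cur.execute(
        """
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM fact_orders
        WHERE order_date >= DATE_SUB(NOW(), INTERVAL 30 DAYS)
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for order_date, orders, revenue in cur.fetchall():
        print(order_date, orders, revenue)

    cur.close()
    conn.close()
    ```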
    Apache Impala
    MySQL
    Database Administration
    Business Intelligence
    PostgreSQL
    Data Analysis
    MongoDB
    MariaDB
    Python
    SQL
    Microsoft Excel
    Apache Spark
    Data Engineering
  • $75 hourly
    Tool-oriented data science professional with extensive experience supporting multiple clients in Hadoop and Kubernetes environments, deployed with Cloudera Hadoop on-premise and Databricks in AWS. My passion is client adoption and success, with a focus on usability. With my computer science and applied math background, I have been able to fill the gap between platform engineers and users, continuously pushing for product enhancements. As a result, I have continued to create innovative solutions for clients in an environment where use-cases continue to evolve every day. I find fulfillment in being able to drive the direction of a solution in a way that allows both client and support teams to have open lanes of communication, creating success and growth. I enjoy working in a diverse environment that pushes me to learn new things. I'm interested in working on emerging solutions as data science continues to evolve.
    Apache Impala
    R
    Serverless Stack
    React
    Apache Hadoop
    Java
    Cloudera
    AWS Lambda
    R Hadoop
    Bash Programming
    PostgreSQL
    Apache Spark
    Python
    AWS Development
    Apache Hive
  • $20 hourly
    • 6+ years of experience as a Hadoop/PySpark developer.
    • Extensive knowledge of the Hadoop ecosystem, with experience in storage, writing queries, and processing and analyzing data.
    • Experience migrating on-premises ETL processes to the Hadoop layer.
    • Experience optimizing Hive SQL queries and Spark jobs.
    • Implemented frameworks such as data quality analysis and data validation using Big Data technologies, Spark, and Python.
    • Primary technical skills in PySpark, HDFS, YARN, Hive, Sqoop, Impala, and Oozie.
    • Good exposure to advanced topics such as analytical functions, indexes, and partitioned tables.
    • Experience creating technical documents for functional requirements, impact analysis, technical design, and data flow diagrams.
    • Quick learner, up to date with industry trends, with excellent written and oral communication, analytical and problem-solving skills; a good team player who can also work independently and stay well organized.
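    As a rough illustration of the data quality analysis and validation work mentioned above, here is a minimal PySpark sketch that runs two simple checks (null rate on a required column and duplicate keys) against a hypothetical table. The path, column names, and failure thresholds are placeholders.

    ```python
    # Minimal PySpark data-quality check sketch -- table path and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

    df = spark.read.parquet("hdfs:///data/curated/customers/")  # placeholder path

    total = df.count()

    # Check 1: null rate for a required column.
    null_emails = df.filter(F.col("email").isNull()).count()

    # Check 2: duplicate primary keys.
    dup_keys = df.groupBy("customer_id").count().filter(F.col("count") > 1).count()

    print(f"rows={total}, null_emails={null_emails}, duplicate_customer_ids={dup_keys}")

    # Fail the job if more than 1% of emails are missing or any duplicate keys exist.
    if total and (null_emails / total > 0.01 or dup_keys > 0):
        raise ValueError("Data quality checks failed")

    spark.stop()
    ```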
    Apache Impala
    PySpark
    Sqoop
    Python Script
    Apache Hadoop
    Apache Spark
    SQL
    Python
    Apache Hive
    Apache Airflow
  • $29 hourly
    *Experience*
    • Hands-on experience upgrading HDP or CDH clusters to the Cloudera Data Platform Private Cloud [CDP Private Cloud].
    • Extensive experience installing, deploying, configuring, supporting, and managing Hadoop clusters using Cloudera (CDH) and HDP distributions hosted on Amazon Web Services (AWS) and Microsoft Azure.
    • Experience upgrading Kafka, Airflow, and CDSW.
    • Configured components such as HDFS, YARN, Sqoop, Flume, Kafka, HBase, Hive, Hue, Oozie, and Sentry.
    • Implemented Hadoop security.
    • Deployed production-grade Hadoop clusters and their components through Cloudera Manager/Ambari in virtualized environments (AWS/Azure cloud) as well as on-premises.
    • Configured HA for Hadoop services with backup and disaster recovery.
    • Set Hadoop prerequisites on Linux servers.
    • Secured clusters using Kerberos and Sentry, as well as Ranger and TLS.
    • Experience designing and building scalable infrastructure and platforms to collect and process very large amounts of structured and unstructured data.
    • Experience adding and removing nodes, monitoring critical alerts, configuring high availability, configuring data backups, and data purging.
    • Cluster management and troubleshooting across the Hadoop ecosystem.
    • Performance tuning and resolving Hadoop issues using the CLI, the Cloudera Manager UI, and the Apache web UIs.
    • Report generation for running nodes using various benchmark operations.
    • Worked with AWS services such as EC2 instances, S3, Virtual Private Cloud, and security groups, and with Microsoft Azure services such as resource groups, resources (VMs, disks, etc.), Azure Blob Storage, and Azure storage replication.
    • Configured private and public IP addresses, network routes, network interfaces, subnets, and virtual networks on AWS/Microsoft Azure.
    • Troubleshooting, diagnosing, performance tuning, and solving Hadoop issues.
    • Linux installation and administration.
    • Fault finding, analysis, and logging information for reports.
    • Expert in Kafka administration and deploying UI tools to manage Kafka.
    • Implementing HA for MySQL.
    • Installing/configuring Airflow for job orchestration.
    Apache Impala
    Apache Kafka
    Hortonworks
    Apache Hive
    Apache Airflow
    YARN
    Apache Hadoop
    Apache Zookeeper
    Apache Spark
    Cloudera
  • $25 hourly
    I work as a Cloudera Administrator in the telecommunications/financial industry, installing, configuring, and monitoring production clusters of around 15 to 18 nodes. I am available to install, configure, fix issues in, and tune your clusters.

    I have the following skills and experience:
    - Cloudera administration and Linux system administration on production servers (Red Hat 7.5 / RHEL)
    - Crontab scheduling and shell scripting, including scheduling Spark and Sqoop jobs via shell scripts
    - MySQL/MariaDB administration and user management, including MariaDB configuration for services and MySQL HA
    - HDFS, Impala, Hadoop, SQL, ETL, Teradata, RDBMS, NoSQL (MongoDB), data warehousing, SSRS, and data migration from one source to another
    - Cloudera Hadoop, Sqoop, Flume, HDFS, and other Big Data technologies
    - Performance monitoring of Impala, Spark, and Hive jobs
    - HDFS replication management and rebalancing HDFS data across all hosts
    - Enabling HA on master nodes, HDFS, and other services
    - Cluster upgrades, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration
    - Cluster installation from scratch; adding and configuring services; adding and removing hosts and DataNodes in a secure way, and adding and configuring new DataNodes
    - Cloudera Manager dashboards for navigating services, plus custom dashboards for service health and memory charts
    - Cloudera Manager user access management and resolving bad or concerning health issues
    - Cloudera Navigator configuration for audit logs
    - LDAP configuration on Cloudera Manager and Hue for business-user logins, along with Hue, Sentry, LDAP, and Linux user management and role assignment
    - Email alerts on bad service health
    - Enabling and disabling nodes for hardware maintenance
    - Configuring dedicated clusters for Kafka
    - Strong hands-on experience with Impala, Hive, HDFS, Spark, and YARN
    - Experience implementing and administering infrastructure, including performance tuning and troubleshooting Spark jobs
    - ELK use case: installed and configured Elasticsearch, Kibana, and Logstash on test and production environments, extracted logs using Grok patterns, created Kibana dashboards, and integrated ELK with the Hadoop cluster for fast performance

    I can do all of this at very reasonable cost. Feel free to discuss your project with me.
    Apache Impala
    Linux System Administration
    Informatica
    Big Data
    Hive Technology
    ETL Pipeline
    Apache Kafka
    Apache Hive
    Cluster Computing
    YARN
    Apache Hadoop
    Apache Spark
    Cloudera
    SQL
  • $90 hourly
    Hi. I've been working as a data professional for over 20 years. I have extensive, hands-on experience in all of the following areas:
    - database design
    - data exploration/mining
    - data engineering & pipeline development
    - data analysis, reporting & visualization

    I'm truly a one-stop shop for all your analytical/reporting needs. I am a SQL expert on several different database platforms (Redshift, Hive/Hadoop, Oracle, SQL Server, MySQL, Vertica, Teradata, and more). I'm a Tableau Desktop Certified Associate with over six years of extensive, hands-on development experience using this tool. You can view some of my sample Tableau work in the Tableau Public gallery under my name (see my Portfolio section for a link to this gallery). I am also an MS Excel guru with over 15 years of experience using this tool.

    I have exceptional communication and presentation skills. I am equally comfortable working with non-technical business teams and technical data engineering teams, and I can communicate well with both audiences. Finally, I'm an Upwork Top Rated Freelancer with a 100% Job Success Score. Please refer to my Upwork job history for details; I can also provide references for all of these projects.

    If you're looking for someone to help you build or enhance an existing reporting platform, or just need some help analyzing your existing data, I can help. Check out my LinkedIn profile for more details on my experience. Please note that my LinkedIn profile does NOT include my freelance project-based work.

    Thanks,
    Mark
    Apache Impala
    Dashboard
    Data Modeling
    Oracle Database
    Microsoft Excel
    Microsoft SQL Server Programming
    MySQL Programming
    Apache Hive
    Data Analysis
    Data Mining
    Database Design
    SQL
    Data Visualization
    Tableau

How it works

1. Post a job (it's free)

Tell us what you need. Provide as many details as possible, but don't worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you're excited about. Hire as soon as you're ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

How do I hire an Apache Impala Developer on Upwork?

You can hire an Apache Impala Developer on Upwork in four simple steps:

  • Create a job post tailored to your Apache Impala Developer project scope. We’ll walk you through the process step by step.
  • Browse top Apache Impala Developer talent on Upwork and invite them to your project.
  • Once the proposals start flowing in, create a shortlist of top Apache Impala Developer profiles and interview.
  • Hire the right Apache Impala Developer for your project from Upwork, the world’s largest work marketplace.

At Upwork, we believe talent staffing should be easy.

How much does it cost to hire an Apache Impala Developer?

Rates charged by Apache Impala Developers on Upwork can vary with a number of factors including experience, location, and market conditions. See hourly rates for in-demand skills on Upwork.

Why hire an Apache Impala Developer on Upwork?

As the world's work marketplace, we connect highly skilled freelance Apache Impala Developers and businesses and help them build trusted, long-term relationships so they can achieve more together. Let us help you build the dream Apache Impala Developer team you need to succeed.

Can I hire an Apache Impala Developer within 24 hours on Upwork?

Depending on availability and the quality of your job post, it's entirely possible to sign up for Upwork and receive Apache Impala Developer proposals within 24 hours of posting a job description.
