Hire the best Apache Spark Engineers in India
Check out Apache Spark Engineers in India with the skills you need for your next job.
- $35 hourly
- 5.0/5
- (12 jobs)
I have 18+ years of experience in software development in the Telecom, Banking, and Healthcare domains. My primary skill set covers Big Data ecosystems (Apache Spark, Hive, MapReduce, Cassandra), Scala, Core Java, Python, and C++. I am well versed in designing and implementing Big Data solutions, ETL and data pipelines, and serverless and event-driven architectures on Google Cloud Platform (GCP) and Cloudera Hadoop 5.5. I like to work with organizations to develop sustainable, scalable, and modern data-oriented software systems.
- Keen eye for scalability and sustainability of the solution
- Can quickly produce maintainable, well-structured object-oriented designs
- Highly experienced in working effectively with remote teams
- Aptitude for recognizing business requirements and solving the root cause of the problem
- Can quickly learn new technologies
Sound experience in the following technology stacks:
Big Data: Apache Spark, Spark Streaming, HDFS, Hadoop MR, Hive, Apache Kafka, Cassandra, Google Cloud Platform (Dataproc, Cloud Storage, Cloud Functions, Datastore, Pub/Sub), Cloudera Hadoop 5.x
Languages: Scala, Python, Java, C++, C; Scala with the Akka and Play frameworks
Build Tools: sbt, Maven
Databases: Postgres, Oracle, MongoDB/CosmosDB, Cassandra, Hive
GCP Services: GCS, Dataproc, Cloud Functions, Pub/Sub, Datastore, BigQuery
AWS Services: S3, VM, VM Auto Scaling Group, EMR, S3 Java APIs, Redshift
Azure Services: Blob, VM, VM scale set, Blob Java APIs, Synapse
Other Tools/Technologies: Docker, Terraform
Input & storage formats worked with: CSV, XML, JSON, MongoDB, Parquet, ORC
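To illustrate the MapReduce pattern this profile mentions, here is a minimal pure-Python sketch of the classic word count — the map phase emits (word, 1) pairs and the reduce phase sums them by key. This is an illustrative analogue, not actual Spark or Hadoop code; the sample lines and function names are invented.

```python
from collections import Counter
from itertools import chain

def map_phase(lines):
    # "map": emit a (word, 1) pair for every word in every line
    return chain.from_iterable(
        ((word.lower(), 1) for word in line.split()) for line in lines
    )

def reduce_phase(pairs):
    # "reduce by key": sum the counts for each distinct word
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["spark makes big data simple", "big data big results"]
result = reduce_phase(map_phase(lines))
print(result["big"])  # 3
```

In Spark the same shape becomes `rdd.flatMap(...).map(lambda w: (w, 1)).reduceByKey(add)`, with the framework handling partitioning and shuffling across the cluster.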
Apache Spark, C++, Java, Scala, Apache Hadoop, Python, Apache Cassandra, Oracle PLSQL, Apache Hive, Cloudera, Google Cloud Platform
- $35 hourly
- 5.0/5
- (38 jobs)
Highly skilled Data Engineer with diverse experience in the following areas:
✅ Data analysis and ETL solution expertise
✅ Snowflake DB expertise (developer)
✅ DBT setup, administration, and development on both DBT Cloud and DBT Core
✅ Azure Data Factory
✅ SharePoint and OneDrive integration using the Microsoft Graph API
✅ Airflow workflow / DAG development
✅ Matillion ETL
✅ Talend ETL expert: integration, Java routines, data quality
✅ Salesforce integration
✅ Google Cloud Platform: Cloud Function, Cloud Run, Dataproc, Pub/Sub, BigQuery
✅ AWS: S3, Lambda, EC2, Redshift
✅ Cloud migration: working with bulk data and generic code
✅ Python automation and API integration
✅ SQL reporting
✅ Data quality analysis and data governance solution architecture design
✅ Data validation using Great Expectations (Python tool)
P.S. Available to work US (EST) hours on demand.
I have good exposure to data integration, migration, transformation, cleansing, warehouse design, SQL, functions, and procedures.
- Databases: Snowflake, Oracle, PostgreSQL, BigQuery
- ETL Tools: Azure Data Factory, Matillion, Talend Data Fabric with Java
- DB languages and tools: SQL, SnowSQL, DBT (Data Build Tool)
- Workflow management tool: Airflow
- Scripting language: Python
- Python frameworks: Pandas, Spark, Great Expectations
- Cloud ecosystems: AWS, GCP
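Column-level data validation of the kind this profile does with Great Expectations can be sketched in a few lines of pure Python. This is NOT the Great Expectations API — the function names, sample rows, and result shape here are invented for illustration of the idea: each expectation returns a success flag plus a count of failing rows.

```python
def expect_column_values_not_null(rows, column):
    # expectation: no row may have a missing value in `column`
    failures = [r for r in rows if r.get(column) is None]
    return {"success": not failures, "unexpected_count": len(failures)}

def expect_column_values_between(rows, column, low, high):
    # expectation: every value in `column` falls inside [low, high]
    failures = [r for r in rows if not (low <= r[column] <= high)]
    return {"success": not failures, "unexpected_count": len(failures)}

rows = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": 80.0},
    {"id": 3, "amount": None},
]

null_check = expect_column_values_not_null(rows, "amount")
valid = [r for r in rows if r["amount"] is not None]
range_check = expect_column_values_between(valid, "amount", 0, 1000)
print(null_check)   # {'success': False, 'unexpected_count': 1}
print(range_check)  # {'success': True, 'unexpected_count': 0}
```

Great Expectations packages this same pattern into declarative "expectation suites" that run against Pandas, Spark, or SQL backends and produce data-quality reports.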
Apache Spark, PySpark, Microsoft Azure, dbt, Apache Hadoop, Google Cloud Platform, ETL, Talend Data Integration, Snowflake, AWS Lambda, API Integration, JavaScript, Amazon Web Services, Python, Apache Airflow
- $35 hourly
- 5.0/5
- (3 jobs)
════ Who Am I? ════
Hi, nice to meet you! I'm Ajay, a Tableau and SQL specialist, Business Intelligence developer, and Data Analyst with half a decade of experience working with data. For the last few years I've been helping companies all over the globe achieve their data goals and making friends on the journey. If you're looking for someone who can understand your needs, collaboratively develop the best solution, and execute a vision, you have found the right person! Looking forward to hearing from you!
═════ What do I do? (Services) ═════
✔️ Tableau report development & maintenance
- Pull data from SQL Servers, Excel files, Hive, etc.
- Clean and transform data
- Model relationships
- Calculate and test measures
- Create and test charts and filters
- Build user interfaces
- Publish reports
✔️ SQL
- Built out data and reporting infrastructure from the ground up using Tableau and SQL to provide real-time insights into product and business KPIs
- Identified procedural areas of improvement through customer data, using SQL to help improve the profitability of a program by 7%
- Converted Hive/SQL queries into Spark transformations using Spark RDDs and Scala
═════ How do I work? (Method) ═════
1️⃣ First, we need a plan; I will listen, take notes, analyze and discuss your goals, how to achieve them, and determine costs, development phases, and the time involved to deliver the solution.
2️⃣ Clear and frequent communication; I provide frequent project updates and will be available to discuss important questions that come up along the way.
3️⃣ Stick to the plan; I will deliver, on time, what we agreed upon. If any unforeseen delay happens, I will promptly let you know and provide a new delivery date.
4️⃣ Deliver a high-quality product. My approach aims to deliver the most durable, secure, scalable, and extensible product possible. All development includes testing, documentation, and demo meetings.
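Converting a Hive/SQL query into RDD-style transformations, as this profile describes, means re-expressing a declarative aggregation as map and reduce-by-key steps. A hedged pure-Python sketch (not Spark/Scala; the table, column names, and sample data are invented) of a simple GROUP BY:

```python
#   SELECT dept, SUM(salary) FROM employees GROUP BY dept
employees = [("eng", 100), ("eng", 120), ("sales", 90)]

def reduce_by_key(pairs, fn):
    # fold the values for each key, like Spark's rdd.reduceByKey(fn)
    out = {}
    for key, value in pairs:
        out[key] = fn(out[key], value) if key in out else value
    return out

totals = reduce_by_key(employees, lambda a, b: a + b)
print(totals)  # {'eng': 220, 'sales': 90}
```

In Spark/Scala the equivalent would be roughly `rdd.map(r => (r.dept, r.salary)).reduceByKey(_ + _)`, with the shuffle distributing each key's values to one partition.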
Apache Spark, Apache Hive, Python Script, Scala, Machine Learning, Hive, SQL Programming, Business Intelligence, Microsoft Excel, Microsoft Power BI, Tableau, SQL, Python
- $25 hourly
- 5.0/5
- (54 jobs)
* Experienced in Microsoft Fabric and Azure Synapse, with DP-500, DP-600, and DP-700 certifications. Highly skilled Power BI expert with Microsoft certifications (PL-100, PL-300), adept in data analysis, modeling, and visualization.
* Proficient in M language (Power Query), DAX, T-SQL, and Microsoft Power Platform tools, including Power Apps and Power Automate.
* Extensive experience in managing Power BI infrastructure, including the on-premises gateway and Premium capacity.
* Expertise in implementing monitoring systems for Power BI infrastructure and optimizing report performance.
* Dedicated to delivering high-quality results, meeting project deadlines, and exceeding client expectations.
* Seeking freelance opportunities to leverage my skills and certifications in Power BI, Microsoft Fabric, and Azure Synapse to drive actionable insights and business success.
Apache Spark, Databricks Platform, Microsoft Azure SQL Database, Microsoft PowerApps, Microsoft Power Automate, Microsoft SQL Server Reporting Services, Power Query, Business Intelligence, Microsoft Power BI, SQL, Python, Microsoft Excel
- $25 hourly
- 4.9/5
- (7 jobs)
Hello, I’m Aditya Johar, a Data Scientist and Full Stack Developer with 9+ years of experience delivering innovative, tech-driven solutions. I focus on identifying areas where technology can reduce manual tasks, streamline workflows, and optimize resources. By implementing smart automation solutions tailored to your specific needs, I can help your business cut costs, improve efficiency, and free up valuable time for more strategic, growth-focused initiatives.
---------------------------------TOP SOLUTIONS DEVELOPED---------------------------------
✅ Custom Software using Python (Django, Flask, FastAPI), MERN/MEAN/MEVN stacks
✅ Interactive Data Visualization Dashboards: Power BI, Tableau, ETL, etc.
✅ Intelligent Document Processing (IDP), RAG, LLMs, ChatGPT APIs
✅ NLP: Sentiment Analysis, Text Summarization, Chatbots, and Language Translation
✅ COMPUTER VISION: Image and Video Classification, Object Detection, Face Recognition, Medical Image Analysis
✅ RECOMMENDATION SYSTEMS: Product Recommendations (e.g., e-commerce), Content Recommendations (e.g., streaming services), Personalized Marketing
✅ PREDICTIVE ANALYTICS: Sales and Demand Forecasting, Customer Churn Prediction, Stock Price Prediction, Equipment Maintenance Prediction
✅ E-COMMERCE OPTIMIZATION: Dynamic Pricing, Inventory Management, Customer Lifetime Value Prediction
✅ TIME SERIES ANALYSIS: Financial Market Analysis, Energy Consumption Forecasting, Weather Forecasting
✅ SPEECH RECOGNITION: Virtual Call Center Agents, Voice Assistants (e.g., Siri, Alexa)
✅ AI IN FINANCE: Credit Scoring, Algorithmic Trading, Fraud Prevention
✅ AI IN HR: Candidate Screening, Employee Performance Analysis, Workforce Planning
✅ CONVERSATIONAL AI: Customer Support Chatbots, Virtual Shopping Assistants, Voice Interfaces
✅ AI IN EDUCATION: Personalized Learning Paths, Educational Chatbots, Plagiarism Detection
✅ AI IN MARKETING: Customer Segmentation, Content Personalization, A/B Testing
✅ SUPPLY CHAIN OPTIMIZATION: Demand Forecasting, Inventory Optimization, Route Planning
And many more use cases that we can discuss when we connect. "Ready to turn these possibilities into realities? I'm just a click away! Simply click the 'Invite to Job' or 'Hire Now' button in the top right corner of your screen."
Apache Spark, Django, Apache Airflow, Apache Hadoop, Terraform, PySpark, Apache Kafka, Flask, BigQuery, BERT, Python Scikit-Learn, pandas, Python, TensorFlow, Data Science
- $15 hourly
- 5.0/5
- (5 jobs)
Data Engineering || Azure || Azure Functions || SQL || PySpark || Python || ETL || Data Modeling || Data Warehousing || MQTT || Node-RED || InfluxDB || Grafana || Kubernetes
Apache Spark, Data Analytics, Data Warehousing, Data Ingestion, Python, Database, Big Data, CI/CD, Microsoft Azure, Microsoft Azure SQL Database, Data Analysis, Data Lake, Azure Cosmos DB
- $25 hourly
- 5.0/5
- (84 jobs)
10+ years of experience in full-stack and full-service data engineering / data visualization / data science / AI-ML with enterprise clients such as Walmart, Procter & Gamble, Amazon, Johnson & Johnson, etc., as well as SMEs like TheCreditPros, StructuredWeb, NorthCoastMedical, PoshPeanuts, EffectiveSpend, etc.
Domain Experience:
• Retail and E-commerce
• Banking and Financial Services
• Telecom
• Sports & Gaming
• Operations / ERP / CRM Analytics
Tools Expertise:
Data Engineering/ETL/Data Pipelines/Data Warehousing:
• Talend Studio, Stitchdata, Denodo, Fivetran, CloverDX
• AWS (Glue, RDS, Redshift, Step Functions, Lambda)
• Azure (ADF, Data Lake, ADLS Gen2)
• GCP (Cloud Composer, Cloud Functions, BigQuery)
• SQL, MongoDB, DBT
• ETL through REST and SOAP APIs for Salesforce, Netsuite, Fulfil, Pardot, Facebook, LinkedIn, Twitter, Instagram, Google Adwords, Yahoo Gemini, Bing Ads, Google Analytics, Zendesk, Mailchimp, Zoho, Five9, etc.
• Data Streaming (Apache Spark, Flink, Flume, Kafka)
• API/Webhook design through Python FastAPI + Uvicorn
• Twilio, Asterisk
Data Visualization/Business Intelligence:
• Power BI (Pro, Premium, Embedded, Report Server; DAX/Power Query M)
• Tableau (Prep, Cloud, Server, AI, Pulse; Functions/LOD Expressions)
• Looker (Studio, Pro, LookML)
• Qlik Sense
• DOMO
Voicebots-Chatbots:
• NLP - LLM (Natural Language Processing - Large Language Models)
• ChatGPT
• Deepgram
• Langchain
• Llama2
• Falcon
• deBerta
• T5
• Bert
Speech Engineering Tools/Techniques:
• Kaldi / Speechbrain / Whisper / Nvidia Riva / EspNet / Bark
• AWS-GCP-Azure ASR & TTS
• Amazon AWS Polly, Transcribe, Translate
• Automated Speech Recognition
• Speaker Diarization
• Wake Word Detection
• Speech Biometrics
• Intent Recognition
• Speaker Separation
Apache Spark, Talend Open Studio, Snowflake, Amazon Redshift, BigQuery, Qlik Sense, dbt, AWS Glue, Microsoft Azure, QlikView, Automatic Speech Recognition, SQL, Tableau, Microsoft Power BI, Databricks Platform
- $20 hourly
- 5.0/5
- (40 jobs)
I have over 10 years of expertise in data engineering, data architecture, ETL development, and business intelligence (BI). My proficiency spans end-to-end data pipeline architecture, advanced data modelling, performance tuning, and optimizing high-volume, distributed data systems using tools such as Talend, Informatica, Apache NiFi, Apache Spark, Hadoop, Snowflake, Apache Airflow, Luigi, and AWS Step Functions, leveraging Apache Kafka for real-time data streaming. I have extensive experience working within rigorous compliance frameworks, including CMMI Level 5, SOC 2, HIPAA, GDPR, ISO 27001, and PCI DSS, ensuring the highest levels of data governance, security, and privacy. My strong command of Python, coupled with libraries such as Pandas, NumPy, SciPy, Dask, and PySpark, enables efficient manipulation and analysis of massive datasets. I have a proven track record of building data lakes and data warehouses on cloud platforms like AWS, Google Cloud Platform (GCP), and Azure, ensuring high availability, scalability, and performance optimization. Additionally, my experience with containerization and DevOps practices using Docker, Kubernetes, and Jenkins enhances my ability to deliver scalable, production-grade data solutions that seamlessly integrate with CI/CD pipelines. In L2 and L3 support roles, I have successfully handled mission-critical issues, ensuring minimal downtime and system stability in highly sensitive, high-availability environments. A notable achievement includes leading the data migration during a telecom company acquisition, ensuring secure data transfers, maintaining data fidelity, and minimizing operational disruptions.
- Cloud Platforms: AWS (S3, Redshift, Lambda, EMR, Glue), GCP (BigQuery, Dataflow), Azure (Data Lake, Synapse)
- Data Engineering Tools: Apache Spark, Hadoop, Snowflake, Kafka, Airflow, Luigi, AWS Step Functions
- Big Data Ecosystem: HDFS, Hive, HBase, Presto, Flink, Drill, Elasticsearch
- ETL Tools: Talend, Informatica, Apache NiFi, AWS Glue, Fivetran
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, Oracle, DynamoDB, Redis
- Data Warehousing: Redshift, BigQuery, Snowflake, Azure Synapse Analytics
- BI Tools: Tableau, Power BI, Looker, QlikView
- Python Libraries: Pandas, NumPy, SciPy, Dask, PySpark, Matplotlib, Seaborn, Scikit-learn, TensorFlow, Keras
- Programming Languages: Python, SQL, Java, Scala, Shell Scripting
- DevOps Tools: Docker, Kubernetes, Jenkins, Terraform, Ansible
- Data Visualization: Tableau, Power BI, Looker Studio, Matplotlib, Plotly
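The extract-transform-load workflow at the heart of profiles like this one can be sketched end to end with only the standard library, standing in for the heavier stacks listed above. The CSV payload, table name, and cleaning rule below are invented for the example: extract from a CSV source, drop unparseable rows, load into a queryable store.

```python
import csv
import io
import sqlite3

# extract: a CSV source (inlined here; normally a file, API, or bucket)
raw = "id,amount\n1,10.5\n2,bad\n3,4.0\n"

# transform: parse records, dropping rows whose fields fail type checks
rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    try:
        rows.append((int(rec["id"]), float(rec["amount"])))
    except ValueError:
        continue  # "bad" is not a float -> row rejected

# load: insert the clean rows into a warehouse stand-in (in-memory SQLite)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 14.5
```

Production pipelines swap each stage for a managed equivalent (S3/ADLS for extract, Spark or dbt for transform, Redshift/BigQuery/Snowflake for load) while keeping this same three-stage shape.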
Apache Spark, Tableau, Microsoft Power BI, AWS Glue, Snowflake, Python, Data Science Consultation, AI Development, AI Consulting, Data Analytics & Visualization Software, ETL Pipeline, Data Warehousing, Data Lake, Data Analytics, Data Engineering
- $40 hourly
- 3.0/5
- (10 jobs)
As a Senior Data Engineer with 9 years of extensive experience in data engineering with Python, Spark, Databricks, ETL pipelines, and Azure and AWS services, I develop PySpark scripts and store data in ADLS using Azure Databricks. Additionally, I have created data pipelines for reading streaming data from MongoDB and developed Neo4j graphs based on stream-based data. I am well versed in designing and modeling databases using Neo4j and MongoDB. I am seeking a challenging opportunity in a dynamic organization that can enhance my personal and professional growth while enabling me to make valuable contributions towards achieving the company's objectives.
• Utilizing Azure Databricks to develop PySpark scripts and store data in ADLS.
• Developing producers and consumers for stream-based data using Azure Event Hub.
• Designing and modeling databases using Neo4j and MongoDB.
• Creating data pipelines for reading streaming data from MongoDB.
• Creating Neo4j graphs based on stream-based data.
• Visualizing data for supply-demand analysis using Power BI.
• Developing data pipelines on Azure to integrate Spark notebooks.
• Developing ADF pipelines for a multi-environment and multi-tenant application.
• Utilizing ADLS and Blob storage to store and retrieve data.
• Proficient in Spark, HDFS, Hive, Python, PySpark, Kafka, SQL, Databricks, and Azure and AWS technologies.
• Utilizing AWS EMR clusters to execute Hadoop ecosystems such as HDFS, Spark, and Hive.
• Experienced in using AWS DynamoDB for data storage and caching data on ElastiCache.
• Involved in data migration projects that move data from SQL and Oracle to AWS S3 or Azure storage.
• Skilled in designing and deploying dynamically scalable, fault-tolerant, and highly available applications on the AWS cloud.
• Executed transformations using Spark and MapReduce, loaded data into HDFS, and utilized Sqoop to extract data from SQL into HDFS.
• Proficient in working with Azure Data Factory, Azure Data Lake, Azure Databricks, Python, Spark, and PySpark.
• Implemented a cognitive model for telecom data using NLP and a Kafka cluster.
• Competent in big data processing utilizing Hadoop, MapReduce, and HDFS.
Apache Spark, Microsoft Azure SQL Database, SQL, MongoDB, Data Engineering, Microsoft Azure, Apache Kafka, Apache Hadoop, AWS Glue, PySpark, Databricks Platform, Hive Technology, Azure Cosmos DB, Apache Hive, Python
- $35 hourly
- 5.0/5
- (11 jobs)
Shaikh is an experienced Certified Cloud Data Engineer with over three years of expertise in designing end-to-end ETL pipelines. He is passionate about unlocking the value of data and believes in its power to drive business growth. His skills are rooted in his experience working with Google Cloud Platform (GCP). Shaikh can help you leverage GCP services such as BigQuery, Bigtable, Data Studio (now Looker Studio), Cloud Functions, Cloud Storage, Cloud Scheduler, Scheduled Queries, Cloud SQL, Dataflow, Datafusion, and more. His expertise can empower your organization to efficiently manage and analyze large datasets, improve data-driven decision-making, and derive valuable insights.
Apache Spark, ETL Pipeline, Apache Beam, Google Analytics, Microsoft PowerPoint, BigQuery, Databricks Platform, Google Cloud Platform, Apache NiFi, Snowflake, SQL, Google Sheets, Python
- $70 hourly
- 4.7/5
- (7 jobs)
I have 12+ years of hands-on experience in Big Data technology built on the Scala framework, and I have designed architectures for several projects. I'm also proficient in Akka and Spark, and these technologies are currently a priority in my area of interest. Experienced across multiple domains.
My skill set includes Scala, AWS, and Angular 2+.
Scala stack:
- Akka
- Play Framework
- Spray (Akka IO)
- Spark
- Play
Env tools:
- PostgreSQL / MongoDB / ElasticSearch
- RabbitMQ
- Kafka
Front-end technologies:
- Angular 2+
I am GCP certified and have hands-on experience with multiple cloud providers, including GCP, AWS, and OpenShift. I have been working with AWS services for several years, and my recent projects include building Terraform modules for CI/CD pipelines with AWS CodePipeline, as well as:
1. Designing and developing software applications
2. Automating and managing infrastructure with Terraform
3. Deploying and operating applications in the cloud
4. Working with a variety of cloud technologies, including AWS, GCP, and OpenShift
I am passionate about security and cost optimization, and I believe this sets me apart from other consultants. I am always looking for ways to improve the security of my clients' systems and reduce their costs. I am a highly skilled and experienced IT professional with a passion for cloud computing, and I am confident that I have the skills and knowledge necessary to be a valuable asset to your team. My focus is on building practical business solutions and looking for opportunities to expand my knowledge and take part in projects that present a challenge. Dedicated to exceeding your expectations with the highest-quality solutions, delivered on time and to your precise needs.
Apache Spark, Angular 4, Scala, Akka, Apache Cassandra, Apache Kafka, MongoDB
- $30 hourly
- 4.6/5
- (22 jobs)
Highly experienced in the IT sector in lead roles (IT 10+ years, after a Master of Computer Applications (MCA)).
Strengths:
* Excellent code developer
* Solution leader
* Problem detecting & solving enthusiast
* Proactive in making suggestions to the client and committed to my word
* Driven to the core for speed, optimization, "bug cleaning", and the scalability of projects
* With me, your company will get creativity and an extra edge over the competition
General experience:
* 7 years of experience as a software developer
* 3 years of experience as a senior developer
* 2 years of experience as a team leader
Skills: Java, PHP, Angular, Vue, React, WordPress, Laravel, Hadoop
* Master of Computer Applications
* Full-stack knowledge of the industry
Companies and projects:
* Samsung - Team Lead
* RBS (Royal Bank of Scotland) - Team Lead
* NCR Corporation - Team Lead
* Accenture - Developer
* Honda Insurance - Sr. Developer
Apache Spark, Oracle Database, Agile Software Development, Apache Hive, Hibernate, MongoDB, Scrum, Apache Hadoop, J2EE, Machine Learning, Git, Apache Struts 2, Web Service, Apache Kafka, Spring Framework
- $25 hourly
- 5.0/5
- (10 jobs)
Highly creative and multi-talented Data Scientist with extensive experience in scraping, data cleaning, and visualization. Exceptional collaborative and interpersonal skills; dynamic team player with well-developed written and verbal communication abilities. Highly skilled in client and vendor relations and negotiations; talented at building and maintaining “win-win” partnerships. Passionate and inventive creator of innovative marketing strategies and campaigns; accustomed to performing in deadline-driven environments with an emphasis on working within budget requirements.
Apache Spark, SEO Keyword Research, Mathematical Modeling, WordPress, SEO Backlinking, Data Analysis, Visual Basic for Applications, Statistical Computing, SEO Audit, Flask, Data Science, Machine Learning, Data Mining, pandas, Python, Chatbot
- $80 hourly
- 5.0/5
- (28 jobs)
𝟭𝟬+ 𝘆𝗲𝗮𝗿𝘀 𝗼𝗳 𝗘𝘅𝗽𝗲𝗿𝗶𝗲𝗻𝗰𝗲 | 𝗘𝘅𝗽𝗲𝗿𝘁-𝗩𝗲𝘁𝘁𝗲𝗱 (𝗧𝗼𝗽 𝟭%) 𝗳𝗿𝗲𝗲𝗹𝗮𝗻𝗰𝗲𝗿 | 𝗪𝗼𝗿𝗸𝗲𝗱 𝘄𝗶𝗵 𝗚𝗼𝗹𝗱𝗺𝗮𝗻 𝗦𝗮𝗰𝗵𝘀, 𝗠𝗼𝗿𝗴𝗮𝗻 𝗦𝘁𝗮𝗻𝗹𝗲𝘆, 𝗞𝗠𝗣𝗚, 𝗢𝗿𝗮𝗰𝗹𝗲 𝗲𝘁𝗰. I take pride in maintaining a 𝗽𝗲𝗿𝗳𝗲𝗰𝘁 𝗿𝗲𝗰𝗼𝗿𝗱 𝗼𝗳 𝟱-𝘀𝘁𝗮𝗿 𝗿𝗮𝘁𝗶𝗻𝗴𝘀 𝗮𝗰𝗿𝗼𝘀𝘀 𝗮𝗹𝗹 𝗽𝗿𝗼𝗷𝗲𝗰𝘁𝘀. My expertise is strongly backed by 𝗳𝘂𝗹𝗹-𝘀𝘁𝗮𝗰𝗸 𝗱𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 and 𝗰𝗹𝗼𝘂𝗱 𝗱𝗮𝘁𝗮 𝗲𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴 𝘀𝗸𝗶𝗹𝗹𝘀, honed through work with leading institutions. With over 10+ years of experience in Data Engineering and Programming, I bring a commitment to excellence and a passion for perfection in every project I undertake. My approach is centered around delivering not just functional, but 𝗵𝗶𝗴𝗵𝗹𝘆 𝗲𝗳𝗳𝗶𝗰𝗶𝗲𝗻𝘁 𝗮𝗻𝗱 𝗼𝗽𝘁𝗶𝗺𝗶𝘇𝗲𝗱 code, ensuring top-quality outputs that consistently impress my clients. My expertise combined with extensive experience on both GCP and AWS Cloud platforms, allows me to provide solutions that are not only effective but also innovative and forward-thinking. I believe in going beyond the basics, striving for excellence in every aspect of my work, and delivering results that speak for themselves. 𝗖𝗵𝗼𝗼𝘀𝗲 𝗺𝗲 𝗶𝗳 𝘆𝗼𝘂 𝗽𝗿𝗶𝗼𝗿𝗶𝘁𝗶𝘇𝗲 𝘁𝗼𝗽-𝗻𝗼𝘁𝗰𝗵 𝗾𝘂𝗮𝗹𝗶𝘁𝘆 𝗶𝗻 𝘆𝗼𝘂𝗿 𝗽𝗿𝗼𝗷𝗲𝗰𝘁𝘀 𝗮𝗻𝗱 𝗮𝗽𝗽𝗿𝗲𝗰𝗶𝗮𝘁𝗲 𝗮 𝗳𝗿𝗲𝗲𝗹𝗮𝗻𝗰𝗲𝗿 𝘄𝗵𝗼 𝗮𝘂𝘁𝗼𝗻𝗼𝗺𝗼𝘂𝘀𝗹𝘆 𝗺𝗮𝗸𝗲𝘀 𝗼𝗽𝘁𝗶𝗺𝗮𝗹 𝗱𝗲𝗰𝗶𝘀𝗶𝗼𝗻𝘀, 𝘀𝗲𝗲𝗸𝗶𝗻𝗴 𝗰𝗹𝗮𝗿𝗶𝗳𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀 𝗼𝗻𝗹𝘆 𝘄𝗵𝗲𝗻 𝗮𝗯𝘀𝗼𝗹𝘂𝘁𝗲𝗹𝘆 𝗻𝗲𝗰𝗲𝘀𝘀𝗮𝗿𝘆. ❝ 𝗥𝗲𝗰𝗼𝗴𝗻𝗶𝘇𝗲𝗱 𝗮𝘀 𝗨𝗽𝘄𝗼𝗿𝗸'𝘀 𝗧𝗼𝗽 𝟭% 𝗧𝗮𝗹𝗲𝗻𝘁 𝗮𝗻𝗱 𝗮𝗻 𝗲𝘅𝗽𝗲𝗿𝘁-𝘃𝗲𝘁𝘁𝗲𝗱 𝗽𝗿𝗼𝗳𝗲𝘀𝘀𝗶𝗼𝗻𝗮𝗹 ❞ 𝗔𝗿𝗲𝗮𝘀 𝗼𝗳 𝗘𝘅𝗽𝗲𝗿𝘁𝗶𝘀𝗲: - 𝗖𝗹𝗼𝘂𝗱: GCP (Google Cloud Platform), AWS (Amazon Web Services) - 𝗣𝗿𝗼𝗴𝗿𝗮𝗺𝗺𝗶𝗻𝗴 𝗟𝗮𝗻𝗴𝘂𝗮𝗴𝗲: Java, Scala, Python, Ruby, HTML, Javascript - 𝗗𝗮𝘁𝗮 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴: Spark, Kafka, Crunch, MapReduce, Hive, HBase, AWS Glue, PySpark, BiqQuery, Snowflake, ETL, Datawarehouse, Databricks, Data Lake, Airflow, Cloudwatch 𝗖𝗹𝗼𝘂𝗱 𝗧𝗼𝗼𝗹𝘀: AWS Lambda, Cloud Functions, App Engine, Cloud Run, Datastore, EC2, S3, - 𝗗𝗲𝘃𝗢𝗽𝘀: GitHub, GitLab. BitBucket, CHEF, Docker, Kubernetes, Jenkins, Cloud Deploy, Cloud Build, - 𝗪𝗲𝗯 & 𝗔𝗣𝗜: SpringBoot, Jersey, Flask, HTML & JSP, ReactJS, Django 𝗥𝗲𝘃𝗶𝗲𝘄𝘀: ❝ Amar is a highly intelligent and experienced individual who is exceeding expectations with his service. 
He has very deep knowledge across the entire field of data engineering and is a very passionate individual, so I am extremely happy to have finished my data engineering project with such a responsible, fantastic guy. I was able to complete my project faster than anticipated. Many thanks.... ❞ ❝ Amar is an exceptional programmer that is hard to find on Upwork. He combines top-notch technical skills in Python & Big Data, an excellent work ethic, communication skills, and strong dedication to his projects. Amar systematically works to break down complex problems, plan an approach, and implement thought-out, high-quality solutions. I would highly recommend Amar! ❞ ❝ Amar is a fabulous developer. He is fully committed. He is not a clock watcher. Technically very, very strong. His Java and Python skills are top-notch. What I really like about him is his attitude of taking a technical challenge personally and putting in a lot of hours to solve that problem. Best yet, he does not charge the client for all those hours; he still sticks to the agreement. Very professional. It was a delight working with him, and I will reach out to him if I have a Java or Python task. ❞ With 10+ years of experience and recognition as an Expert-Vetted (Top 1%) freelancer, I’ve delivered exceptional results for top organizations like Goldman Sachs, Morgan Stanley, and KPMG. I’m confident I can be the perfect fit for your project—let’s connect to discuss how I can help achieve your goals!
Apache Spark, API Development, Flask, Google App Engine, Software Development, Big Data, Google Cloud Platform, Amazon Web Services, BigQuery, PySpark, Apache Airflow, Data Engineering, SQL, Python, Java
- $50 hourly
- 5.0/5
- (3 jobs)
Hands-on experience in developing Analytics, Machine Learning, Data Science, Big Data, and AWS solutions.
Apache Spark, Apache Cordova, Cloud Services, Analytics, PySpark, Data Science, Python, Machine Learning
- $45 hourly
- 5.0/5
- (10 jobs)
I have 7+ years of experience as a software engineer. I love working with big data, building ML pipelines, and data engineering in general. I have designed and developed scalable cloud services on Azure as well as AWS. I also contribute to open-source projects (I have co-authored a Python package called nbQA). I am open to exploring exciting long-term projects. My hourly rate can vary depending on the project. If you feel my skills match your requirements, feel free to invite me to the job.
Languages
=========
Python, Scala, C#, TypeScript, Bash
Frameworks
=========
Apache Spark, Scikit-learn, Pandas, Keras, TensorFlow, FastAPI, spaCy, Scrapy
Cloud
=====
AWS Lambda, AWS Fargate, AWS SageMaker, Azure Functions, Azure Data Factory, Azure Databricks, and various other Azure services
MISC
====
ElasticSearch, Postgres, Redis, Kafka, Docker, Docker Swarm
I am experienced in mentoring engineers to pick up and apply new technology stacks. Feel free to reach out to me if you need mentoring or help with any of the technologies listed above.
Apache Spark, Keras, AWS Lambda, Microsoft Azure, Natural Language Processing, Flask, TensorFlow, Bash Programming, Machine Learning, Snowflake, Elasticsearch, Apache Kafka, Python, Scala
- $35 hourly
- 5.0/5
- (32 jobs)
Seasoned data engineer with over 11 years of experience in building sophisticated and reliable ETL applications using Big Data and cloud stacks (Azure and AWS). TOP RATED PLUS. Collaborated with over 20 clients, accumulating more than 2,000 hours on Upwork.
🏆 Expert in creating robust, scalable, and cost-effective solutions using Big Data technologies for the past 9 years.
🏆 The main areas of expertise are:
📍 Big Data - Apache Spark, Spark Streaming, Hadoop, Kafka, Kafka Streams, Trino, HDFS, Hive, Solr, Airflow, Sqoop, NiFi, Flink
📍 AWS Cloud Services - AWS S3, AWS EC2, AWS Glue, AWS RedShift, AWS SQS, AWS RDS, AWS EMR
📍 Azure Cloud Services - Azure Data Factory, Azure Databricks, Azure HDInsights, Azure SQL
📍 Google Cloud Services - GCP DataProc
📍 Search Engine - Apache Solr
📍 NoSQL - HBase, Cassandra, MongoDB
📍 Platform - Data Warehousing, Data Lake
📍 Visualization - Power BI
📍 Distributions - Cloudera
📍 DevOps - Jenkins
📍 Accelerators - Data Quality, Data Curation, Data Catalog
Apache Spark, SQL, AWS Glue, PySpark, Apache Cassandra, ETL Pipeline, Apache Hive, Apache NiFi, Apache Kafka, Big Data, Apache Hadoop, Scala
- $60 hourly
- 5.0/5
- (7 jobs)
Senior Software Engineer with 7 years of experience in functional programming, machine learning, AI, and Big Data. I also have front-end experience building websites and tools.
Apache Spark, Functional Programming, React, Big Data, Apache Kafka, Akka, Apache Cassandra, Amazon DynamoDB, Databricks Platform, Machine Learning, Python, Scala, JavaScript
- $65 hourly
- 4.9/5
- (94 jobs)
✅ 𝗘𝘅𝗽𝗲𝗿𝘁-𝗩𝗲𝘁𝘁𝗲𝗱 𝗧𝗮𝗹𝗲𝗻𝘁 𝗘𝗧𝗟 𝗙𝗿𝗲𝗲𝗹𝗮𝗻𝗰𝗲𝗿 𝗼𝗻 𝗨𝗽𝘄𝗼𝗿𝗸, 𝘀𝗽𝗲𝗰𝗶𝗮𝗹𝗶𝘇𝗶𝗻𝗴 𝗶𝗻 𝗘𝗧𝗟 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁, 𝗗𝗮𝘁𝗮 𝗪𝗮𝗿𝗲𝗵𝗼𝘂𝘀𝗲 𝗠𝗮𝗻𝗮𝗴𝗲𝗿 & 𝗕𝘂𝘀𝗶𝗻𝗲𝘀𝘀 𝗜𝗻𝘁𝗲𝗹𝗹𝗶𝗴𝗲𝗻𝗰𝗲 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 (𝗗𝗮𝘁𝗮 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿). 𝐈𝐟 𝐲𝐨𝐮 𝐧𝐞𝐞𝐝 𝐡𝐞𝐥𝐩 𝐰𝐢𝐭𝐡 𝐦𝐨𝐝𝐞𝐫𝐧 𝐝𝐚𝐭𝐚 𝐬𝐭𝐚𝐜𝐤 𝐜𝐫𝐞𝐚𝐭𝐢𝐨𝐧, 𝐚𝐧𝐚𝐥𝐲𝐭𝐢𝐜𝐬 𝐞𝐧𝐠𝐢𝐧𝐞𝐞𝐫𝐢𝐧𝐠, 𝐜𝐨𝐧𝐬𝐮𝐥𝐭𝐢𝐧𝐠, 𝐨𝐫 𝐚𝐝𝐯𝐚𝐧𝐜𝐞𝐝 𝐚𝐧𝐚𝐥𝐲𝐭𝐢𝐜𝐬, 𝐲𝐨𝐮’𝐯𝐞 𝐫𝐞𝐚𝐜𝐡𝐞𝐝 𝐭𝐡𝐞 𝐫𝐢𝐠𝐡𝐭 𝐔𝐩𝐰𝐨𝐫𝐤 𝐩𝐫𝐨𝐟𝐢𝐥𝐞. ✅ 𝐓𝐨𝐠𝐞𝐭𝐡𝐞𝐫 𝐰𝐢𝐭𝐡 𝐦𝐲 𝐭𝐞𝐚𝐦, 𝐰𝐞’𝐫𝐞 𝐥𝐨𝐨𝐤𝐢𝐧𝐠 𝐟𝐨𝐫𝐰𝐚𝐫𝐝 𝐭𝐨 𝐛𝐞𝐜𝐨𝐦𝐢𝐧𝐠 𝐲𝐨𝐮𝐫 𝐟𝐚𝐯𝐨𝐮𝐫𝐢𝐭𝐞 𝐧𝐞𝐰 𝐝𝐚𝐭𝐚 𝐧𝐞𝐫𝐝𝐬! 📈 📞 Invite me to your job on Upwork to book a free 15-minute consultation call together! 📞 ⭐⭐⭐⭐⭐ ❝ 𝙄 𝙝𝙖𝙫𝙚 𝙝𝙖𝙙 𝙩𝙝𝙚 𝙥𝙡𝙚𝙖𝙨𝙪𝙧𝙚 𝙤𝙛 𝙬𝙤𝙧𝙠𝙞𝙣𝙜 𝙬𝙞𝙩𝙝 𝙅𝙖𝙮 𝙛𝙤𝙧 𝙩𝙝𝙚 𝙡𝙖𝙨𝙩 𝙮𝙚𝙖𝙧. 𝙅𝙖𝙮 𝙞𝙨 𝙖 𝙙𝙚𝙫𝙚𝙡𝙤𝙥𝙚𝙧 𝙤𝙣 𝙢𝙮 𝙩𝙚𝙖𝙢, 𝙖𝙣𝙙 𝙞𝙨 𝙨𝙤𝙢𝙚𝙤𝙣𝙚 𝙬𝙝𝙤 𝙄 𝙝𝙖𝙫𝙚 𝙘𝙤𝙢𝙚 𝙩𝙤 𝙧𝙚𝙡𝙮 𝙖𝙣𝙙 𝙙𝙚𝙥𝙚𝙣𝙙 𝙤𝙣. 𝙒𝙝𝙚𝙣 𝙅𝙖𝙮 𝙨𝙖𝙮𝙨 𝙝𝙚 𝙬𝙞𝙡𝙡 𝙙𝙤 𝙨𝙤𝙢𝙚𝙩𝙝𝙞𝙣𝙜, 𝙝𝙚 𝙙𝙤𝙚𝙨 𝙞𝙩: 𝙤𝙣-𝙩𝙞𝙢𝙚, 𝙖𝙨 𝙧𝙚𝙦𝙪𝙚𝙨𝙩𝙚𝙙, 𝙖𝙣𝙙 𝙝𝙚 𝙙𝙤𝙚𝙨 𝙞𝙩 𝙧𝙞𝙜𝙝𝙩 𝙩𝙝𝙚 𝙛𝙞𝙧𝙨𝙩 𝙩𝙞𝙢𝙚. 𝙅𝙖𝙮 𝙞𝙨 𝙖𝙡𝙨𝙤 𝙫𝙚𝙧𝙮 𝙧𝙚𝙨𝙥𝙤𝙣𝙨𝙞𝙫𝙚 𝙩𝙤 𝙩𝙝𝙚 𝙩𝙚𝙖𝙢 𝙙𝙚𝙨𝙥𝙞𝙩𝙚 𝙩𝙝𝙚 𝙩𝙞𝙢𝙚-𝙯𝙤𝙣𝙚 𝙙𝙞𝙛𝙛𝙚𝙧𝙚𝙣𝙘𝙚𝙨 𝙞𝙣 𝙩𝙝𝙚 𝘾𝙤𝙫𝙞𝙙-𝙚𝙧𝙖. 𝙅𝙖𝙮 𝙧𝙚𝙨𝙥𝙤𝙣𝙙𝙨 𝙦𝙪𝙞𝙘𝙠𝙡𝙮 𝙖𝙣𝙙 𝙚𝙛𝙛𝙚𝙘𝙩𝙞𝙫𝙚𝙡𝙮 𝙬𝙝𝙚𝙣𝙚𝙫𝙚𝙧 𝙨𝙤𝙢𝙚𝙩𝙝𝙞𝙣𝙜 𝙪𝙣𝙚𝙭𝙥𝙚𝙘𝙩𝙚𝙙 𝙘𝙤𝙢𝙚𝙨 𝙪𝙥, 𝙖𝙣𝙙 𝙝𝙚 𝙥𝙧𝙤𝙫𝙞𝙙𝙚𝙨 𝙨𝙚𝙖𝙢𝙡𝙚𝙨𝙨 𝙘𝙤𝙫𝙚𝙧𝙖𝙜𝙚 𝙛𝙤𝙧 𝙩𝙝𝙚 𝙐𝙎-𝙗𝙖𝙨𝙚𝙙 𝙩𝙚𝙖𝙢. 𝙅𝙖𝙮 𝙞𝙨 𝙥𝙡𝙚𝙖𝙨𝙖𝙣𝙩 𝙖𝙣𝙙 𝙥𝙧𝙤𝙛𝙚𝙨𝙨𝙞𝙤𝙣𝙖𝙡, 𝙥𝙖𝙩𝙞𝙚𝙣𝙩 𝙬𝙞𝙩𝙝 𝙝𝙞𝙨 𝙡𝙚𝙨𝙨 𝙚𝙭𝙥𝙚𝙧𝙞𝙚𝙣𝙘𝙚𝙙 𝙘𝙤𝙡𝙡𝙚𝙖𝙜𝙪𝙚𝙨, 𝙖𝙣𝙙 𝙘𝙤𝙡𝙡𝙖𝙗𝙤𝙧𝙖𝙩𝙞𝙫𝙚 𝙬𝙞𝙩𝙝 𝙘𝙤𝙡𝙡𝙚𝙖𝙜𝙪𝙚𝙨 𝙤𝙛 𝙖𝙡𝙡 𝙨𝙠𝙞𝙡𝙡 𝙡𝙚𝙫𝙚𝙡𝙨. 𝙒𝙝𝙚𝙣 𝙤𝙩𝙝𝙚𝙧𝙨 𝙖𝙧𝙚 𝙨𝙩𝙪𝙢𝙥𝙚𝙙, 𝙅𝙖𝙮 𝙙𝙞𝙜𝙨 𝙞𝙣 𝙩𝙤 𝙨𝙤𝙡𝙫𝙚 𝙥𝙧𝙤𝙗𝙡𝙚𝙢𝙨 𝙖𝙣𝙙 𝙧𝙚𝙘𝙤𝙢𝙢𝙚𝙣𝙙 𝙨𝙤𝙡𝙪𝙩𝙞𝙤𝙣𝙨. 𝙅𝙖𝙮 𝙞𝙨 𝙖 𝙨𝙤𝙡𝙞𝙙 𝙝𝙞𝙧𝙚: 𝙞𝙛 𝙮𝙤𝙪 𝙝𝙖𝙫𝙚 𝙩𝙝𝙚 𝙤𝙥𝙥𝙤𝙧𝙩𝙪𝙣𝙞𝙩𝙮 𝙩𝙤 𝙗𝙧𝙞𝙣𝙜 𝙝𝙞𝙢 𝙤𝙣 𝙮𝙤𝙪𝙧 𝙩𝙚𝙖𝙢, 𝙮𝙤𝙪 𝙬𝙞𝙡𝙡 𝙗𝙚 𝙜𝙡𝙖𝙙 𝙮𝙤𝙪 𝙙𝙞𝙙. 𝙄𝙛 𝙮𝙤𝙪 𝙝𝙖𝙫𝙚 𝙖𝙣𝙮 𝙛𝙤𝙡𝙡𝙤𝙬 𝙪𝙥 𝙦𝙪𝙚𝙨𝙩𝙞𝙤𝙣𝙨, 𝙥𝙡𝙚𝙖𝙨𝙚 𝙚𝙢𝙖𝙞𝙡 𝙢𝙚 𝙖𝙣𝙙 𝙄 𝙬𝙞𝙡𝙡 𝙗𝙚 𝙝𝙖𝙥𝙥𝙮 𝙩𝙤 𝙚𝙡𝙖𝙗𝙤𝙧𝙖𝙩𝙚.❞ 🗣 Jeffrey Jaye - Technical Product and Program Manager - Comcast 🗣 WHO I WORK WITH ✅ Warner Bros, Discovery (Fortune 500 Comapny) ✅ Comcast (Fortune 500 Company) ✅ Snagajob (Enterprise client on Upwork) ✅ Fourth ✅ Unlimited Tech Solutions ✅ Thrive Internet Marketing Agency ✅ The Investor Machine ✅ The Ross Preschoool ✅ BI4ALL 🌟 WHY CHOOSE ME OVER OTHER FREELANCERS SOLUTION? 🌟 🥇Expert in Pentaho, Talend, Informatica, Snaplogic as an ETL tools. 🥇Determines data storage needs to perform ETL functions. 
🥇 Uses different data warehousing concepts to build a data warehouse for internal departments of the organization.
🥇 Creates and enhances data solutions enabling seamless delivery of data, and is responsible for collecting, parsing, managing, and analyzing large sets of data.
🥇 Leads the design of the logical data model, implements the physical database structure, and constructs and implements operational data stores and data marts.
🥇 Designs, develops, automates, and supports complex applications to extract, transform, and load data.
🥇 Ensures data quality after ETL actions are performed.
🥇 Develops logical and physical data flow models for ETL applications.
🥇 Translates data access, transformation, and movement requirements into functional requirements and mapping designs.
🥇 Transfers data formats from source to destination.
🌟 WHY CHOOSE ME OVER OTHER FREELANCERS? 🌟
✅ Client Reviews: I focus on providing VALUE to all of my clients and earning their TRUST. The client reviews and feedback on my profile are immensely important to me and to the value that I provide.
✅ Over-Delivering: This is core to my work as a freelancer. My focus is on GIVING more than what I expect to RECEIVE. I take pride in leaving all of my clients saying "WOW".
✅ Responsiveness: Being extremely responsive and keeping all lines of communication readily open with my clients.
✅ Resilience: Reach out to any of my current or former clients and ask them about my resilience. Any issue that my clients face, I attack and find a SOLUTION for.
✅ Kindness: One of the biggest aspects of my life that I implement in every facet of my life: treating everyone with respect, understanding all situations, and genuinely wanting to IMPROVE my clients' situations.
Apache Spark, Amazon Web Services, Build Automation, Zapier, Talend Data Integration, Bash Programming, PySpark, Airtable, Pentaho, Data Engineering, Make.com, SnapLogic, ETL, JavaScript, Python
- $50 hourly
- 5.0/5
- (8 jobs)
"She is very good in coding. She is the best and to go person for any hadoop or nifi requirements." "Abha is a star; have successfully handed the project in a very professional manner. I will definitely be working with Abha again; I am very happy with the quality of the work. 🙏" "Abha Kabra is one of the most talented programmers I have ever meet in Upwork. Her communication was top-notch, she met all deadlines, a skilled developer and super fast on any task was given to her. Perfect work is done. Would re-hire and highly recommended!!" Highly skilled and experienced Bigdata engineer with over 6 years of experience in the field. With a strong background in Analysis, Data Migration, Design, and Development of Big Data and Hadoop based Projects using technologies like following: ✅ Apache spark with Scala & python ✅ Apache NiFi ✅ Apache Kafka ✅ Apache Airflow ✅ ElasticSearch ✅ Logstash ✅ Kibana ✅ Mongodb ✅ Grafana ✅ Azure data factory ✅ Azure pipelines ✅ Azure databricks ✅ AWS EMR ✅ AWS S3 ✅ AWS Glue ✅ AWS Lambda ✅ GCP ✅ cloud functions ✅ PostgreSql ✅ MySql ✅ Oracle ✅ MongoDB ✅ Ansible ✅ Terraform ✅ Logo/Book Cover Design ✅ Technical Blog writing A proven track record of delivering high-quality work that meets or exceeds client expectations. Deep understanding of Energy-Related data, IoT devices, Hospitality industry, Retail Market, Ad-tech, Data encryptions-related projects, and has worked with a wide range of clients, from Marriott, P&G, Vodafone UK, eXate UK etc. Able to quickly understand client requirements and develop tailored solutions that address their unique needs. Very communicative and responsive, ensuring that clients are kept informed every step of the way. A quick learner and is always eager to explore new technologies and techniques to better serve clients. Familiar with Agile Methodology, Active participation in Daily Scrum meetings, Sprint meetings, and retrospective meetings, know about working in all the phases of the project life cycle. 
A strong team player and leader with good interpersonal and communication skills, ready to take on independent challenges.Apache Spark
Apache NiFiPySparkDatabricks PlatformETL PipelineBig DataGrafanaKibanaApache KafkaPostgreSQLMicrosoft AzureMongoDBScalaPythonElasticsearchGoogle Cloud PlatformAmazon Web Services - $75 hourly
- 5.0/5
- (24 jobs)
Certified TOGAF 9 Enterprise Architect with over 18 years of IT service experience, specializing in solution architecture, innovation, consulting, and leading diverse projects. My extensive background in IT services has honed my skills in consulting, architecture, and software development. I am now focused on leveraging these skills in AI, Machine Learning, Data Lakes, and Analytics, seeking opportunities that challenge me to keep learning and applying cutting-edge technologies in real-world applications.
Recent Projects and Specializations:
Artificial Intelligence & Machine Learning: Developed several generative AI projects, including:
- A solution for manufacturing operators that provides real-time fixes based on user-generated prompts and descriptions.
- An AI-driven healthcare lab assistant that suggests diagnostic tests based on user inputs.
- Advanced ML algorithms for monitoring pH levels in sugar production, crucial for maintaining quality control over product consistency.
- An ML model for HVAC systems that predicts power consumption spikes and potential breakdowns, enhancing maintenance efficiency and energy management.
Data Science & Big Data: Expertise in handling large-scale data environments from terabytes to petabytes, developing actionable insights across multiple domains including Retail, Finance, Manufacturing, IoT, and Healthcare. Proficient in:
- Apache Hadoop, Spark, Cloudera CDH, Hortonworks, MapR
- Real-time data processing with Apache Hive and Elasticsearch
Cloud Architecting & Data Lakes: Skilled in designing and implementing robust cloud solutions and data lakes that streamline data accessibility and analysis, supporting high-level decision-making processes.
Business Intelligence & Analytics: Experienced in integrating BI tools and technologies like Splunk, Tableau, and OBIEE to transform raw data into valuable business insights.
Industry Expertise: Telecom, Retail, Banking & Financial Services, Utilities, EducationApache Spark
Apache SupersetAmazon Web ServicesCI/CD PlatformGoogle Cloud PlatformCloud ComputingCloud MigrationMicrosoft AzureCloud SecurityData PrivacyData ManagementData Ingestion - $100 hourly
- 4.8/5
- (2 jobs)
AI and Cloud Data Engineer with over 15 years of practical experience in the banking and networking domains. Well-versed in defining requirements and in designing and building enterprise-grade solutions. A passionate programmer and quick troubleshooter with a strong grasp of Java, Python, Big Data technologies, data engineering/analysis, and cloud computing.Apache Spark
Apache BeamApache FlinkData ScienceMicrosoft Power BIData MiningApache HadoopETLPythonData Extraction - $40 hourly
- 5.0/5
- (2 jobs)
Data Scientist with more than 14 years of full-time experience delivering Machine Learning products. Delivered Data Science products and applications in NLP and Explainable AI, both as a technical lead and as an individual contributor.Apache Spark
ScriptArtificial IntelligenceData InterpretationPython ScriptGitServerLinuxBigQueryMachine LearningData ScienceTensorFlowPythonRecommendation SystemR - $40 hourly
- 5.0/5
- (27 jobs)
✮✮✮✮✮ 5 Star Reviews ✮✮✮✮✮ ✅ Upwork's TOP Rated Plus Expert ✅ 5+ years of Research Experience ✅ 3+ years of Industry Experience ✅ 2+ years of Teaching Experience
Hi Folks, I am Dr. Jenish Dhanani, Ph.D. in Computer Science and Engineering. I am an expert in AI and a Big Data enthusiast, applying AI, ML, Data Mining, GPT-3/4, Natural Language Processing (NLP), and Big Data tools (i.e., MapReduce and Spark) to solve real-life problems. I also hold expertise in AI agent and agentic workflow development, enabling the creation of advanced, autonomous systems for a wide range of applications.
I have good experience in prompt engineering for GPT-4, GPT-J (and other GPTs), and the Jurassic Jumbo model, which I believe is crucial for domain-specific or special applications. I have previously developed GPT-3-based tools for textual entity extraction, article writing, paraphrasing, essay writing, summarization, etc. through prompt engineering. I also have extensive experience with web application development using Python, Django, Flask, and many other web development technologies.
➤ Key Skills:
===========
✔ Python, Django, Flask, DRF, Selenium, MongoDB, MySQL, Postgres, etc.
✔ PyTorch, Keras, TensorFlow, TensorBoard, Jupyter, R
✔ NLP, Text Mining and Analytics, Text Embedding
✔ TF-IDF, LSA, LDA, Word2Vec, Doc2Vec, BERT, FastText, GloVe, etc.
✔ Machine Learning and Deep Learning
✔ Neural Networks, Deep Neural Networks, Support Vector Machines, Random Forests, Decision Trees, etc.
✔ Computer Vision and Image Processing
✔ Hadoop, Spark, MapReduce, Incremental MapReduce
✔ Community Detection: CDLib (Louvain, SLPA)
✔ Scikit-Learn and Spark MLlib
✔ Paperspace
✨ I have extensive experience in developing projects for classification, clustering, NLP techniques, text summarization, topic modeling, sentiment analysis, recommendation systems, etc., using both traditional and advanced machine learning and deep learning techniques. 
✨ I also hold good expertise in scaling and designing distributed solutions on platforms like MapReduce and Spark.
✨ I have published 17+ research articles in the fields of AI, ML, text analytics, and Big Data in reputed international conferences and journals.
✨ I proposed, developed, and implemented automatic sentiment analysis of Amazon product reviews and a legal document recommendation system, using distributed frameworks and text embedding approaches.
✨ I have also delivered expert lectures and hands-on sessions on various topics such as big stream data mining, Hadoop, Pig, Hive, Flume, Hadoop and MapReduce programming, and sentiment analysis.
✨ I am a scientific research professional, and I always love to build long-term relationships with clients. I am very punctual, keep deadlines, and deliver good results.
I am looking forward to hearing from you.
Best regards,
Dr. Jenish DhananiApache Spark
Data MiningDjangoDocument AnalysisApache HadoopBig DataArtificial IntelligenceSentiment AnalysisFlaskMachine LearningData ScienceWord EmbeddingRecommendation SystemApache Spark MLlibNatural Language Processing - $35 hourly
- 5.0/5
- (20 jobs)
5+ years of experience in Big Data technologies like Spark, Hadoop, Hive, Sqoop, ADF, and Databricks. 5+ years of experience in the ELK Stack (Elasticsearch, Logstash, and Kibana). Microsoft Azure Certified Data Engineer. Elasticsearch and Kibana Certified. MongoDB Certified Developer.Apache Spark
Microsoft AzureDatabricks PlatformPySparkMongoDBLogstashElasticsearchGrok FrameworkELK StackApache HadoopHiveBashSQLKibana - $60 hourly
- 5.0/5
- (16 jobs)
🏆 Top Rated Plus on Upwork 🏆 Associated with Techdome [Top Rated Plus | 99% Job Success]
✅ Fluent English ✅ Swift response and communication ✅ 6+ years of experience ✅ 50+ successful project deliveries
Are you looking for a tech lead who can drive your project from concept to reality with cutting-edge solutions? Are you looking to propel your software to new heights? Feel free to reach out to me for a conversation.
Hi, I'm Nayan, a seasoned Tech Lead and Solution Architect at the helm of a dedicated team of outstanding UI/UX designers, project experts, product experts, and tech doctors. We are known as Techdome, a collective of creative minds, strategic thinkers, software engineers, and tech doctors. Over the years, we've worked alongside global brands and innovative startups, helping them achieve growth and success with cutting-edge solutions.
What clients say about me and my technology team:
✨ "Great tech lead and a wonderful team to work with! Delivered top-notch work! Everything exceeded our expectations."
✨ "Nayan is exceptionally skilled and a true professional to work with."
✨ "Nayan and his team were crucial in successfully launching our software project. Their technical expertise was outstanding."
What does this mean for you?
🔥 You'll gain a true partner who prioritizes your success and helps you make informed decisions.
🔥 Access top-tier expertise in product development across industries like fintech, blockchain, manufacturing, and healthcare. Name any domain and we will be there to help you out.
🔥 You'll have a reliable partner and a supportive team that genuinely cares about the success of your business.
🔥 Our team brings over 150+ years of combined experience, excelling in consultancy, design, software engineering, projects, and products.
How can I help? 
🔍 I can transform your ideas into actionable solutions through thorough business analysis, strategic planning, and innovative architecture design.
🔍 Streamline your development process with efficient system design, architecture planning, and rapid prototyping to optimize time and resources.
🔍 Validate your concepts before full-scale implementation with detailed prototypes and technical assessments for web and mobile applications.
My Skills and Technologies:
🧑💻 Languages: Python, Java, C#, JavaScript
🧑💻 Frameworks: Node.js, React
🧑💻 DevOps: Git, Jenkins, Docker, Kubernetes
🧑💻 Cybersecurity: Encryption Protocols, Access Control Mechanisms
🧑💻 AI & Machine Learning: TensorFlow, PyTorch, Scikit-learn, Spark, Fusion AI
🧑💻 Web Development: WordPress, HTML, CSS, PHP
🧑💻 Databases: MySQL, Oracle NoSQL, PostgreSQL, MongoDB
🧑💻 Cloud: Microsoft Azure, AWS
🧑💻 Project Management: JIRA, Trello, ClickUp, Git Projects, Plane.so
Ready to elevate your project? Let's connect and explore how we can collaborate to achieve your goals. 
--- Keywords related to my expertise: Software Architecture, Full-Stack Development, Backend Development, Frontend Development, API Integration, DevOps Engineering, Cloud Computing, Microservices, RESTful APIs, GraphQL, Performance Optimization, Scalable Solutions, Database Design, Data Migration, Serverless Architecture, Continuous Monitoring, Technical Consulting, System Integration, Tech Stack Analysis, Infrastructure as Code, Automated Testing, Unit Testing, Integration Testing, Security Best Practices, Code Review, Version Control, Agile Methodologies, Project Management Tools, Cloud Services (AWS, Azure, GCP), Application Security, Dependency Management, Configuration Management, Build Automation, Business Process Automation, Technical Documentation, Continuous Deployment, Fault Tolerance, Data Analytics, Big Data Technologies, Artificial Intelligence, Machine Learning Engineering, Blockchain Development, AI ML Solutions, WebSockets, Graph Databases, NoSQL Databases, SQL Databases, Version Control SystemsApache Spark
Microsoft Power BIAmazon Web ServicesNode.jsBlockchain TokenizationData Warehousing & ETL SoftwareCryptographyAgile Software DevelopmentMicrosoft AzureBlockchain ArchitecturePythonDockerData MigrationData IntegrationKubernetes - $60 hourly
- 5.0/5
- (11 jobs)
Experienced Software Solutions Architect with a demonstrated history of working in the sports industry. Skilled in Amazon Web Services (AWS), DevOps, data structures, Apache Spark, blockchain, Elastic Stack (ELK), Android app development, and data streaming. Strong engineering professional with a Master of Computer Applications (MCA) focused on Computer Software Engineering.Apache Spark
Amazon S3Amazon EC2AWS ElementalDockerContainerizationPostgreSQLTerraformReactNodeJS FrameworkCI/CDBlockchain ArchitectureDevOpsEngineering & ArchitectureAmazon Web Services
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.