Hire the best Machine Learning Engineers in Mississippi
Check out Machine Learning Engineers in Mississippi with the skills you need for your next job.
- $65 hourly
- 5.0/5
- (7 jobs)
I am a software developer and data professional with over five years of experience. My business philosophy is to provide solutions that keep generating value for the client long after I deliver them, and I am currently undertaking rigorous study to better integrate various technologies and offer more comprehensive support to my clients.

I can help implement:
- various types of automation, including quality assurance automation
- certain cloud solutions with GCP, AWS, and Microsoft AzureML
- data transformations
- machine learning models
- dashboards
- command-line interfaces
- financial analyses
- Jupyter notebooks
- spreadsheet solutions (Google Sheets and Excel)
- interactive visualizations of various types
- software modules (in particular, I am currently learning to build Python modules in Rust for faster performance)

I have formal training as an engineer up to the Master's level, along with training from past full-time roles as a research engineer and a data analyst. I attribute much of my current skill set to ongoing self-study using online resources such as Packt and O'Reilly technology and business training, and I am also developing my skills in Rust and cloud services.

As a research engineer, I developed experimental machine learning models with Python and wrote the corresponding technical reports; these efforts were also the subject of my graduate work. As a data analyst, I collected and analyzed data from solar energy infrastructure projects and conducted external market research to determine future project viability in different regions. Since joining Upwork, I have assisted clients with ML and data engineering tasks, and I am currently training to become a full-stack solutions architect offering both coding and strategic planning.
Machine Learning, Amazon Web Services, QA Automation, GPT API, Data Visualization, Unit Testing, Data Analytics, Rust, ML Automation, PyTorch, pandas, Data Science, Python
- $200 hourly
- 5.0/5
- (25 jobs)
🏆 Top 1% Expert-Vetted Talent 🏆

When you've got complex problems, you need a partner as invested in your project's success as you are. And that starts with one key skill set that separates me from the rest: I will truly listen to you: your needs, your goals, and your unique circumstances. Hand your project over to me, whether it's web development, generative AI, or building new content, and I make it my personal mission to deliver a top-quality experience by being as invested in the project's success as you are.

My name is Dev Ramesh, and as the lead of FullGen, I listen to all of your needs and match you with the quality problem-solvers at my agency to get your projects done on time, on budget, and to a premium, first-rate standard of quality. That holds no matter what your technical needs may be:
- AI & Machine Learning: Generative AI (GPT, Gemini/Bard, Llama, DALL-E, Mistral, Claude, Stable Diffusion)
- Web & App Development: Front-End (HTML, CSS, JavaScript, React, Bootstrap, Vue.js) and Back-End (Node.js, Express, Laravel, Django, Python, Java, PHP)
- Cloud & Database Solutions: Cloud Computing (AWS, Google Cloud, Azure, Heroku, Digital Ocean) and Database Management (MySQL, MongoDB, PostgreSQL)

These are complex solutions, but the answer can be simple: you need the human element. That's what I bring to each project. I'll listen to everything you need and serve as your project's guide so that it ends up in the right hands and delivers top-tier solutions.

Testimonials from previous clients:
- "Delivered our requirements to the word"
- "The best freelancer I've ever worked with"
- "Extremely talented and went above and beyond all aspects of my request"

Do you have a project in mind? Connect with me, and I'll plug you in to the best match at this agency, catering to your specific needs, dialing up the right solution, and checking in to ensure a 100% smooth, best-in-class experience.

- Dev Ramesh
Machine Learning, Natural Language Processing, AI Image Generation, AI Chatbot, Chatbot, OpenAI API, GPT-4, Artificial Intelligence, Database Design, Front-End Development, Back-End Development, Prompt Engineering, Full-Stack Development, Data Science, Generative AI
- $35 hourly
- 5.0/5
- (4 jobs)
Extensive experience in all stages of healthcare IT and laboratory management, product life cycles, and team building/management.
Machine Learning, Healthcare, Neural Network, Deep Learning, Artificial Intelligence, Artificial Intelligence Ethics, Artificial Neural Network, Applied Mathematics, Physics, Data Mining, Scripting, Python, Team Building, Data Entry, Project Delivery
- $60 hourly
- 5.0/5
- (0 jobs)
Rust engineer with over 7 years of experience in the data storage ecosystem.

My major skills:
- Blockchain: Rust, Solidity, EVM, Tron, Ton, Layer 2, Solana data accounts, off-chain systems, Solana programs, zero-knowledge proofs, Merkle proofs
- Frontend: Next.js, TypeScript, Rust client programs
- Backend: Node.js, MongoDB, PostgreSQL, Django, Rust backend frameworks such as Rocket, Rust-based Substrate, and data storage

My experience: I have developed a system using the Rust and Solidity programming languages, including a Solana program, a smart contract, a Rust backend, a Rust client program, and Solana data storage. I implemented a circuit that stores AI training data, divided into chunks, on the Solana chain (see the illustrative sketch below).

Thank you for your kind attention, and if you need a Rust engineer, please let me know.
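For illustration only, here is a minimal sketch of the chunk-and-verify idea mentioned above: splitting a payload into fixed-size chunks and computing a SHA-256 Merkle root over them. It is written in plain Python for readability; the profile describes the actual implementation in Rust against Solana, and the chunk size and payload below are assumptions, not details from the profile.

```python
# Hedged sketch: chunk a blob of training data and compute a Merkle root
# over the chunks. Plain Python for illustration only; the profile's actual
# system is described as Rust and Solana programs.
import hashlib

CHUNK_SIZE = 1024  # bytes; illustrative choice


def chunk(data: bytes, size: int = CHUNK_SIZE) -> list[bytes]:
    """Split data into fixed-size chunks (the last chunk may be shorter)."""
    return [data[i:i + size] for i in range(0, len(data), size)]


def merkle_root(chunks: list[bytes]) -> bytes:
    """Compute a simple SHA-256 Merkle root over the chunk hashes."""
    level = [hashlib.sha256(c).digest() for c in chunks]
    while len(level) > 1:
        if len(level) % 2 == 1:      # duplicate the last node on odd levels
            level.append(level[-1])
        level = [
            hashlib.sha256(level[i] + level[i + 1]).digest()
            for i in range(0, len(level), 2)
        ]
    return level[0]


data = b"example training data" * 200   # placeholder payload
root = merkle_root(chunk(data))
print(root.hex())                        # root a verifier could check proofs against
```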
Machine Learning, Bot Development, Generative AI, Cryptocurrency Trading, Layer 2 Blockchain, DApp Development, Blockchain Development Framework, API Integration, API Development, Back-End Development, Python, Smart Contract, Solana, Rust, Next.js
- $35 hourly
- 0.0/5
- (0 jobs)
Hi, my name is Johannes. I am a results-oriented technology professional with a strong foundation in software development and network administration and a keen interest in cybersecurity. I possess strong programming skills in Python and Java, honed through rigorous practice on platforms like AlgoExpert. I have a solid understanding of security principles and have conducted research on unauthorized access and data theft. I have successfully completed projects from conception to deployment, demonstrating my ability to deliver high-quality work, and I am eager to contribute my skills and enthusiasm to challenging projects.
Machine Learning, NIST Cybersecurity Framework, Cybersecurity Tool, Cybersecurity Monitoring, Cybersecurity Management, Cyber Threat Intelligence, Penetration Testing, Data Extraction, Artificial Intelligence, Data Mining
- $30 hourly
- 0.0/5
- (0 jobs)
I am a licensed Registered Nurse with an Associate of Applied Science in Nursing degree, earned from Coahoma Community College in 2008. My nursing career began in 1995 as a Licensed Practical Nurse (LPN) at a small rural hospital in Ruleville, Mississippi. In 1998, I transitioned to long-term care, gaining diverse experience across multiple roles, including urgent care nursing, hospice nursing, home care nursing, and clinic staff nursing.

After getting married and starting a family, I decided to further my education and obtained my RN license in 2008. As an RN, I've held various roles such as hospice nurse, home health nurse, RN manager, contract nurse, and hospital charge nurse. Following five years in medical-surgical nursing, I faced personal health challenges, including rheumatoid arthritis and cancer, which required multiple surgeries and rehabilitation. I'm proud to say I've successfully managed these obstacles.

Now, I am eager to explore new roles in my career. I have recently developed an interest in AI and remote work options, and I am confident that my extensive experience and skills will be valuable in these fields.

Professional Summary:
- Licensed Registered Nurse with over 25 years of experience in Med/Surg, Hospice, Urgent Care, Nursing Home, Orthopedics, Acute Care, and Geriatric Nursing.
- Proficient in Electronic Health Record (EHR) software such as Epic, Allscripts, and Cerner.
- Worked with medical/surgical patients with neuromuscular disease and their families, patients with psych, chemical dependency, or dual diagnosis, the homeless population, and the general population in various healthcare settings.
Machine Learning, AI Chatbot, AI Fact-Checking, Nursing, Healthcare, Information Retrieval, Midrig Evaluator, Artificial Intelligence
- $40 hourly
- 0.0/5
- (0 jobs)
Passionate and results-oriented data scientist with extensive experience in designing and implementing machine learning models and solutions that address real-world challenges. I have designed models for the Department of the Treasury, the Social Security Administration, FEMA, and JP Morgan Chase. Proficient in managing end-to-end data processes, from ETL integration to productionized pipelines, and adept at communicating complex technical concepts to non-technical stakeholders and turning data-driven insights into actionable outcomes. I also have years of journalism experience, from serving as editor of my award-winning high school newspaper, to assistant political editor of the Loyola Maroon, to contributing writer for Big Easy Magazine.
Machine Learning, Artificial Intelligence, Python Scikit-Learn, pandas, JDBC, Probability Theory, Statistics, Data Mining, Logistic Regression, Neural Network, Cloud Computing, Amazon Web Services, SQL, Python
- $100 hourly
- 0.0/5
- (0 jobs)
Example major software engineering / machine learning accomplishments:

- Forecast as a Service (FaaS)
Description: Percolata provides a forecast-as-a-service (FaaS) offering that gathers and forecasts shopper traffic and transactions with 4x higher accuracy for retailers, using weather forecasts, marketing calendars, and other data via proprietary deep learning technology. The service reduces labor cost by 2-5% while improving revenue by 1-5%. Percolata's cloud-based forecaster also lowers inventory costs while improving marketing effectiveness, and eCommerce retailers have used FaaS to give third-party logistics companies the forecast accuracy required to meet promised service levels. Percolata's FaaS is a fully managed turnkey service.
Work Done: Market-tested the FaaS concept, managed development of the MVP, signed up over 55 retail brands around the world, and managed the production rollout (see the illustrative forecasting sketch after this list).
Tech Stack: Google Cloud, Google SQL, BigQuery, Python, Scikit-learn, Pandas, Numpy, Linux, Papermill, Google Data Studio, Google AI Notebook, Google AutoML, Facebook Prophet, Tensorflow, Google Colab, Jenkins, Azure, AWS

- Selling Team Optimization
Description: Percolata's Selling Team Optimizer automatically generates labor schedules for retailers, who see up to a 30% sales uplift using the same labor budget and resources. It uses sensor data to schedule the right number and composition of salespeople to handle the forecasted shopper profiles via proprietary deep learning technology; it is like Moneyball for retail associates.
Work Done: Greg validated the market need, architected the machine learning solution, and successfully piloted the technology at several key accounts, giving those retailers upwards of a 30% lift in revenue using the same labor budget.
Tech Stack: Microsoft Azure, Tableau, Python, Scikit-learn, Pandas, Numpy, Linux, Papermill, Jupyter, Tensorflow

- Traffic Sensor
Description: We deployed inexpensive computer-vision-based sensors in retail environments to capture physical-world big data, enabling unprecedented insights. For example, brands use this data to predict which merchandise is becoming less popular weeks or months sooner than by depending on point-of-sale transaction logs alone, allowing brands and retailers to reduce the number of markdowns that hurt profitability. Bay Sensors technology also lets retailers know whether the traffic and demographics at prospective future stores are appropriate for a successful opening, which is very difficult to judge, and how a new store will impact existing stores; these insights reduce the number of expensive store closures.
Work Done: Led the concept validation, development (e.g., computer vision algorithms, enclosure, UX, and software), go-to-market strategy, fundraising, sales, and marketing of an Android-based traffic sensor for the retail market.
Tech Stack: Microsoft Azure, Google Cloud, Tableau, Python, Scikit-learn, Pandas, Numpy, Linux, Kubernetes, Jupyter, Tensorflow, Pytorch, OpenCV, Jenkins, Google Cloud Dataproc
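As a hedged illustration of the kind of traffic forecasting described in the FaaS item above, here is a minimal sketch using Facebook Prophet, which is listed in that tech stack. The column names, placeholder counts, and 14-day horizon are illustrative assumptions, not details from the profile.

```python
# Hedged sketch: a minimal daily shopper-traffic forecast with Prophet.
# The data below is a placeholder; real inputs would come from sensors,
# weather feeds, and marketing calendars as described above.
import pandas as pd
from prophet import Prophet  # Facebook Prophet, listed in the tech stack

# Hypothetical history: one row per day with a shopper-traffic count.
history = pd.DataFrame({
    "ds": pd.date_range("2024-01-01", periods=90, freq="D"),
    "y": range(90),  # placeholder counts for illustration
})

model = Prophet(weekly_seasonality=True, yearly_seasonality=False)
model.fit(history)

# Forecast two weeks ahead; keep the point estimate and uncertainty bounds.
future = model.make_future_dataframe(periods=14)
forecast = model.predict(future)[["ds", "yhat", "yhat_lower", "yhat_upper"]]
print(forecast.tail())
```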
Machine Learning, .NET Core, C#, PyCharm, Data Science, OpenCV, Computer Vision, Deep Learning, Python, C++
- $35 hourly
- 0.0/5
- (0 jobs)
Rakesh Nomula
Data Engineer | Big Data & Cloud Solutions | Data Engineering Specialist

With 7 years of hands-on experience in data engineering, I specialize in designing and implementing robust data solutions using both on-premises big data technologies and cloud platforms such as Azure and AWS. My expertise spans data warehousing, ETL, and large-scale data processing, using tools like Snowflake, SSIS, Informatica, and modern cloud technologies. I've successfully developed complex, performance-optimized queries that significantly reduce execution time, ensuring faster application delivery and seamless data processing pipelines. Throughout my career, I've built scalable, efficient solutions leveraging Azure Data Factory, Airflow, Azure Databricks, StreamSets, and Azure Functions. These solutions have been instrumental for top-tier clients in sectors such as pharmaceuticals, education, and logistics.

Certifications & Skills:
- Microsoft Certifications: AZ-900, DP-900, DP-203 (Azure Data Engineer Associate)
- Snowflake Data Warehouse Certification

Core Technical Skills:
- Big Data Platforms: Apache Spark, Databricks, Hadoop
- Cloud Technologies: AWS (Redshift, EMR, Lambda, S3), Azure (Synapse Analytics, Data Factory)
- ETL & Data Warehousing: Snowflake, SSIS, Informatica, Redshift
- Programming Languages: SQL, Python, PySpark
- Data Formats: JSON, CSV, Parquet, Relational Databases

In my current role, I focus primarily on the ETL and data processing side using Databricks, specifically with Python and PySpark. I am responsible for designing and optimizing data pipelines that handle student performance data for a data warehousing application.

Current Project Overview:
- Data Extraction: We extract data from both on-premise and cloud sources, primarily in JSON format, and load it into a cloud-based data lake.
- Kafka Integration: The data is streamed into Kafka topics through a Python application.
- Transformation: I am responsible for flattening deeply nested, complex JSON files into a relational format. This involves applying business rules, data aggregations, and transformations, particularly for handling incremental data loads and slowly changing dimensions (see the sketch after this overview).
- Data Warehousing: The transformed data is loaded into the Databricks Lakehouse and Redshift for warehousing.
- Reporting: The analytics team uses Power BI to generate reports from the data warehouse.
- Orchestration: I manage the scheduling and orchestration of ETL pipelines using Apache Airflow.

I pride myself on being pragmatic, dependable, and results-driven, with a constant drive to learn and innovate. I am always looking for new ways to optimize processes and contribute to the success of my team and organization.
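As a hedged illustration of the transformation step described in the project overview above, here is a minimal PySpark sketch that flattens a nested JSON structure into relational columns and applies a simple incremental-load filter. The schema, field names, storage paths, and watermark value are illustrative assumptions, not details from the profile.

```python
# Hedged sketch: flatten nested JSON into relational columns with PySpark
# and apply a simple incremental-load filter. All names and paths below are
# hypothetical placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("flatten-student-events").getOrCreate()

# Hypothetical nested student-performance events landed in a data lake.
raw = spark.read.json("s3://example-lake/student-events/")  # illustrative path

flat = (
    raw
    # Explode a hypothetical array of assessment records into one row each.
    .withColumn("assessment", F.explode("assessments"))
    # Promote nested fields to top-level relational columns.
    .select(
        F.col("student_id"),
        F.col("assessment.subject").alias("subject"),
        F.col("assessment.score").cast("double").alias("score"),
        F.col("event_ts").cast("timestamp").alias("event_ts"),
    )
)

# Simple incremental filter: keep only records newer than the last load.
last_load_ts = "2024-01-01 00:00:00"  # would normally come from pipeline state
incremental = flat.filter(F.col("event_ts") > F.lit(last_load_ts))

# Write to the lakehouse in Parquet, partitioned by subject.
incremental.write.mode("append").partitionBy("subject").parquet(
    "s3://example-lake/warehouse/student_scores/"
)
```

In a production pipeline of the kind described above, the watermark would typically be tracked in pipeline state and the job scheduled by an orchestrator such as Airflow rather than hard-coded.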
Machine Learning, Informatica, Fivetran, Spring Boot, Java, Salesforce, Business Process Automation, Big Data, PySpark, Microsoft Azure, AWS Application, Python, Data Extraction, Artificial Intelligence, ETL

Want to browse more freelancers?
Sign up
How hiring on Upwork works
1. Post a job
Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.
2. Talent comes to you
Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.
3. Collaborate easily
Use Upwork to chat or video call, share files, and track project progress right from the app.
4. Payment simplified
Receive invoices and make payments through Upwork. Only pay for work you authorize.