Hire the best Data Extraction Specialists in Oklahoma

Check out Data Extraction Specialists in Oklahoma with the skills you need for your next job.
  • $50 hourly
    Backend / Cloud: Python ● Flask ● MySQL ● DigitalOcean ● Git ● Docker
    Frontend: HTML5 ● CSS3 ● JavaScript
    Hardware / OS: Linux ● Raspberry Pi
    Me: I'm a lifelong learner, always expanding my skill set and looking for projects that drive me forward as a developer and as a human being. I write appropriately commented, object-oriented, PEP 8-compliant code for project types including web automation ● file processing ● consuming / creating REST APIs (a minimal scraping sketch in this stack follows the skill tags below). Right now I'm most interested in building out data subscription services and working on lightweight full-stack web applications, but I'm open to anything. Get in touch if you think I might be a good fit for your project!
    Data Extraction
    Flask
    Data Scraping
    Web Crawling
    Beautiful Soup
    Data Mining
    Python
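    A minimal sketch of the web-scraping work this profile advertises, assuming a generic requests + Beautiful Soup workflow; the URL and the choice of <h2> headings are placeholders, not details from the freelancer's actual projects:
    ```python
    # Generic scraping sketch: fetch a page and pull out its headings.
    # URL and selector are placeholders -- not from any real project.
    import requests
    from bs4 import BeautifulSoup

    def fetch_titles(url: str) -> list[str]:
        """Download a page and return the text of every <h2> heading."""
        response = requests.get(url, timeout=10)
        response.raise_for_status()  # fail loudly on HTTP errors
        soup = BeautifulSoup(response.text, "html.parser")
        return [h2.get_text(strip=True) for h2 in soup.find_all("h2")]

    if __name__ == "__main__":
        # Placeholder URL -- swap in a page you are allowed to scrape.
        for title in fetch_titles("https://example.com"):
            print(title)
    ```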
  • $85 hourly
    RELEVANT EXPERIENCE
    Trisura Specialty Insurance Company, Oklahoma City, OK
    • Title: Senior Software and Automation Developer - 7/1/2023 to present
    • Responsible for database object development, ETL processing/data automation, and user/testing automation.
      o Communicating with BAs and end users
        • Gathering requirements
        • Identifying gaps in processing
        • Creating solutions
        • Building user stories and assigning epics and story points via JIRA
      o Building ETL processes in Safe FME/Server and Python
        • Data mapping, data conversions, data migrations (upload, download, API, new source, and Dataverse, all to SQL Server), automated reporting, automations (with API), server apps (workspace and automation), gallery apps, and scheduling
      o Building database objects: tables, stored procedures, and schemas
        • Tables: setting primary keys and foreign key constraints, determining data types (per company coding standards), and adding programmable fields if necessary, via scripting
        • Stored procedures: update, delete, insert, and trigger
        • Schemas: depending on data needs, selecting the proper schema or creating new schemas to handle data objects
      o Building automation objects using UiPath, Power Automate (and Power Automate Desktop), and Playwright
        • Automated user interaction with internal or external applications using UiPath
        • Automated file movement (from network locations to SharePoint), notifications (email or Teams), database event triggers, and data flows to PowerApps using Power Automate
        • Automated internal web application functionality testing using Playwright, for new releases of UI updates, new features, or business logic testing (see the test sketch after the skill tags below)
    • Title: Senior IT Business Analyst Supervisor - 3/1/2021 to 6/30/2023
    • Responsible for premium, claim, and collected premium reporting data
      o Communicating with partners about their submitted reports
      o Communicating errors and validation issues to partners
      o Validating for data integrity
    • Responsible for ETL processes/data automation
      o Built ETL processes in Safe FME
      o Validated reporting data
      o Created new validations for business logic using existing data or data points within reporting schemas
      o Automated cleaning and formatting of reporting data into the internal database schema
    Needham Energy Data Solutions, Oklahoma City, OK - 3/27/2017 to 3/1/2021
    • Title: Business Analyst
    • Responsible for data management for Needham Energy Data Solutions
      o Built the databases for different types of oil and gas data products
      o Built tables to hold data and their attributes, setting data types
      o Updated/inserted/deleted data using SQL
      o Used Master Data Services in SQL Server to apply an algorithm to match and update data
      o Created stored procedures for reports and matching jobs
      o Used queries to find inconsistencies in data
      o Created database schemas
    • Responsible for data cleaning and formatting
      o Used VBA code to clean data and format files for SSIS import
    • Responsible for creating reports in SSRS for internal and client use
    • Responsible for documenting data processes and data types
      o Used Visio for data process flows
      o Created process documentation for each type of data being stored
    • Used Data Miner to scrape public websites
      o Gathered information on possible clients
    • Reached out to and built relationships with clients
      o Acted as a sales representative for our data
      o Acted as a point of contact for internal clients
    • Built an automated process for extracting respondent data
      o Used UiPath to automate the process of connecting different applications: desktop, native, and online database
      o Used ABBYY FineReader templates to extract respondent data from different OCC applications
      o Built and applied a C#/.NET application parser to parse raw respondent data
    • Managed server migration
      o Used Synology NAS Cloud Sync to sync files and folders to AWS S3, then synced the data back down to our new Synology NAS
    • Languages: SQL, C#, Python
    • Tools/Programs: Access, Excel (with VBA), Word, Visio, JIRA, Microsoft SQL Server, SSRS, SSIS, Data Miner, ABBYY FineReader 14, UiPath, VS Code, FME, Playwright, DevOps, Power Automate, Power Automate Desktop, Lucidchart
    • Methodologies: Agile
    Data Extraction
    Microsoft Access Programming
    Stored Procedure Development
    Microsoft PowerApps
    ETL
    Microsoft Power Automate
    ETL Pipeline
    Python
    Data Scraping
    SQL
    Automation
    ABBYY FineReader
    UiPath
    PDF Conversion
    Data Entry
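    As a rough illustration of the Playwright-based functionality testing this profile describes, here is a hedged sketch using Playwright's Python sync API; the URL, selector, and expected heading are hypothetical stand-ins for an internal application:
    ```python
    # Hedged sketch of a pre-release UI smoke test with Playwright.
    # URL and expected text are placeholders, not real internal details.
    from playwright.sync_api import sync_playwright

    def test_login_page_renders() -> None:
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            page = browser.new_page()
            page.goto("https://example.com/login")  # placeholder app URL
            # Assert the page exposes the expected heading before release.
            heading = page.locator("h1").inner_text()
            assert "Sign in" in heading, f"unexpected heading: {heading!r}"
            browser.close()

    if __name__ == "__main__":
        test_login_page_renders()
        print("UI smoke test passed")
    ```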
  • $35 hourly
    Professional Journey: With 5 years of dedicated service in IT, I specialize in deciphering complex data landscapes using sophisticated tools such as Python, SQL, SQL Server, and AWS, facilitating a clearer understanding of organizational data assets.
    Data Exploration Expertise: Delve into a rich reservoir of data, where I expertly navigate databases like SQL Server, Snowflake, and AWS S3, meticulously retrieving insights akin to a seasoned explorer charting unexplored territories.
    ETL Mastery: Think of me as a digital artisan, refining raw data into actionable intelligence using the precision of SQL, Python, KNIME, and ETL tools, ensuring seamless transitions from raw inputs to refined outputs (an illustrative pipeline sketch follows the skill tags below).
    Architect of Data Flow: Orchestrating the flow of data like a conductor, I ensure its timely delivery for reporting and analysis, a harmonious symphony of information crucial for informed decision-making.
    Visualization Virtuoso: Witness the transformation of numbers into clear, compelling visuals.
    Data Extraction
    Agriculture & Mining
    Artificial Intelligence
    Machine Learning Model
    Machine Learning
    Data Analysis
    Analytical Presentation
    ETL Pipeline
    ETL
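    To make the ETL description above concrete, here is an illustrative extract-transform-load sketch in Python with pandas; the file names and column names are invented for the example, not drawn from this freelancer's work:
    ```python
    # Illustrative ETL sketch: extract from CSV, clean, load to CSV.
    # All file and column names are hypothetical placeholders.
    import pandas as pd

    def run_etl(source_csv: str, target_csv: str) -> None:
        # Extract: read raw rows from a CSV export.
        raw = pd.read_csv(source_csv)
        # Transform: drop incomplete rows and normalize a text column.
        clean = raw.dropna(subset=["customer_id"])
        clean["region"] = clean["region"].str.strip().str.upper()
        # Load: write the refined output for reporting and analysis.
        clean.to_csv(target_csv, index=False)

    if __name__ == "__main__":
        run_etl("sales_raw.csv", "sales_clean.csv")
    ```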
  • $25 hourly
    I'm a web developer experienced with frontend, backend, and database creation/management, working solo or on small teams of three or fewer.
    Main skills:
    • Python 3, Flask, PostgreSQL, JavaScript, HTML, CSS, Express.js, Node.js, Git, Docker, Bash
    Projects:
    • Space_Bar - Worked on a team of 3 to create a website selling drinks on the galactic market. My main achievements on this project were creating an ETL pipeline, setting up the backend and database, and getting a Docker setup to run it all together for redeployment when necessary.
    • Atlas-Files_Manager - Worked on a team of 2 to create a file management system using mostly Python and JavaScript. We made a simple frontend GUI for viewing files served via APIs from our backend, which was connected to a SQLite database.
    • Scales and Slumbers - Worked on a team of 3 to create a simple shop website for purchasing beds designed for dragons. I took the lead on the backend so that we could serve all the products to the frontend, including descriptions, titles, prices, and images (a minimal endpoint sketch follows the skill tags below).
    Data Extraction
    API Development
    ETL Pipeline
    Bash Programming
    Front-End Development
    Back-End Development
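    A minimal sketch of the product-serving backend pattern described in the Scales and Slumbers project, assuming Flask and an in-memory stand-in for the database; all product data shown is hypothetical:
    ```python
    # Tiny Flask endpoint serving product data to a frontend.
    # The hard-coded list stands in for a PostgreSQL/SQLite query.
    from flask import Flask, jsonify

    app = Flask(__name__)

    PRODUCTS = [  # placeholder rows, not real catalog data
        {"title": "Ember Bed", "description": "Fireproof dragon bed",
         "price": 499.99, "image": "/img/ember.png"},
    ]

    @app.route("/api/products")
    def list_products():
        """Return all products as JSON for the frontend to render."""
        return jsonify(PRODUCTS)

    if __name__ == "__main__":
        app.run(debug=True)
    ```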
  • $24 hourly
    Hello, my name is Jake! Throughout my life I have accomplished a lot of things I never thought I would, from opening and running my own business to flying a plane. Through it all, the main skill that has opened the door to these opportunities has been social media. In 2019, I grew my personal social media platform to a following of 18,000+ people. This audience allowed me to pursue an entrepreneurial route to sustaining myself. I started out by making clothing, such as sweatshirts and hats, and along the way I learned how to track inventory in Excel. That was just a little taste of what was to come: in the summer of 2022 I opened a food truck business called "Boka Bowls". To this day I still run this food truck, and I have grown its social media account just like my personal one. To summarize my skills in one list:
    * Social media expert
    * Marketing expert
    * Business management
    * Thrive in a leadership role
    * Casual Excel user with a promising upside
    * Content creation
    * Photography/videography
    * Social media growth/advertisement
    Data Extraction
    Social Media Audience Research
    Social Media Advertising Tracking
    Social Media Advertising Analytics Report
    Social Media Advertising Analytics
    Social Media Advertising
    Social Media Ad Campaign
    Social Media Account Setup
    Social Media Account Integration
    Market Planning
    Market Analysis
    Freelance Marketing
    Marketing
    Agriculture & Mining
    Microsoft Excel
  • $20 hourly
    SUMMARY
    Experienced Azure and Snowflake Data Engineer, MS SQL Server DBA/Engineer, and Databricks Engineer with 5 years of expertise. Involved in designing, implementing, and optimizing data solutions and managing complex databases. Adept at leveraging advanced tools and techniques to drive data-driven decision-making and ensure high performance and reliability. (A minimal Snowflake connection sketch follows the skill tags below.)
    Data Extraction
    Transact-SQL
    Azure Service Fabric
    Databricks Platform
    Snowflake
    SQL Server Integration Services
    Microsoft Azure Administration
    Microsoft Azure
    Azure Cosmos DB
    Microsoft Azure SQL Database
    Microsoft SQL Server
    Data Mining
    ETL Pipeline
    ETL
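    As a small illustration of the Snowflake side of this profile, here is a hedged connectivity sketch using the official snowflake-connector-python package; every connection value is a placeholder:
    ```python
    # Hedged Snowflake connectivity check; all credentials are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",   # placeholder account identifier
        user="my_user",
        password="my_password",
        warehouse="COMPUTE_WH",
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT CURRENT_VERSION()")  # simple sanity query
        print(cur.fetchone()[0])
    finally:
        conn.close()
    ```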

How hiring on Upwork works

1. Post a job

Tell us what you need. Provide as many details as possible, but don’t worry about getting it perfect.

2. Talent comes to you

Get qualified proposals within 24 hours, and meet the candidates you’re excited about. Hire as soon as you’re ready.

3. Collaborate easily

Use Upwork to chat or video call, share files, and track project progress right from the app.

4. Payment simplified

Receive invoices and make payments through Upwork. Only pay for work you authorize.

Trusted by 5M+ businesses