Web Scraping Jobs

385 jobs were found based on your criteria

Fixed-Price - Entry Level ($) - Est. Budget: $100 - Posted
I need someone to write a script that will:
1) navigate to a web page,
2) select a class from a drop-down menu,
3) insert a date from a text list into another field,
4) click a "Submit Search" button,
5) navigate to a new window containing the search results,
6) save the results page as an HTML file,
7) click a link in each of the rows of the results page,
8) save each new results page as an HTML file,
9) move on to the next row and repeat the process until the last row on the page,
10) check for additional pages listed in the pagination "comb",
11) close the results window,
12) return to the search page, verify the same class is selected, insert a new date, press "Submit Search", and repeat the process until done.
The chosen applicant will write the necessary script and perform the web scraping operation for approximately 115 dates. Complete specifications will be provided to selected qualified applicants.
Skills: Web scraping, Data scraping, Python
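The scripted-browsing loop described in this post could be sketched with Selenium roughly as follows. This is a minimal sketch, not the poster's specification: the element IDs (`classSelect`, `searchDate`, `submitSearch`), the results-table selector, and the example class name "Civil" are all assumptions that would come from the real page.

```python
# Hedged sketch of the navigate / search / save-HTML loop.
# Requires: pip install selenium, plus a matching browser driver.
from pathlib import Path


def result_filename(date_str, row=None, out_dir="results"):
    """Build a stable output path for a saved results page."""
    stem = date_str.replace("/", "-")
    name = f"{stem}.html" if row is None else f"{stem}_row{row}.html"
    return str(Path(out_dir) / name)


def save_html(path, html):
    Path(path).parent.mkdir(parents=True, exist_ok=True)
    Path(path).write_text(html, encoding="utf-8")


def scrape_all(dates, search_url):
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import Select

    driver = webdriver.Chrome()
    for date_str in dates:
        driver.get(search_url)
        # steps 2-3: pick the class, insert the date (ids are guesses)
        Select(driver.find_element(By.ID, "classSelect")).select_by_visible_text("Civil")
        field = driver.find_element(By.ID, "searchDate")
        field.clear()
        field.send_keys(date_str)
        # steps 4-5: submit and switch to the new results window
        driver.find_element(By.ID, "submitSearch").click()
        driver.switch_to.window(driver.window_handles[-1])
        # step 6: save the results page itself
        save_html(result_filename(date_str), driver.page_source)
        # steps 7-9: visit each row link and save its page
        links = [a.get_attribute("href")
                 for a in driver.find_elements(By.CSS_SELECTOR, "table.results a")]
        for row, href in enumerate(links, start=1):
            driver.get(href)
            save_html(result_filename(date_str, row), driver.page_source)
            driver.back()
        # steps 11-12: close the results window, return to the search page
        driver.close()
        driver.switch_to.window(driver.window_handles[0])
    driver.quit()
```

Step 10 (extra result pages) would slot into the per-date loop by repeating the row walk for each page link the pagination control exposes.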
Fixed-Price - Intermediate ($$) - Est. Budget: $60 - Posted
Thank you. I would like to start with the following task.

TASK: I need to find the best car repair shop or mechanic in either Los Angeles County or Orange County, California, to repair or replace a catalytic converter and a front header pipe. To determine which shop or mechanic is the "best", use these three (3) criteria:
1. Price (lowest/cheapest price),
2. Quality (trustworthiness: are lots of previous customers very satisfied with their work?), and
3. Speed (fast: must complete the full repair/replacement in one day, assuming all necessary parts are ordered).

Attached is a document explicitly explaining how to measure these three criteria, along with negotiation tactics to reduce the price as much as possible. Be creative with which sources you use to find these repair shops and mechanics; you are the web research expert. As an example, you can search yellowpages.com for "Car Repair & Service" near Los Angeles, CA or Anaheim, CA, then call shops with 5-star, 4.5-star, and 4-star ratings to find out how much they charge and get a custom price. The repair shop or mechanic can be anywhere in Los Angeles County or Orange County. Look at a map to see where the county boundary lines are, so you know which cities fall inside them and where you can research. Try lots of different cities to get an idea of which ones will provide the lowest prices with consistently great quality and speed.

When you find a qualified mechanic or repair shop, please enter them into a spreadsheet. Include:
1) name,
2) address,
3) price,
4) quality rating,
5) a description of how you found this price / how the phone discussion went (a few words is fine unless there were complications),
6) a description of any recent negative reviews/ratings you found, and
7) any other comments you may have.

REMEMBER: they are only qualified if they can complete the repair/replacement in one day's time. I need 10-20 qualified mechanics/repair shops with prices under $1,700 for the job. This assignment must be completed within 10 hours. There are bonuses, so aim for more than 10-20 mechanics/shops if you start to get close to the bonus levels. Please call me beforehand for an interview.

P.S. There are bonuses of $100 up to $250 if you perform above and beyond the call of duty and find a truly amazing and unbeatable deal. Details are in the attached worksheet.
Skills: Web scraping, Administrative Support, Data Entry, Data mining
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Hello, we need a website scraped for the products it offers. The product options/attributes will be entered into an Excel sheet; we have a sample Excel file showing how it should be filled out. We want one sample product done for testing before starting the full project. We can share the website for scraping and the Excel file if you are interested in bidding on the project. Looking forward to hearing from you. Thad
Skills: Web scraping, Web Crawling, Data Entry, Data mining
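The "one sample product" pass this post asks for could be sketched as below. The product fields and the column layout are assumptions (the real sample Excel file would dictate the column order), and the scraped item here is placeholder data rather than output from the poster's site.

```python
# Hedged sketch: flatten one scraped product and its option/attribute
# variants into spreadsheet rows, one row per option combination.
def product_rows(product):
    base = [product["name"], product["sku"], product["price"]]
    options = product.get("options") or [{}]  # at least one row per product
    rows = []
    for opt in options:
        rows.append(base + [opt.get("color", ""), opt.get("size", "")])
    return rows


# Placeholder sample product (field names are assumptions).
sample = {
    "name": "Widget", "sku": "W-100", "price": "9.99",
    "options": [{"color": "red", "size": "M"}, {"color": "blue", "size": "L"}],
}
rows = product_rows(sample)

# Writing the rows to .xlsx would use a library such as openpyxl, e.g.:
#   from openpyxl import Workbook
#   wb = Workbook(); ws = wb.active
#   ws.append(["Name", "SKU", "Price", "Color", "Size"])
#   for r in rows: ws.append(r)
#   wb.save("sample_product.xlsx")
```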
Fixed-Price - Entry Level ($) - Est. Budget: $46 - Posted
I have a very simple website. I need someone to copy the image, SKU, and description for each item and put them into a file for Shopify, in order to re-upload the merchandise. A very simple, easy task.
Skills: Web scraping
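Turning scraped (image, SKU, description) triples into a Shopify import file could look something like this sketch. The column names follow Shopify's product CSV template; the item data and the title-based handle slug are illustrative assumptions.

```python
# Hedged sketch: build a Shopify product-import CSV from scraped items.
import csv
import io
import re


def handle_from(title):
    """Shopify handles are lowercase, hyphen-separated slugs."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")


def to_shopify_csv(items):
    buf = io.StringIO()
    w = csv.writer(buf)
    # Header names from Shopify's product CSV template.
    w.writerow(["Handle", "Title", "Body (HTML)", "Variant SKU", "Image Src"])
    for it in items:
        w.writerow([handle_from(it["title"]), it["title"],
                    it["description"], it["sku"], it["image"]])
    return buf.getvalue()


# Placeholder scraped item (values are assumptions).
items = [{"title": "Blue Mug", "sku": "MUG-01",
          "description": "<p>A blue mug.</p>",
          "image": "https://example.com/mug.jpg"}]
csv_text = to_shopify_csv(items)
```

The resulting file can be uploaded through Shopify's Products > Import dialog.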
Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
I need to create a script or program that will help me automate my keyword research process using https://adwords.google.com/KeywordPlanner. Below are the primary steps involved:

1) On the first screen, I want to be able to input the city and state.

2) On the second screen, the city indicated on the first screen should automatically be used as the basis for all keyword research. I would then indicate which keyword group categories I want to perform keyword research for, based on a database where I can add and remove keywords. I also want the option to categorize and sub-categorize keywords, to help me easily find keyword groups to use. For example: a potential category could be Dentist, the topic could be Cosmetic Dentistry, the sub-topic could be Invisalign, and I would then add a list of relevant keywords like Invisalign, Invisalign Cost, What is Invisalign, and Clear Braces (at least category > topic > sub-topic > keyword group is needed). So, if Los Angeles is listed as the location and I indicate I want to do keyword research for that geo-target combined with all keywords in the "Bad Breath" keyword group (found at Dentist > Dentistry > Oral Hygiene > Bad Breath), your software would concatenate all of the keywords in that list in normal and reversed order (i.e. los angeles bad breath cause, bad breath cause los angeles, bad breath causes los angeles, los angeles bad breath causes, etc.). Once the list of geo-targeted keywords is compiled, I want the software to run them through the "Search for new keywords using a phrase, website or category" feature found at https://adwords.google.com/KeywordPlanner, but at the state level: the keywords would reflect Los Angeles, while the tool would select California as the target location. It would also be great to have the option to include a URL for the "Your landing page" field.

3) The tool should then automatically download and export all of the relevant keyword ideas data as a CSV document. Keep in mind that when I do keyword research, I'll want the ability to research numerous keyword groups for a website, so it's important that the keyword data be segregated, making it easier to go through the data and find useful keywords specific to each keyword group.

4) Once the above is done, I need to manually review the keyword data for each keyword group and decide which keywords I would like to track/target. If you can help automate the filtering of keywords based on conditions I state for each search (such as filtering the list to only show keywords that include the related geo-target, or any of the keywords used in the searched keyword group, or simply any keyword relevant to what I type in), that could speed up the process tremendously. Finally, I would like to export the chosen data as indicated on the "Final List" tab found at https://docs.google.com/spreadsheets/d/10EeZxvdlkdYKEdGsAPuodXn0IU3Op3PbHjGoyJAZ6H8/edit?usp=sharing. For the landing page, I would want it to reflect the URL I input in step 2 above; otherwise I can populate it manually if I choose not to indicate one during the keyword research process.
Skills: Web scraping, API Development, Data scraping, MySQL Programming
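The concatenation step in the post above (geo-target combined with each keyword in a group, in normal and reversed order) is the most mechanical piece, and could be sketched as below. The nested-dict shape of the category > topic > sub-topic > keyword-group hierarchy is an assumption; actually pulling keyword ideas afterwards would go through the Google Ads API rather than the Keyword Planner web UI, which this sketch does not attempt.

```python
# Hedged sketch of the geo/keyword-group combination step.
# Hierarchy modeled as nested dicts (structure assumed, not specified).
keyword_db = {
    "Dentist": {"Dentistry": {"Oral Hygiene": {
        "Bad Breath": ["bad breath cause", "bad breath causes"]}}},
}


def get_group(db, path):
    """Walk category > topic > sub-topic > group to a keyword list."""
    node = db
    for key in path:
        node = node[key]
    return node


def geo_combine(geo, keywords):
    """Concatenate the geo-target with each keyword, both orders."""
    out = []
    for kw in keywords:
        out.append(f"{geo} {kw}")   # geo first
        out.append(f"{kw} {geo}")   # geo last
    return out


group = get_group(keyword_db, ["Dentist", "Dentistry", "Oral Hygiene", "Bad Breath"])
combined = geo_combine("los angeles", group)
```

For the example group this yields the four variants the poster lists (los angeles bad breath cause, bad breath cause los angeles, and the plural pair).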
Hourly - Entry Level ($) - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
I need to get some data from a website.
Skills: Web scraping
Fixed-Price - Expert ($$$) - Est. Budget: $20 - Posted
We are developing a service designed to monitor and prevent employees from falling asleep during their shift, as well as to monitor their concentration during long and repetitive tasks. This is particularly useful for staff such as security guards and doormen. We need statistics about the number of employees who fall asleep on the job, the dangers of this, how it affects companies, and other stats that may be helpful in selling this product, including how concentration is affected by tiredness. We are looking for 10 separate statistics backed by research, each with its source. This can be a simple Word doc with the stats and their sources.
Skills: Web scraping, Data mining, Internet research, Research
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Goal: Automate the initial opportunity identification process.

Problem: The initial opportunity identification process is tedious and time-consuming. We subscribe to a site (fbo.gov) that publishes solicitation notices and documents on the web. We've built saved searches based on keywords and other conditions, which run daily and email us results. Those emails contain core data that we use to filter the opportunities into a Potential Opportunity Report, for example:

Title:      Switchgear Maintenance
Sol. #:     2016-Q-66019
Agency:     Department of Health and Human Services
Office:     Centers for Disease Control and Prevention
Location:   Procurement and Grants Office (Atlanta)
Posted On:  Aug 25, 2016 4:44 pm
Base Type:  Combined Synopsis/Solicitation
Link:       https://www.fbo.gov/spg/HHS/CDCP/PGOA/2016-Q-66019/listing.html

That Potential Opportunity Report is an Excel spreadsheet listing the title, opportunity type (solicitation, sources sought, etc.), document number, set-aside type, location, posting date, response date, and a link to the publication. From there, our business development team holds a Go/NoGo meeting, in which they filter those opportunities into Go (pursue) or NoGo (do not pursue).

Needs: We need to minimize the time it takes to get to the Go/NoGo phase. I understand it is possible to build a program that will (a) search the database (fbo.gov) for relevant opportunities, then (b) input the data for those opportunities into a spreadsheet similar to the one described above. If possible, I would also like a way to exclude publications classed "NoGo" from re-appearing in the opportunity report.

Freelancers interested in this project should:
(1) briefly describe how they would solve this problem,
(2) roughly estimate the time it will take to produce a solution, and
(3) describe past projects similar to this one.
Skills: Web scraping, Data mining, Data scraping, Microsoft Excel
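Two pieces of this pipeline lend themselves to a short sketch: parsing the core-data block from a daily search email, and excluding solicitations already marked NoGo. The field labels match the example record in the post; the NoGo list being keyed on solicitation numbers is an assumption, as is the email body below, which is a trimmed copy of the post's example.

```python
# Hedged sketch: parse a core-data record and filter out NoGo items.
def parse_record(text):
    """Turn 'Label: value' lines into a dict (split at the first colon)."""
    rec = {}
    for line in text.strip().splitlines():
        label, _, value = line.partition(":")
        rec[label.strip()] = value.strip()
    return rec


def new_opportunities(records, nogo_numbers):
    """Keep only records whose solicitation number is not marked NoGo."""
    return [r for r in records if r.get("Sol. #") not in nogo_numbers]


email_body = """\
Title: Switchgear Maintenance
Sol. #: 2016-Q-66019
Agency: Department of Health and Human Services
Posted On: Aug 25, 2016 4:44 pm
Link: https://www.fbo.gov/spg/HHS/CDCP/PGOA/2016-Q-66019/listing.html
"""
rec = parse_record(email_body)
fresh = new_opportunities([rec], nogo_numbers={"2016-Q-12345"})
```

The surviving records would then be appended to the Potential Opportunity Report spreadsheet, and solicitation numbers from each Go/NoGo meeting fed back into the NoGo set.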
Fixed-Price - Intermediate ($$) - Est. Budget: $100 - Posted
Need an experienced professional to scrape a website for at least 100,000 emails. The email list must be delivered clean of spam traps, bouncing addresses, etc.; all emails must be live and verified! I will provide the website URL to interested parties prior to awarding the job, so you can estimate the total number of emails and the cost. The job cost must be fixed. Let me know if you need anything else. Thanks
Skills: Web scraping, Data mining, Data scraping