Web Scraping Jobs

424 jobs were found based on your criteria

Hourly - Entry Level ($) - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
I have an existing Excel spreadsheet that needs to be redeveloped into a more functional one (see the attached Excel file and screenshot). I use Google Hangouts to communicate, so you must too.

Basically, we are filing physical folders into physical cabinets, and we need to track the location of these folders: what cabinet they are in, whether they have been pulled out, and who checked them out. Currently we use Excel with many sheet tabs, one for each file cabinet (History, August, July, Pending, Dereg, etc.), with the same columns on each tab. The column headers are: "Case #", "Name", "Assigned Worker", "Status", "Beg/End Date", "Today's Date", "Name of Person Doing Data Entry", "Notes". Every row is an entry for a unique case #, so I suppose this would be the key.

I would like Today's Date to update automatically whenever a new row is added, the status changes, or a folder is checked out or brought back in. I would also like the name to be pulled in from Windows (VBA can do this). There is also some data validation for drop-downs in some of the columns; I will update you on the exact data validation. Another wish would be some conditional formatting: if a file is "OUT", maybe highlight the whole row yellow.

This will become very large, with lots of rows. We had broken it down by tabs for each location, but I think if we just add the location as a column, we could use pivot tables to view all the files by location and by in/checked-out status at a glance. Let me know how long you think this would take. I can email you the current spreadsheet so you can see what you are working with, or you can start from scratch. This will be put on a network drive and opened by several people who can all update and make changes simultaneously.
Skills: Web scraping, Excel VBA, MySQL Administration, MySQL Programming
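The auto-date and auto-name behavior this posting asks for is ultimately an Excel VBA job (a `Worksheet_Change` handler plus `Environ("USERNAME")`), but the underlying rule is simple. A minimal Python model of that rule, with hypothetical field names, just to make the requirement concrete:

```python
import datetime
import getpass

def stamp_row(row, new_status):
    """Posting's rule: any status change (including OUT/IN) stamps
    today's date and records who made the entry.

    In Excel VBA the user name would come from Environ("USERNAME");
    here we use getpass.getuser() as the closest stdlib analogue.
    """
    if row.get("Status") != new_status:
        row["Status"] = new_status
        row["Today's Date"] = datetime.date.today().isoformat()
        row["Data Entry By"] = getpass.getuser()
    return row

# Hypothetical row keyed by the unique "Case #" the posting describes.
row = stamp_row({"Case #": "2016-001", "Status": "IN"}, "OUT")
```

With "Location" added as a column, as the posting suggests, a single sheet of such rows pivots cleanly by cabinet and by in/out status.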
Fixed-Price - Intermediate ($$) - Est. Budget: $20 - Posted
We need details scraped from this site, https://bookamhs.alaska.gov/book/journey/journeySearch/, and entered into an Excel sheet. A sample Excel data format will be provided to the selected freelancer. The work must be completed within a few hours; do not bid if you are not available for the next 4 hours.
Skills: Web scraping, Data Entry, Data mining, Data scraping
Hourly - Entry Level ($) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I'm looking to hire a developer to scrape a public website using PHP or Python, or another suitable language. The site being scraped uses a unique URL for each page of content. The program to be developed will accept a URL for one of these pages. The output should be a file containing the extracted text from the page, and any images retrieved from the page. Details regarding the site and the expected output will be provided for applicants to review upon request.
Skills: Web scraping
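The extract-text-and-images step this posting describes can be sketched with the Python standard library alone (assuming static HTML; names here are illustrative, and the real site's structure is not known from the posting):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen  # would be used for the actual fetch

class PageExtractor(HTMLParser):
    """Collect visible text and image URLs from one page's HTML."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.text_parts = []
        self.image_urls = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "img":
            src = dict(attrs).get("src")
            if src:
                # Resolve relative image paths against the page URL.
                self.image_urls.append(urljoin(self.base_url, src))

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text_parts.append(data.strip())

def extract_page(html, base_url):
    """Return (extracted text, list of image URLs) for one page."""
    parser = PageExtractor(base_url)
    parser.feed(html)
    return "\n".join(parser.text_parts), parser.image_urls

# The fetch for a given page URL would look like (not run here):
# html = urlopen(page_url).read().decode("utf-8", "replace")
# text, images = extract_page(html, page_url)
```

The output file the posting asks for is then just the returned text plus a download pass over the collected image URLs.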
Fixed-Price - Entry Level ($) - Est. Budget: $100 - Posted
Need someone to find contacts and their contact information for the companies listed in 2 PDFs containing company names. Please see the attached Excel doc for an example of the format. Pull a maximum of 3 contacts per company. Only the desired titles listed below are wanted. Desired titles: President, Owner, Founder, CEO, CRO, CFO, CHRO, COO, Vice President of HR, Vice President of Human Resources, VP of HR, VP of Human Resources, Controller.
Skills: Web scraping, Data Entry, Data mining, Data scraping
Fixed-Price - Intermediate ($$) - Est. Budget: $100 - Posted
Looking for an experienced developer who is familiar with keepa.com's API. Right now I need to extract Best Sellers on Amazon, and I believe it is only one process. It is very simple to set up, and the information is right here: https://keepa.com/#!discuss/t/request-best-sellers/1298 More info here: https://keepa.com/#!discuss/t/how-our-api-plans-work/410

Later, if I'm satisfied with the work, I will also need more advanced API development against keepa to extract the rest of the info from https://keepa.com/#!discuss/t/requesting-products/110, including price history and such. BUT THIS WORK WILL BE LATER. For now, I just need the very simple work of extracting Best Sellers.

Note that the developer must sign up for keepa's 1-token-per-minute plan and also have a key from Amazon AWS. This fee will of course be paid to you, either before (if needed) or after the work is done. No UI is needed; I just need the raw data. keepa.com says that up to 30,000 ASINs can be extracted per main category using their API. I will need all main categories extracted (I believe there are about 20-30 main categories). All I need is the raw data the API returns: basically the 30,000 ASINs in order by rank (keepa says they already list them in order) per category, in an Excel or txt file. This means the work can be done on the developer's own computer, and all that's needed is to send the txt/xls file to me.
Skills: Web scraping, Data mining, Data scraping
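The one-request-per-category job described above might be sketched as follows. The endpoint path and JSON field names (`bestSellersList`, `asinList`) are assumptions drawn from keepa's public discussion pages, not verified here, and the fetch itself needs a real API key:

```python
import json
import urllib.request

# Assumed endpoint shape; confirm against the forum threads linked
# in the posting before relying on it. domain=1 is Amazon.com.
API = "https://api.keepa.com/bestsellers?key={key}&domain=1&category={cat}"

def fetch_best_sellers(key, category_id):
    """One request per main category; keepa returns ASINs already ranked."""
    with urllib.request.urlopen(API.format(key=key, cat=category_id)) as resp:
        data = json.load(resp)
    return data["bestSellersList"]["asinList"]

def write_ranked_asins(asins, path):
    """Emit the raw deliverable: rank and ASIN, one per line."""
    with open(path, "w") as f:
        for rank, asin in enumerate(asins, start=1):
            f.write(f"{rank}\t{asin}\n")
```

Looping `fetch_best_sellers` over the 20-30 main category IDs, with `write_ranked_asins` producing one txt file each, covers the whole deliverable; the 1-token-per-minute plan just means pausing between requests.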
Fixed-Price - Intermediate ($$) - Est. Budget: $200 - Posted
I am listing this as a 1-time project, but it could grow, depending upon how successful the idea is. A qualified candidate will likely already have a similar script in his wheelhouse. I am a recovering developer and am not intimately familiar with PowerShell, but this project needs to be done in PowerShell.

For the purpose of setting scope, the operating environment will be:
I. a Windows 10 client,
II. 'en-US' culture setting,
III. running in the United States,
IV. with PowerShell version 5.0.10586.494 or later.

This is my first Upwork posting, but not my first technical project. As a recovering developer, I may call upon additional PowerShell wizards to vet your skillset if I am uncertain.

Phase 1 (this job): Using PowerShell, create a script (the "Program") and source code to do the following:
1. Query https://www.auction.com --> Residential --> "Mecklenburg County, NC" --> "Foreclosure Sale".
2. Compile and parse the resulting page(s) of results/objects into a single CSV file. Columns:
a. Address
b. City
c. State
d. Zip Code
e. Case Type (Foreclosure, Bank Owned, etc.)
f. Property Type (SFR, Condo, Townhome, Land, etc.)
g. Number of Bedrooms
h. Number of Bathrooms
i. Total Square Footage
j. Case Number
k. Estimated Open Bid
l. Date of Auction
m. Time of Auction
n. Status of Auction (Scheduled, Postponed, Cancelled, Auctioned)
3. Complete the task within 10 days of acceptance.
4. Provide a daily status report, noting difficulties encountered and proposed or enacted solutions.

A look forward: if this project is completed successfully and to our satisfaction, we would like to engage you further to develop this utility as follows:
Phase 2 (not this job): add the ability to store results in a database.
Phase 3+ (not this job): add other site-specific features: add other cities, batch several counties together, include "Bank Owned".
Skills: Web scraping, Data scraping, Windows PowerShell
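The posting requires PowerShell for the actual deliverable; purely to pin down the single-CSV output schema from items (a) through (n), here is a short language-agnostic sketch in Python (column names taken from the posting, row shape assumed):

```python
import csv

# Column order follows the posting's items (a) through (n).
COLUMNS = [
    "Address", "City", "State", "Zip Code",
    "Case Type", "Property Type",
    "Number of Bedrooms", "Number of Bathrooms", "Total Square Footage",
    "Case Number", "Estimated Open Bid",
    "Date of Auction", "Time of Auction", "Status of Auction",
]

def write_results(rows, path):
    """Write parsed listing dicts into the single CSV deliverable;
    missing fields become empty cells rather than errors."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        for row in rows:
            writer.writerow({col: row.get(col, "") for col in COLUMNS})
```

In PowerShell the equivalent step would be piping objects with these properties through `Export-Csv`; the schema is what matters here.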
Fixed-Price - Intermediate ($$) - Est. Budget: $15 - Posted
I need someone to research all the dental offices within three specific counties in Florida, then put the name of the doctor or office, address, city, state, zip, email address, and website address into an Excel spreadsheet for me. Each of those must be a separate column. The Florida zip codes are:
Volusia County: 32130, 32129, 32132, 32759, 32763, 32764, 32141, 32774
Flagler County: 32110, 32136, 32143, 32135, 32137, 32142, 32143, 32164
Seminole County: 32762, 32765, 32766, 32771, 32773, 32772, 32779, 32795, 32701, 32707, 32708, 32714, 32719, 32718, 32730, 32732, 32747, 32750, 32752
Please obtain the addresses through your traditional mining methods, but also search this site for additional information: http://www.mouthhealthy.org/
Skills: Web scraping, Data Entry, Data mining, Data scraping