
Crawlers Jobs

56 jobs were found based on your criteria

Hourly - Intermediate ($$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I do not have a list of companies to crawl, and therefore the solution should include a crawler that will collect the websites of all companies in the USA. ... To start, we can just have New York City, but I would eventually want to scale to the whole country. This would basically be to take everything from examplecompany.com/jobs or examplecompany.com/careers or similar sites and organize all of the results in a database. The crawler should be easy to run on a daily or weekly basis.
Skills: Web Crawler, Data scraping
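A minimal sketch of the kind of crawler this first posting describes, assuming a seed list of company domains is supplied separately (the posting explicitly does not provide one): it probes the common /jobs and /careers paths for each domain and records the results in a SQLite database so the run can be repeated daily or weekly. The SEED_DOMAINS list, table name, and schema are illustrative placeholders.

    # Hypothetical sketch: probe common careers paths for a seed list of company
    # domains and record which ones respond, so the job can be re-run on a schedule.
    import sqlite3
    import requests

    SEED_DOMAINS = ["examplecompany.com"]   # placeholder; a real run needs a company list
    CAREER_PATHS = ["/jobs", "/careers"]    # the paths the posting mentions

    def main():
        db = sqlite3.connect("career_pages.db")
        db.execute("CREATE TABLE IF NOT EXISTS career_pages ("
                   "domain TEXT, url TEXT UNIQUE, status INTEGER, "
                   "checked_at TEXT DEFAULT CURRENT_TIMESTAMP)")
        for domain in SEED_DOMAINS:
            for path in CAREER_PATHS:
                url = f"https://{domain}{path}"
                try:
                    status = requests.get(url, timeout=10).status_code
                except requests.RequestException:
                    status = 0              # unreachable or timed out
                # keep one row per URL, refreshed on every run
                db.execute("INSERT OR REPLACE INTO career_pages (domain, url, status) "
                           "VALUES (?, ?, ?)", (domain, url, status))
        db.commit()
        db.close()

    if __name__ == "__main__":
        main()

Scheduling the daily or weekly run would sit outside the script, for example in cron.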
Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
Looking for someone expert in scraping data from a variety of websites and saving the data to MySQL / CSV. The script has to be Python or PHP. If Python, it should work on a Linux server running a LAMP PHP website. If you are really good, I don't mind offering you a full-time job, as I need hundreds of scraping tasks done over the next 3 months. This job is for 6 websites, but I might need some other small scrapers before I give out the big project. Please answer these:
1. Write 'ddonk' before your application.
2. Let me know if you prefer PHP or Python.
3. Mention which websites you have scraped: Google, LinkedIn, Amazon, Yellow Pages?
4. Show me links to any web applications that do scraping, if you have built any.
5. Do you have a full-time job and freelance part-time, or are you a full-time freelancer?
  • Number of freelancers needed: 3
Skills: Web Crawler, Data mining, Data scraping, Django
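A rough sketch in Python (one of the two languages this posting allows) of the fetch-parse-save pattern being requested: pull a page, extract a couple of fields with BeautifulSoup, and append them to a CSV. The target URL, CSS selectors, and field names are hypothetical; each of the six sites would need its own selectors, and the CSV step could be swapped for MySQL inserts.

    # Generic fetch-parse-save sketch; the URL and selectors are placeholders.
    import csv
    import requests
    from bs4 import BeautifulSoup

    TARGET_URL = "https://example.com/listings"   # placeholder target site
    OUTPUT_CSV = "scraped_rows.csv"

    def text_or_blank(node):
        return node.get_text(strip=True) if node else ""

    def scrape(url):
        html = requests.get(url, timeout=15).text
        soup = BeautifulSoup(html, "html.parser")
        rows = []
        for item in soup.select(".listing"):      # placeholder selector
            rows.append({
                "title": text_or_blank(item.select_one(".title")),
                "price": text_or_blank(item.select_one(".price")),
            })
        return rows

    def save(rows):
        with open(OUTPUT_CSV, "a", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=["title", "price"])
            if f.tell() == 0:                     # write the header only for a new file
                writer.writeheader()
            writer.writerows(rows)

    if __name__ == "__main__":
        save(scrape(TARGET_URL))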
Fixed-Price - Expert ($$$) - Est. Budget: $50 - Posted
Hi, our insurance brokerage needs someone to scrape the Contractors State License Board of California's website (www.cslb.ca.gov) to obtain the Workers' Compensation data listed below and format it into an Excel spreadsheet. We are only looking for businesses that have "active" licenses. We already have an Excel list we can provide as an example; it just needs to be updated with current dates, as it has been a while since it was last updated. I plan to ask for an updated list every 2-3 months and would like to find someone I can consistently go to to update the list. Please let us know if you need any further details before applying and bidding. The data needed: Contractors State License Number, Business Name, Classification Type (Plumbing, Landscaping, etc.), First & Last Name of Owner (contact), Policy Number, Current Carrier, Renewal Date, Address, City, Zip Code, State, Phone Number(s), and emails if available.
Skills: Web Crawler, Data mining
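An output-side sketch only for this posting, showing how scraped license records could be written into the spreadsheet layout it describes using openpyxl. The fetch_active_licenses() function is a named placeholder; the actual lookup flow on www.cslb.ca.gov is not reproduced here.

    # Write license records into an Excel sheet with the columns the posting lists.
    from openpyxl import Workbook

    COLUMNS = ["License Number", "Business Name", "Classification Type", "Owner",
               "Policy Number", "Current Carrier", "Renewal Date", "Address",
               "City", "Zip Code", "State", "Phone Number(s)", "Email"]

    def fetch_active_licenses():
        # Placeholder: should return dicts keyed by COLUMNS, filtered to
        # businesses whose licenses are "active".
        return []

    def write_workbook(records, path="cslb_workers_comp.xlsx"):
        wb = Workbook()
        ws = wb.active
        ws.append(COLUMNS)                        # header row
        for rec in records:
            ws.append([rec.get(col, "") for col in COLUMNS])
        wb.save(path)

    if __name__ == "__main__":
        write_workbook(fetch_active_licenses())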
Fixed-Price - Intermediate ($$) - Est. Budget: $250 - Posted
I need a freelancer to develop a web crawler that scrapes data off VRBO for specific geographic markets and stores listing data and calendar availability in an Excel format. ... The scraper must be able to meter itself to avoid getting refused by the website for too many requests to the server. I don't want my IP to get blocked!
A. Crawler to crawl VRBO.com to identify all geographic destinations on the site
B. ... Publish: inventory statistics by market
1. Inventory Pull function: the crawler goes to VRBO and scrapes all data about all properties for the user-selected geographies. Examples of data for each listing include:
  • We need to track multiple levels (parents and children) of geographies (e.g., Hawaii -> Maui -> South -> Kihei)
  • Description: Name of Property (e.g., Grand Champion), Listing Identification Number, Listing Name
  • Property Type: Condo, House, 1 bedroom, 1,442 sq. ft., etc.
  • Unit Number (for condos; e.g., Grand Champion Unit #75)
  • Number of bedrooms, number of bathrooms, how many people it sleeps
  • Size of home
  • Minimum stay requirements
  • Low season cost per night, high season cost per night, low season cost per week, high season cost per week, holiday cost per night, holiday cost per week, etc.
Skills: Web Crawler, Data scraping, Web scraping
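A sketch of the self-metering requirement this posting calls out: a fetch helper that spaces requests with a jittered delay and backs off exponentially when the server answers with 429 or a 5xx error. The delay values, retry limit, and example URL are assumptions, not VRBO-specific figures, and the actual listing and calendar parsing is not shown.

    # Rate-limited fetcher: pace requests and back off when the server pushes back.
    import random
    import time
    import requests

    MIN_DELAY = 3.0        # assumed polite gap between requests, in seconds
    MAX_RETRIES = 5

    def polite_get(url, session=None):
        session = session or requests.Session()
        for attempt in range(MAX_RETRIES):
            time.sleep(MIN_DELAY + random.uniform(0, 2))   # jittered pacing
            resp = session.get(url, timeout=20)
            if resp.status_code == 429 or resp.status_code >= 500:
                # server is refusing or struggling: wait longer before retrying
                time.sleep(MIN_DELAY * (2 ** attempt))
                continue
            resp.raise_for_status()
            return resp
        raise RuntimeError(f"gave up on {url} after {MAX_RETRIES} attempts")

    if __name__ == "__main__":
        page = polite_get("https://example.com/listings?page=1")   # placeholder URL
        print(len(page.text))

Reusing a single Session object across calls also reuses connections, which is both faster and easier on the target server.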
Hourly - Expert ($$$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
We are looking for a developer / data scientist with experience in building web crawlers that scrape and index data continuously. ... _encoding=UTF8&ie=UTF8&node=3017941&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-6&pf_rd_r=0JM08XCZ52G2S9XQKC2N&pf_rd_t=101&pf_rd_p=2265474122&pf_rd_i=502394 2. The crawler will crawl the page to find new products, read the content on each product page, and extract the following: Product Name, Seller's Name, Number of Customer Reviews, Number of Answered Questions, Total Number of Star Ratings. 3. ... Continuously / periodically (i.e. it doesn't need to be running all the time, but it needs to be an automated process), the web crawler needs to continue to monitor all links to identify any new products that have come out, then bring them into our database without creating duplicates.
Skills: Web Crawler, Data mining, Data scraping, Web scraping
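A sketch of the no-duplicates requirement in this last posting, assuming each product exposes some stable identifier: rows are inserted with INSERT OR IGNORE against a primary key, so a periodic re-crawl only adds products that have not been seen before. The extract_products() parser and the column set are placeholders rather than Amazon-specific code.

    # Store only previously unseen products, keyed by a stable product identifier.
    import sqlite3

    def extract_products(html):
        # Placeholder parser: should yield dicts with a stable "product_id" plus
        # name, seller, review count, answered questions, and star rating.
        return []

    def store_new_products(products, db_path="products.db"):
        db = sqlite3.connect(db_path)
        db.execute("CREATE TABLE IF NOT EXISTS products ("
                   "product_id TEXT PRIMARY KEY, name TEXT, seller TEXT, "
                   "reviews INTEGER, answered_questions INTEGER, stars REAL)")
        new_rows = 0
        for p in products:
            before = db.total_changes
            db.execute("INSERT OR IGNORE INTO products VALUES (?, ?, ?, ?, ?, ?)",
                       (p["product_id"], p["name"], p["seller"],
                        p["reviews"], p["answered_questions"], p["stars"]))
            new_rows += db.total_changes - before   # 0 when the id was already stored
        db.commit()
        db.close()
        return new_rows

    if __name__ == "__main__":
        print(store_new_products(extract_products("")), "new products stored")

Running this on a schedule (cron, a task queue, etc.) covers the "automated process" part without the crawler having to stay up permanently.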