
Web Scraping Jobs

156 jobs were found based on your criteria


Hourly - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
I need someone who can help me build out a system to crawl roughly 5-6 different systems (which are all behind a login). The data then needs to be displayed by day, broken down by campaign, in a Google Spreadsheet. The whole system should be automated. Please let me know, with specific examples, what experience you have with Import.IO. Thanks, Jeff
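
A minimal sketch of the kind of pipeline this posting describes, assuming a single source system with a plain form login, requests/BeautifulSoup for extraction, and gspread with a service account for the Google Spreadsheet step; every URL, form field, selector, and sheet name below is a placeholder, and Import.IO itself is not used here:

import datetime
import requests
from bs4 import BeautifulSoup
import gspread

LOGIN_URL = "https://example-dashboard.com/login"        # placeholder login page
REPORT_URL = "https://example-dashboard.com/campaigns"   # placeholder report page

# Authenticate against one of the systems behind a login.
session = requests.Session()
session.post(LOGIN_URL, data={"username": "...", "password": "..."})

# Pull the campaign table for today.
soup = BeautifulSoup(session.get(REPORT_URL).text, "html.parser")
today = datetime.date.today().isoformat()
rows = []
for tr in soup.select("table#campaigns tr")[1:]:          # hypothetical table id, skip header
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    rows.append([today] + cells)                          # date, campaign, metrics...

# Append one row per campaign per day to the shared Google Sheet.
gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open("Campaign Metrics").sheet1
for row in rows:
    worksheet.append_row(row)

For sources that Import.IO already supports, its extractors could stand in for the requests/BeautifulSoup portion; the daily, per-campaign append to the sheet would stay the same, driven by cron or a similar scheduler.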

Fixed-Price - Est. Budget: $50 - Posted
I require a successful candidate to generate a list of businesses in Utah, United States. The list should contain at least 10,000 entries. I require the following information to be included in an Excel spreadsheet format: Business Name | Website | Owner Name | Email | Address | City | State | Zip Code | Phone. If this task is completed in a professional manner, we will expand to additional states.

Hourly - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I need to collect business e-mails from business directories (Manta, Hoovers, YellowPages, etc.) AND also collect e-mails from Craigslist postings. NO MANUAL COLLECTION - I need someone who has bots and scripts that can automate this and collect e-mails in bulk, quickly.

Fixed-Price - Est. Budget: $300 - Posted
### Expert Level ### Please reply with real-world examples. If you have experience with a bloom filter, please let me know. Hi, I have a version of the application already built, but it isn't performing as I had hoped. I need a developer to help me improve the application in terms of performance and accuracy. Below is what I want to happen, so be clear in your replies that you can perform this type of work.
- Web scraping sort of works, but it really needs to become multi-threaded and a lot more robust (it currently breaks a lot); the basic stack is below, but I'm not averse to other technologies being used.
- The crawler aims to collect hundreds of millions of rows of data from numerous content networks, so the crawler needs to be able to manage that number of rows and the complications that brings to the table.
- Post content to social networks; the current app doesn't post in the correct format (weird categories), so that needs to be fixed. Basic headlines.
- Create multi-threaded...
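
On the Bloom filter question: below is a minimal, self-contained Python sketch of how one is typically used to deduplicate URLs in a large crawl. The capacity and error rate are illustrative only and are not tuned for hundreds of millions of rows, and this is not the poster's existing stack (which isn't shown).

import hashlib
import math

class BloomFilter:
    """Probabilistic 'have I seen this URL before?' set with a bounded false-positive rate."""

    def __init__(self, expected_items, false_positive_rate=0.01):
        # Standard sizing formulas: m = -n*ln(p)/ln(2)^2 bits, k = (m/n)*ln(2) hash functions.
        self.size = int(-expected_items * math.log(false_positive_rate) / (math.log(2) ** 2))
        self.num_hashes = max(1, int(self.size / expected_items * math.log(2)))
        self.bits = bytearray((self.size + 7) // 8)

    def _positions(self, item):
        # Double hashing: derive k bit positions from two halves of a SHA-256 digest.
        digest = hashlib.sha256(item.encode("utf-8")).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big")
        return [(h1 + i * h2) % self.size for i in range(self.num_hashes)]

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

# Illustrative capacity only; a crawl at the scale described above would size (and shard) this differently.
seen = BloomFilter(expected_items=1_000_000)
seen.add("https://example.com/page/1")
print("https://example.com/page/1" in seen)   # True
print("https://example.com/page/2" in seen)   # False with high probability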

Fixed-Price - Est. Budget: $40 - Posted
I run an educational listserv for Arabic language learners. I am requesting a utility that captures the (a) title, (b) summary, (c) URL link, and (d) site title from all news articles published on the *prior* day for these three URLs, which share the same DOM structure:
1) http://www.bbc.com/arabic/middleeast
2) http://www.bbc.com/arabic/worldnews
3) http://www.bbc.com/arabic/business
This utility should run once daily (preferably after midnight GMT) and output clean data into an RSS feed. Remember that no pictures or full articles are necessary, and that this should only collect articles timestamped for the prior day rather than the current day's news. Sorry that I don't have a big budget, but I do make sure to leave very positive reviews.
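
A rough sketch of such a utility, assuming requests and BeautifulSoup for fetching and parsing and the standard library for the RSS output; the CSS selectors and the datetime attribute are placeholders, since the actual shared DOM structure of the three pages would need to be inspected first:

import datetime
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

SECTIONS = [
    "http://www.bbc.com/arabic/middleeast",
    "http://www.bbc.com/arabic/worldnews",
    "http://www.bbc.com/arabic/business",
]

yesterday = (datetime.datetime.utcnow() - datetime.timedelta(days=1)).date().isoformat()

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "BBC Arabic - prior day"
ET.SubElement(channel, "link").text = SECTIONS[0]
ET.SubElement(channel, "description").text = "Articles published yesterday (GMT)"

for url in SECTIONS:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    site_title = soup.title.get_text(strip=True) if soup.title else url
    for article in soup.select("article"):                       # placeholder selector
        time_tag = article.find("time")
        if not time_tag or not time_tag.get("datetime", "").startswith(yesterday):
            continue                                             # keep only yesterday's articles
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = article.h3.get_text(strip=True) if article.h3 else ""
        ET.SubElement(item, "description").text = article.p.get_text(strip=True) if article.p else ""
        link = article.find("a")
        ET.SubElement(item, "link").text = link["href"] if link else url
        ET.SubElement(item, "source").text = site_title

ET.ElementTree(rss).write("bbc_arabic_prior_day.xml", encoding="utf-8", xml_declaration=True)

Running it once daily after midnight GMT would then just be a cron entry on whatever machine hosts it.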

Hourly - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
Green Pasture Properties is seeking an experienced Excel Data Specialist for 2-5 hrs of work per week, with the potential for 5-10 hrs per week in the future. The work will include data mining, data scraping, data sorting, and data entry. I will supply you with large Excel spreadsheets, typically with 10-20 columns and up to 1,000+ rows in some cases. Each spreadsheet is missing critical information that can only be found on the web. I will supply you with the exact website where the needed information can be found. You will simply need to take certain data and enter it into the correct row/cell within the existing spreadsheet. Experience with automation tools such as iMacros will be extremely helpful for this work. For my first spreadsheet, the workload should not take more than 2-3 hours. If you're able to complete this work proficiently, I will send you more spreadsheets to help with.
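
iMacros replays recorded browser macros; as an alternative sketch of the same fill-in step in Python with openpyxl, assuming a hypothetical "Phone" column to fill and a placeholder lookup site (the client's actual website, column layout, and file names would differ):

import requests
from bs4 import BeautifulSoup
from openpyxl import load_workbook

LOOKUP_URL = "https://example-directory.com/search?q={}"   # placeholder for the client's site

wb = load_workbook("properties.xlsx")                      # hypothetical file name
ws = wb.active
NAME_COL, PHONE_COL = 1, 12                                # hypothetical column positions

for row in ws.iter_rows(min_row=2):                        # skip the header row
    name_cell, phone_cell = row[NAME_COL - 1], row[PHONE_COL - 1]
    if phone_cell.value:                                   # cell already filled in
        continue
    page = requests.get(LOOKUP_URL.format(name_cell.value)).text
    match = BeautifulSoup(page, "html.parser").select_one(".phone")   # hypothetical selector
    if match:
        phone_cell.value = match.get_text(strip=True)

wb.save("properties_filled.xlsx")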

Fixed-Price - Est. Budget: $100 - Posted
Hey there, we're looking for an agent to help us identify new companies within our selected target industries, as well as contacts within those companies. The target industries are:
- Business Intelligence
- Customer Experience Management
- Market Research
- Social Media Monitoring
- Voice of Customer
These companies should be English speaking. Although you shouldn't discriminate geographically, these companies are typically located in North America, Europe, India, and China. We are searching for applicants who possess:
- Strong and proven research skills
- A detail-oriented personality
- Reliability
- Proficiency with Excel
You would receive a list of companies that are already known to us, so as to avoid any overlap during your search. Payment is done on a per-company and per-contact basis, meaning it's gauged on how many companies and contacts you find. The project is expected to be ongoing, and as you continue working with us on future projects, the constraints...

Hourly - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
Looking for SEVERAL candidates to help collect e-mail addresses of U.S. companies in the "Business Financing" industry AND submit contact-us forms (bots). You would need to generate a spreadsheet with e-mail addresses ONLY, which will be verified for valid e-mail information. Recommended sources to scrape/collect include:
- Google Search
- LinkedIn
- Business directories (Manta, Hoovers, Jigsaw, Lead411... and others)
- Craigslist job postings
Keywords for the industry include:
- Merchant Cash Advance
- Business Cash Advance
- Alternative Business Financing
- Working Capital Loans

Fixed-Price - Est. Budget: $500 - Posted
Hi, this is a pretty simple scraping project, but the output needs to be an application that I can install and run myself (or that runs in a hosted environment). The application itself will:
a) Authenticate with LinkedIn as a user. Inputs for this step: username, password, verification code (may be required if running as a web app).
b) Visit a specific user's profile. Input for this step: target profile URL.
c) Scrape all the 1st-degree connections for that user (this requires paginating through the contacts...). Outputs for this step (as CSV for all contacts of the target): Name, LinkedIn profile URL, Company, Current Role, Company URL (if available), Work history.
Requirements:
- This can run as a web app OR as a desktop app (I'm open to either).
- If creating a desktop app, it needs to run on OS X Yosemite or higher.
- If creating a web app, you'll need to set up the server environment and provide a secured web interface for inputting the parameters.
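
A hedged sketch of step (c) only, paginating through a connections listing and writing the requested columns to CSV; the URL pattern and selectors are placeholders, the authenticated session from step (a) is assumed to already exist, and LinkedIn's real markup and terms of use would need to be checked before building this for real:

import csv

import requests
from bs4 import BeautifulSoup

session = requests.Session()   # assume step (a) has already authenticated this session
PAGE_URL = "https://www.linkedin.com/in/target-profile/connections?page={}"   # placeholder

FIELDS = ["Name", "LinkedIn profile URL", "Company", "Current Role",
          "Company URL", "Work history"]

with open("connections.csv", "w", newline="", encoding="utf-8") as fh:
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    page = 1
    while True:
        soup = BeautifulSoup(session.get(PAGE_URL.format(page)).text, "html.parser")
        cards = soup.select("li.connection-card")          # hypothetical selector
        if not cards:
            break                                          # no more pages to walk
        for card in cards:
            link = card.select_one("a")
            writer.writerow({
                "Name": card.select_one(".name").get_text(strip=True),
                "LinkedIn profile URL": link["href"] if link else "",
                "Company": card.select_one(".company").get_text(strip=True),
                "Current Role": card.select_one(".role").get_text(strip=True),
                "Company URL": "",    # only present for some connections
                "Work history": "",   # would require visiting each connection's profile
            })
        page += 1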

Fixed-Price - Est. Budget: $1,000 - Posted
We are looking for someone to help create an Airbnb web scraping plugin for WordPress that imports Airbnb listings based on zip code, city name, or room type. The database fields, backend inputs, and front-end output are already created using the WP Residence WordPress theme. The plugin should scrape the specific information from the Airbnb listings and import it into the correct WordPress database tables/fields. The import will be a manual process where you enter either the zip code or city in WordPress (for example > https://www.airbnb.com/s/Seattle--WA); it should then query Airbnb and get all available results for the location. Once all the results have been discovered, it should confirm if a certain number of listings will be imported and import each unique listing (for example https://www.airbnb.com/rooms/7203408?s=JzNU). It should also check whether a listing has previously been imported and skip those. Import options should include importing listings based on "Room...
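
The production version of this would be PHP inside the WordPress plugin, but the discover-and-skip-duplicates flow the posting describes can be sketched as follows; the search URL, regex, and the already-imported set are placeholders (in the plugin that check would be a database lookup), and Airbnb's search pages may in practice require a headless browser or their internal search API:

import re

import requests
from bs4 import BeautifulSoup

SEARCH_URL = "https://www.airbnb.com/s/{}"          # e.g. "Seattle--WA" or a zip code
already_imported = {"7203408"}                      # in the plugin this would be a DB lookup

def discover_listing_ids(location):
    # Listing links look like /rooms/<id>; pull the numeric IDs out of the search page.
    html = requests.get(SEARCH_URL.format(location)).text
    return set(re.findall(r"/rooms/(\d+)", html))

def import_listing(listing_id):
    url = "https://www.airbnb.com/rooms/{}".format(listing_id)
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    # ...map title, price, room type, etc. onto the WP Residence fields here
    print("would import {}: {}".format(listing_id, title))

new_ids = discover_listing_ids("Seattle--WA") - already_imported
print("{} new listings to import; previously imported ones skipped".format(len(new_ids)))
for listing_id in sorted(new_ids):
    import_listing(listing_id)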