Web Scraping Jobs

84 jobs were found based on your criteria

Fixed-Price - Est. Budget: $100 Posted
I need someone to scour LinkedIn using the keywords 'SOLAR', 'WIND', or 'RENEWABLE ENERGY' and build a master spreadsheet with every single entry, with fields containing the data gathered from LinkedIn. The second part of this task will be for me to highlight the names on this list for which I need email addresses, which are then to be scraped from Google.

Fixed-Price - Est. Budget: $30 Posted
Newaya is a trade-in program that pays people for their old cellphones and tablets. We like to keep very close track of what our competitors are offering for the same devices. We need someone to research our competitors' pricing every two weeks and fill out a Google Doc with the values. This typically takes 1.5 hours, so we're offering $15 each time you do the task.

Fixed-Price - Est. Budget: $300 Posted
Hi, this is a pretty simple scraping project, but the output needs to be an application that I can install and run myself (or that runs in a hosted environment). The application itself will:
a) authenticate with LinkedIn as a user (inputs for this step: username, password, and a verification code, which may be required if running as a web app)
b) visit a specific user's profile (input for this step: target profile URL)
c) scrape all the 1st degree connections for that user (this requires paginating through the contacts...)
Outputs for this step (output as CSV for all contacts for the target): Name, LinkedIn profile URL, Company, Current Role, Company URL (if available), Work history.
Requirements:
- this can run as a web app OR as a desktop app (I'm open to either)
- if creating a desktop app, it needs to run on OS X Yosemite or higher
- if creating a web app, you'll need to set up the server environment and provide a secured web interface for inputting the parameters
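For the desktop-app route, a minimal sketch of steps a)-c) in Python with Selenium might look like the following. The login flow, the connections-page URL, and every CSS selector here are assumptions that would have to be verified against LinkedIn's current markup, and the credentials and target profile URL are hypothetical placeholders.

```python
# Sketch only: log in, page through a target profile's connections, and write
# one CSV row per contact. All selectors and the pagination URL are assumed.
import csv
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

LINKEDIN_USER = "user@example.com"                           # hypothetical input
LINKEDIN_PASS = "password"                                   # hypothetical input
TARGET_PROFILE_URL = "https://www.linkedin.com/in/example/"  # hypothetical input

driver = webdriver.Chrome()
driver.get("https://www.linkedin.com/login")
driver.find_element(By.ID, "username").send_keys(LINKEDIN_USER)
driver.find_element(By.ID, "password").send_keys(LINKEDIN_PASS)
driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
time.sleep(10)  # leave time to enter a verification code manually if prompted

with open("connections.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Name", "LinkedIn profile URL", "Current Role"])
    page = 1
    while True:
        # Assumed pagination scheme; the real URL structure must be confirmed.
        driver.get(TARGET_PROFILE_URL + "detail/contacts/?page=%d" % page)
        time.sleep(2)
        cards = driver.find_elements(By.CSS_SELECTOR, ".connection-card")  # assumed selector
        if not cards:
            break  # no more results, stop paginating
        for card in cards:
            name = card.find_element(By.CSS_SELECTOR, ".name").text        # assumed selector
            url = card.find_element(By.TAG_NAME, "a").get_attribute("href")
            role = card.find_element(By.CSS_SELECTOR, ".headline").text    # assumed selector
            writer.writerow([name, url, role])
        page += 1

driver.quit()
```

A web-app version would wrap the same Selenium logic behind a small authenticated form that collects the username, password, verification code, and target URL, then returns the generated CSV.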

Fixed-Price - Est. Budget: $500 Posted
I am looking to have a tool made that can be accessed through a webpage (so it could be a .php script), which scrapes Google's search results page for specific keywords and then runs an analysis on it. This is how the tool would work:
1) The user goes to the tool's webpage and sees a text box. They enter a keyword (for example 'diet pills') and click OK.
2) The tool takes that keyword (diet pills) and searches Google.com. *Note: Google's API has a very limited number of queries that can be made per hour, so this may need to be done using a screen macro, or a VPN to switch up IPs.
3) The tool then clicks into all 10 of the websites in the search results on page 1 and scrapes the content on those web pages.
4) The tool then runs a keyword density analysis on all of the content scraped from the 10 webpages.
5) The tool delivers a list of the most commonly used words from all of the content it scraped.
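A minimal sketch of steps 3-5 follows (in Python rather than PHP, purely to illustrate the analysis). It assumes the ten result URLs have already been collected in step 2; the stopword list and the example URLs are placeholders.

```python
# Sketch of steps 3-5: fetch each result page, strip the markup, and report
# the most common words across all pages. `result_urls` is assumed to come
# from the SERP-collection step (step 2), which is not shown here.
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

STOPWORDS = {"the", "and", "a", "an", "of", "to", "in", "is", "for", "on", "that", "with"}

def keyword_density(result_urls, top_n=25):
    counts = Counter()
    for url in result_urls:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        text = BeautifulSoup(html, "html.parser").get_text(" ")
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS and len(w) > 2)
    total = sum(counts.values()) or 1
    # (word, count, share of all counted words), most common first
    return [(w, c, c / total) for w, c in counts.most_common(top_n)]

if __name__ == "__main__":
    urls = ["https://example.com/page1", "https://example.com/page2"]  # placeholders
    for word, count, share in keyword_density(urls):
        print(f"{word}\t{count}\t{share:.2%}")
```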

Fixed-Price - Est. Budget: $100 Posted
I want you to scrape styleseat.com with WebHarvy. There are about 300,000 professionals that you must scrape. When you are finished I will run a test to check that you have scraped them all. See video: https://www.youtube.com/watch?v=lOwcCQsK_xk Give me your bid.

Fixed-Price - Est. Budget: $50 Posted
I require a successful candidate to generate a list of businesses in Utah, United States. The list should contain at least 10,000 entries. I require the following information to be included in an Excel spreadsheet format: Business Name | Website | Owner Name | Email | Address | City | State | Zip Code | Phone. If this task is completed in a professional manner we will expand to additional states.

Fixed-Price - Est. Budget: $300 Posted
### Expert Level ### Please reply with real-world examples. If you have experience with a Bloom filter, please let me know. Hi, I have a version of the application already built, but it isn't performing as I had hoped. I need a developer to help me improve the application in terms of performance and accuracy. Below is what I want to happen, so be clear in your reply that you can perform this type of work.
- Web scraping sort of works, but it really needs to become multi-threaded and a lot more robust (it currently breaks a lot); the basic stack is below, but I'm not averse to other technologies being used.
- The crawler aims to collect hundreds of millions of rows of data from numerous content networks, so it needs to be able to manage that number of rows and the complications that brings to the table.
- Post content to social networks; the current app doesn't post in the correct format (weird categories), so that needs to be fixed. Basic headlines - Create multi-threaded...
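Since the post asks about Bloom filters, here is an illustrative sketch of how one is typically used in a crawler of this size: remembering which URLs (or rows) have already been seen in constant memory, with no false negatives and a small, tunable false-positive rate. The sizing parameters below are examples only, not tuned for hundreds of millions of rows.

```python
# Illustrative Bloom filter for de-duplicating URLs during a large crawl.
import hashlib
import math

class BloomFilter:
    def __init__(self, expected_items, fp_rate=0.01):
        # Standard sizing: m = -n*ln(p)/ln(2)^2 bits, k = (m/n)*ln(2) hash functions
        self.size = int(-expected_items * math.log(fp_rate) / (math.log(2) ** 2))
        self.hashes = max(1, int(self.size / expected_items * math.log(2)))
        self.bits = bytearray((self.size + 7) // 8)

    def _positions(self, item):
        # Derive k bit positions by salting one hash function with an index.
        for i in range(self.hashes):
            digest = hashlib.md5(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

seen = BloomFilter(expected_items=1_000_000)   # example capacity only
url = "https://example.com/article/1"
if url not in seen:
    seen.add(url)  # crawl the URL, then remember it so workers never refetch it
```

In a multi-threaded crawler the filter (or a sharded set of filters) would sit behind a lock or a dedicated dedup service so that worker threads can test-and-add URLs without refetching work already done.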

Fixed-Price - Est. Budget: $40 Posted
I run an educational listserv for Arabic language learners. I am requesting a utility that captures the (a) title, (b) summary, (c) URL Link, and (d) site title from all news articles published on the *prior* day for these three URLs which share the same DOM structure: 1) http://www.bbc.com/arabic/middleeast 2) http://www.bbc.com/arabic/worldnews 3) http://www.bbc.com/arabic/business This utility should run once daily (preferably after midnight GMT) and output clean data into an RSS feed. Remember that no pictures or full articles are necessary, and that this should only collect articles timestamped for the day prior rather than the current day's news. Sorry that I don't have a big budget but I do make sure to leave very positive reviews.
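A minimal sketch of such a utility is shown below: it fetches the three section pages, keeps only articles timestamped the previous day, and writes a simple RSS file that a cron job could regenerate nightly after midnight GMT. The HTML selectors and the datetime attribute are assumptions about the shared DOM structure and would need to be checked against the live pages.

```python
# Sketch only: selectors ("article", "time[datetime]", etc.) are assumed.
import datetime
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

SECTIONS = [
    "http://www.bbc.com/arabic/middleeast",
    "http://www.bbc.com/arabic/worldnews",
    "http://www.bbc.com/arabic/business",
]

def collect_yesterdays_items():
    yesterday = (datetime.datetime.utcnow() - datetime.timedelta(days=1)).date()
    items = []
    for url in SECTIONS:
        soup = BeautifulSoup(requests.get(url, timeout=15).text, "html.parser")
        site_title = soup.title.get_text(strip=True) if soup.title else url
        for article in soup.select("article"):          # assumed article container
            link = article.find("a")
            time_tag = article.find("time")
            summary = article.find("p")
            if not (link and time_tag and time_tag.get("datetime")):
                continue
            try:
                published = datetime.date.fromisoformat(time_tag["datetime"][:10])
            except ValueError:
                continue
            if published == yesterday:                  # prior day only, never today
                items.append({
                    "title": link.get_text(strip=True),
                    "summary": summary.get_text(strip=True) if summary else "",
                    "link": link.get("href", ""),
                    "site": site_title,
                })
    return items

def write_rss(items, path="arabic_news.xml"):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "BBC Arabic - previous day's articles"
    for it in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = it["title"]
        ET.SubElement(item, "description").text = it["summary"]
        ET.SubElement(item, "link").text = it["link"]
        ET.SubElement(item, "source").text = it["site"]  # simplified site-title field
    ET.ElementTree(rss).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    write_rss(collect_yesterdays_items())  # run once daily via cron, shortly after 00:00 GMT
```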

Fixed-Price - Est. Budget: $100 Posted
Hey there, we're looking for an agent to help us obtain new companies within our selected target industries, as well as contacts within those companies. The target industries are:
- Business Intelligence
- Customer Experience Management
- Market Research
- Social Media Monitoring
- Voice of Customer
These companies should be English speaking. Although you shouldn't discriminate geographically, these companies are typically located in North America, Europe, India, and China. We are searching for applicants who possess:
- Strong and proven research skills
- A detail-oriented personality
- Reliability
- Proficiency with Excel
You would receive a list of companies that are already known to us, so as to avoid any overlap during your search. Payment is made on a per-company and per-contact basis, meaning it's gauged on how many companies and contacts you find. The project is expected to be ongoing, and as you continue working with us on future projects, the constraints...