
Data Scraping Jobs

238 jobs were found based on your criteria

Fixed-Price - Est. Budget: $1,500 - Posted
PerformanceFloormats.com. Objective: to build a niche ecommerce website that sells custom automotive floor mats. Competitor example website: https://www.floormatsnmore.com Tasks will include:
• Custom Web Design (BigCommerce)
• Make / Model / Year Integration (through 2016)
• Data Scraping (All Weather Tech / Husky Liners)
• Channel Advisor Integration
• SSL Certificate Integration
• Payment Gateway Set Up (Paypal / Authorize.net)
• Marketing Assistance (Google Shopping Feed)
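
For the data-scraping line item in this posting, a minimal sketch of one way to pull fitment (make/model/year) rows from a supplier product page, assuming the data sits in an HTML table; the URL and CSS selector below are hypothetical placeholders, not the real supplier markup.

```python
# Hypothetical sketch: pull make/model/year fitment rows from a supplier product page.
# The URL and the "table.fitment" selector are placeholders; real selectors depend on the site's markup.
import csv
import requests
from bs4 import BeautifulSoup

PRODUCT_URL = "https://example-supplier.com/floor-mats/product-123"  # placeholder

def scrape_fitment(url):
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    # Assumption: fitment data lives in a table with class "fitment", header row first.
    for tr in soup.select("table.fitment tr")[1:]:
        cells = [td.get_text(strip=True) for td in tr.select("td")]
        if len(cells) >= 3:
            rows.append({"make": cells[0], "model": cells[1], "year": cells[2]})
    return rows

if __name__ == "__main__":
    with open("fitment.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["make", "model", "year"])
        writer.writeheader()
        writer.writerows(scrape_fitment(PRODUCT_URL))
```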

Hourly - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
I need someone who can help me build out a system to crawl roughly 5-6 different systems (all of which are behind a login). The data then needs to be displayed by day, broken down by campaign, in a Google Spreadsheet. The whole system should be automated. Please let me know what experience you have with Import.IO, with specific examples. Thanks, Jeff
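
The poster asks about Import.IO specifically; purely as a hedged alternative sketch, the snippet below shows the general shape of such a pipeline using a requests session for the logged-in pull and gspread for the Google Spreadsheet write. Every URL, credential, field name, and the JSON shape of the report endpoint are assumptions.

```python
# Sketch only: log in to one source system, pull per-campaign daily numbers,
# and append them to a Google Spreadsheet. All endpoints and field names are hypothetical.
import requests
import gspread

LOGIN_URL = "https://dashboard.example.com/login"        # placeholder
REPORT_URL = "https://dashboard.example.com/api/report"  # placeholder

def fetch_daily_by_campaign(username, password):
    session = requests.Session()
    session.post(LOGIN_URL, data={"user": username, "pass": password}, timeout=30)
    data = session.get(REPORT_URL, params={"group_by": "campaign,day"}, timeout=30).json()
    # Assumption: the endpoint returns a list of {"day": ..., "campaign": ..., "spend": ...} objects.
    return [[row["day"], row["campaign"], row["spend"]] for row in data]

def push_to_sheet(rows):
    gc = gspread.service_account(filename="service_account.json")  # Google API service-account credentials
    ws = gc.open("Campaign Report").sheet1
    ws.append_rows(rows)

if __name__ == "__main__":
    push_to_sheet(fetch_daily_by_campaign("me@example.com", "secret"))
```

Running a script like this on a scheduler (cron or similar) would cover the "whole system should be automated" requirement.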

Hourly - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I need to collect business e-mails from business directories (Manta, Hoovers, YellowPages, etc.) AND also collect e-mails from Craigslist postings. NO MANUAL COLLECTION - I need someone who has bots and scripts that can automate this and collect e-mails in bulk, quickly.

Fixed-Price - Est. Budget: $300 - Posted
### Expert Level ### Please reply with real-world examples. If you have experience with a bloom filter, please let me know. Hi, I have a version of the application already built, but it isn't performing as I had hoped. I need a developer to help me improve the application in terms of performance and accuracy. Below is what I want to happen, so be clear in your replies that you can perform this type of work.
- Web scraping sort of works, but it really needs to become multi-threaded and a lot more robust (it currently breaks a lot); the basic stack is below, but I'm not averse to other technologies being used.
- The crawler aims to collect hundreds of millions of rows of data from numerous content networks, so it needs to be able to manage that number of rows and the complications that brings to the table.
- Post content to social networks; the current app doesn't post in the correct format (weird categories), so that needs to be fixed. Basic headlines - Create multi-threaded...
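
Since this posting calls out a bloom filter and a multi-threaded crawler, here is a minimal self-contained sketch of how the two fit together: a thread-safe bloom filter dedupes URLs without holding hundreds of millions of strings in memory. The sizing constants and the empty fetch step are illustrative, not tuned for this workload.

```python
# Sketch: bloom-filter URL dedup for a multi-threaded crawler.
import hashlib
from concurrent.futures import ThreadPoolExecutor
from threading import Lock

class BloomFilter:
    def __init__(self, size_bits=8 * 1024 * 1024, num_hashes=7):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)
        self.lock = Lock()

    def _positions(self, item):
        # Derive several bit positions from slices of one SHA-256 digest.
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.num_hashes):
            yield int.from_bytes(digest[i * 4:(i + 1) * 4], "big") % self.size

    def add_if_new(self, item):
        """Return True if the item was (probably) not seen before, and record it."""
        with self.lock:
            new = False
            for pos in self._positions(item):
                byte, bit = divmod(pos, 8)
                if not self.bits[byte] & (1 << bit):
                    new = True
                    self.bits[byte] |= 1 << bit
            return new

seen = BloomFilter()

def crawl(url):
    if not seen.add_if_new(url):
        return  # skip: a bloom filter never misses a true duplicate, it only adds rare false positives
    # ... fetch the page and queue newly discovered links here ...

with ThreadPoolExecutor(max_workers=16) as pool:
    pool.map(crawl, ["https://example.com/page1", "https://example.com/page2"])
```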

Hourly - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I need a data scraper to scour the internet and social networking sites for court reporters. I need the e-mail addresses and names of court reporters and court reporting agencies organized into an Excel spreadsheet. Total hours for this job will amount to 10 for the time being. Feel free to ask any questions. Thanks for checking out the job!

Hourly - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
Green Pasture Properties is seeking an experienced Excel Data Specialist for 2-5 hrs of work per week, with the potential for 5-10 hrs per week in the future. The work will include data mining, data scraping, data sorting, and data entry. I will supply you with large Excel spreadsheets, typically with 10-20 columns and up to 1,000+ rows in some cases. Each spreadsheet is missing critical information that can only be found on the web. I will supply you with the exact website where the needed information can be found. You will simply need to take certain data and enter it into the correct row/cell within the existing spreadsheet. Experience with automation tools such as iMacros will be extremely helpful for this work. For my first spreadsheet, the workload should not take more than 2-3 hours. If you're able to complete this work proficiently, I will send you more spreadsheets to help with.
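
The posting mentions iMacros; purely as an illustrative alternative, the sketch below does the same fill-in-the-missing-cell loop in Python with openpyxl, requests, and BeautifulSoup. The column positions, lookup URL, and CSS selector are hypothetical and would come from the actual spreadsheet and site.

```python
# Sketch: fill empty cells in an existing spreadsheet with values looked up on a given website.
import requests
from bs4 import BeautifulSoup
from openpyxl import load_workbook

LOOKUP_URL = "https://directory.example.com/search?q={query}"  # placeholder

def lookup_missing_value(query):
    html = requests.get(LOOKUP_URL.format(query=query), timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    node = soup.select_one(".result .phone")  # hypothetical selector for the missing field
    return node.get_text(strip=True) if node else ""

def fill_spreadsheet(path, key_col=1, missing_col=5):
    # Assumption: column 1 holds the lookup key, column 5 is the one with gaps, row 1 is a header.
    wb = load_workbook(path)
    ws = wb.active
    for row in ws.iter_rows(min_row=2):
        if not row[missing_col - 1].value:  # only fill cells that are currently empty
            row[missing_col - 1].value = lookup_missing_value(str(row[key_col - 1].value))
    wb.save(path)

if __name__ == "__main__":
    fill_spreadsheet("properties.xlsx")  # hypothetical filename
```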

Hourly - Est. Time: 3 to 6 months, 10-30 hrs/week - Posted
Hello. I am hiring several freelancers to perform data entry work. The tasks you will be assigned will vary; however, you must have knowledge of computer software such as Microsoft Office. This will be recurring work, with the weekly number of hours varying; however, the higher the quality of your work, the more frequently you will be rehired. When you are provided with a new assignment, it should be started relatively quickly. Please provide a Skype address so I can ensure you are suitable for this job.

Fixed-Price - Est. Budget: $100 - Posted
Hey there, We’re looking for an agent to help us identify new companies within our selected target industries, as well as contacts within those companies. The target industries are: - Business Intelligence - Customer Experience Management - Market Research - Social Media Monitoring - Voice of Customer. These companies should be English-speaking. Although you shouldn't restrict your search geographically, these companies are typically located in North America, Europe, India, and China. We are searching for applicants who possess: - Strong and proven research skills - A detail-oriented personality - Reliability - Proficiency with Excel. You would receive a list of companies that are already known to us, so as to avoid any overlap during your search. Payment is on a per-company and per-contact basis, meaning it's gauged by how many companies and contacts you find. The project should be ongoing, and as you continue working with us on future projects, the constraints...

Hourly - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
Attached you'll find the file (bestconnections-scrape.xlsx). The file contains 4 columns: - guid - longdescription - ebaydescription - ebay2description Something to note here: if the values for any of the description fields (2, 3, or 4) match, then once the data is scraped you just need to copy it over to the other column as well - no need to scrape it twice. The goal is to get the description of the item out. For example, if you look at the longdescription data for row 15 (S-1B.) in an HTML viewer, the only thing we would want left over after the scrape is: << see attached file - scrape1.png >> Everything else must be deleted. Keep in mind that there are several different variations of these descriptions. The main purpose here is to get the description of this product scraped out of the full template it lives inside. Another example would be row 813 (LM-BK-12 X 20.): if you pull the longdescription and paste it into the HTML viewer, the only thing that should...
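
A hedged sketch of the extraction step described in this posting, assuming the three description columns hold raw HTML and the item description can be isolated with a CSS selector. The "div.description" selector and the output filename are assumptions (the real rule depends on the template variations in the file), and identical source HTML is copied from a cache rather than scraped twice, per the note above.

```python
# Sketch: strip the surrounding template from the HTML stored in each description column,
# keeping only the item description text.
from bs4 import BeautifulSoup
from openpyxl import load_workbook

def extract_description(html):
    if not html:
        return ""
    soup = BeautifulSoup(html, "html.parser")
    # Assumption: the item description lives in a div with class "description";
    # fall back to the page's full visible text if that block isn't present.
    block = soup.select_one("div.description")
    return (block or soup).get_text(" ", strip=True)

def clean_file(path="bestconnections-scrape.xlsx"):
    wb = load_workbook(path)
    ws = wb.active
    cache = {}  # identical source HTML -> extracted text, so matching columns aren't processed twice
    for row in ws.iter_rows(min_row=2):  # columns: guid, longdescription, ebaydescription, ebay2description
        for cell in row[1:4]:
            raw = cell.value or ""
            if raw not in cache:
                cache[raw] = extract_description(raw)
            cell.value = cache[raw]
    wb.save("bestconnections-clean.xlsx")  # hypothetical output filename

if __name__ == "__main__":
    clean_file()
```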