
Data Scraping Jobs

307 jobs were found based on your criteria

Hourly - Expert ($$$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
I'm looking for an expert to crawl and scrape selected high-level sites in order to produce structured data outputs in Excel sheets. I would like to be able to combine data sources across multiple sites to create high-quality structured CSVs that I can eventually upload to a website. I'm looking for someone familiar with Freebase, Factual, social media, and high-level web scraping with structured data applications. It may also be necessary to integrate the data sources with third-party APIs for business analysis purposes. All software and products needed will be paid for by me.
Skills: Data scraping, Data mining, Microsoft Excel, Search Engine Optimization (SEO)
Fixed-Price - Intermediate ($$) - Est. Budget: $35 - Posted
I need someone to scrape data from a retailer's website and then copy/paste the information into a spreadsheet. The basic steps to be performed are as follows:
1. Visit the retailer's website and enter a pre-defined query string.
2. The results from the search query will be a listing of the retailer's products.
3. Within the search results, access the webpage for a product by selecting its hyperlink.
4. Copy/paste information about the product into a spreadsheet.
5. Repeat the copy/paste activity for the top 100 products listed in the search results.
Skills: Data scraping, Data Entry, Google Docs, Internet research
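The per-product extraction step in the posting above could also be partly automated. A minimal sketch using only Python's standard library, where the class names (`product-name`, `product-price`) and the sample HTML are assumptions, not the retailer's actual page structure:

```python
import csv
import io
from html.parser import HTMLParser

class ProductParser(HTMLParser):
    """Collect text from elements whose class names match known product fields."""
    FIELDS = {"product-name", "product-price"}  # hypothetical class names

    def __init__(self):
        super().__init__()
        self.current = None
        self.data = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in self.FIELDS:
            self.current = cls

    def handle_data(self, data):
        if self.current:
            self.data[self.current] = data.strip()
            self.current = None

# Illustrative snippet standing in for a real product page.
sample = '<div class="product-name">Widget</div><span class="product-price">$9.99</span>'
parser = ProductParser()
parser.feed(sample)

# Write the extracted fields as one spreadsheet (CSV) row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["product-name", "product-price"])
writer.writeheader()
writer.writerow(parser.data)
print(buf.getvalue())
```

For a one-off job of 100 products, manual copy/paste as described may well be faster than adapting a parser to the retailer's markup.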
Hourly - Entry Level ($) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
We need to scrape 400 data points each week from Google search for a test project for our sales team. We require someone who can take direction easily, do the job accurately, and add the information they find to a Google Doc each day. This should take approximately 15-20 hours per week. We will test the campaign for one week at a time; if it proves successful, we will ramp up the campaign and make it an ongoing job with full-time pay. *You MUST have data mining experience because we need speed and efficiency! **Your English must be very good so you can understand our directions. Thanks, Brian
  • Number of freelancers needed: 2
Skills: Data scraping, Web scraping
Fixed-Price - Expert ($$$) - Est. Budget: $1,000 - Posted
Create a data set that contains all cities, and the street names in those cities, in Libya, in Latin script (Arabic as well would be nice to have). Your dataset could, for example, be an Excel sheet or a comma-separated file. Find the locations (addresses) of post offices all around Libya and include in your dataset the PO Box numbers at those locations (e.g. PO Box # 2000 - 3500). See if you can also get house numbers on those streets into your data set.
Skills: Data scraping, Data Analytics, Data Entry, Data mining
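A comma-separated file like the one requested above could be laid out as follows. This is a sketch only: the column names are my assumptions (the client does not specify a schema), and the single row is an illustrative placeholder, not delivered data:

```python
import csv
import io

# Hypothetical columns for the Libya address dataset; the client only asks for
# cities, street names (Latin, ideally Arabic too), house numbers, and PO boxes.
columns = ["city_latin", "city_arabic", "street_latin",
           "house_numbers", "po_box_range"]

rows = [
    # Illustrative placeholder row, not real collected data.
    {"city_latin": "Tripoli", "city_arabic": "طرابلس",
     "street_latin": "Omar Al-Mukhtar Street",
     "house_numbers": "1-120", "po_box_range": "2000-3500"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=columns)
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
print(csv_text)
```

Agreeing on a schema like this up front matters more than the tooling, since the client wants to merge city, street, and post-office data into one deliverable.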
Hourly - Entry Level ($) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
Dear Developers in Phnom Penh,

An interesting approach for the requirements described below could be screen scraping and GUI automation:
  • Inbound (data reading): screen and/or web scraping, then processing.
  • Outbound (data entry): GUI automation (controlling mouse and keyboard moves based on recorded templates).
Reference example: https://github.com/UiPath/SDK/wiki/What-is-screen-scraping-and-gui-automation

I'm looking for an innovative, creative, and motivated software developer to support me on an exciting project. The idea is to convert different types of inbound data (such as emails, chats, images, attachments, ...) into readable text; then structure and index the text; and, based on content and criteria, process the data and information (e.g. forward the email, send a reminder, save the attachment to a specific path). Two reference pages that go a little in the direction of what I'm planning: web and data scraping to extract data from the Internet and other sources (https://en.wikipedia.org/wiki/Web_scraping and https://en.wikipedia.org/wiki/Data_scraping), and, as the next step, computers understanding natural-language text (http://phys.org/news/2013-07-natural-language-texts.html).

The process starts with automated data conversion, then data processing, and at the end the outbound action is carried out. There is no fixed development language for this project; that is subject to research and discussion. Please estimate and propose on the following first use-case example to prove the concept, described in this Google Doc: https://docs.google.com/document/d/1vqzoxIvY46M0kTl_l1mqt4AHzuRDu3pLFNQqihhnM3Y/edit?usp=sharing

I'm located in Phnom Penh and am looking for somebody onsite, starting on a flexible hourly basis, which will allow you to manage your time even if you already have another job. I look forward to your qualified contact. Please submit some comments about this task and the technical approach you have in mind.
Thank you and best regards, David
Skills: Data scraping, API Development, Document Conversion, Processing
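The three-stage pipeline the posting above describes (inbound conversion, processing, outbound action) could be skeletonized as below. Every function name and the "invoice" routing rule are my assumptions for illustration; the real stages would involve OCR, MIME parsing, indexing, and GUI automation:

```python
def convert_inbound(item: dict) -> str:
    """Convert an inbound item (email, chat, image, attachment) to readable text."""
    # Placeholder: a real version would dispatch on item["type"] (OCR, MIME parsing, ...).
    return item.get("text", "")

def process(text: str) -> dict:
    """Structure the text and decide an action based on content criteria."""
    # Placeholder rule: forward anything mentioning 'invoice', otherwise archive.
    action = "forward" if "invoice" in text.lower() else "archive"
    return {"text": text, "action": action}

def act_outbound(decision: dict) -> str:
    """Carry out the outbound action (forward, send a reminder, save a file, ...)."""
    return f"performed: {decision['action']}"

result = act_outbound(process(convert_inbound({"type": "email", "text": "Invoice attached"})))
print(result)
```

The value of sketching the stages separately is that each can be swapped out independently, which matters here since the client explicitly leaves the technology choice open for discussion.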
Fixed-Price - Entry Level ($) - Est. Budget: $100 - Posted
We are one of the biggest IT firms in Bangladesh. We need lead generation work for our sales team to get in touch with new clients. We are looking for a freelancer who is an expert in web research and in collecting contact info such as first name, last name, email address, phone number, and job title. Please apply with your Skype ID, your rate per 1,000 qualified leads, and your previous experience in this field.
Skills: Data scraping, Data Entry, Internet research, Lead generation
Fixed-Price - Intermediate ($$) - Est. Budget: $100 - Posted
We are looking to develop a PROOF OF CONCEPT script to capture the following from all vehicles on an automobile dealer website: { vin: <17-digit vehicle ID number>, make: <vehicle make>, model: <vehicle model>, year: <vehicle year>, mileage: <mileage>, website: <website crawled> }. OPTIONALLY, when a set of table data (key=value) is available, we'd like a structure containing this data; examples are "color", "engine", etc. Use the keys from the website; if they have a space, then "slug" them, e.g. Exterior Color | White becomes { "exterior_color": "white" }. Data should be stored as JSON flat files on our server at Amazon. Example website (ALL vehicles from): https://porscheatlantaperimeter.com/inventory/ Vehicle details example: https://porscheatlantaperimeter.com/inventory/Porsche+Boxster+Atlanta+Georgia+2015+Carrara+White+Metallic+531744 We believe there are numerous sites that use the same or a similar structure. The project will be awarded to someone who can demonstrate an understanding of DOM parsing and regular expressions to achieve this. Phase 1 of the project is a proof of concept that this website can be scraped. Bonus if the script can work with other websites. NOTE: We have listed PHP
Skills: Data scraping, Regular Expressions, Web scraping
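The "slug" transformation and JSON flat-file output described in the posting above can be sketched as follows (in Python for illustration, though the client mentions PHP). The key names come from the posting's own example; the VIN and field values are placeholders, not scraped data:

```python
import json

def slug(key: str) -> str:
    """Lowercase a label and replace spaces with underscores,
    e.g. 'Exterior Color' -> 'exterior_color'."""
    return key.strip().lower().replace(" ", "_")

# Table data as it might appear on a vehicle detail page (values are placeholders).
table_data = {"Exterior Color": "White", "Engine": "2.7L Flat-6"}

# The required record from the posting; VIN and mileage are placeholder values.
record = {
    "vin": "WP0CA2A80FS531744",
    "make": "Porsche",
    "model": "Boxster",
    "year": 2015,
    "mileage": 12345,
    "website": "https://porscheatlantaperimeter.com/inventory/",
}
record.update({slug(k): v.lower() for k, v in table_data.items()})

# Serialize as a JSON flat file, as the posting requests.
flat_file = json.dumps(record, indent=2)
print(flat_file)
```

Keeping the slug rule this simple matches the posting's single example; pages with punctuation in their labels would need a slightly stricter normalizer.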
Hourly - Intermediate ($$) - Est. Time: Less than 1 month, 10-30 hrs/week - Posted
PROJECT OVERVIEW: We are currently working on an infographics campaign to promote our business, and we need an expert web researcher who has experience in the area of home safety/monitoring/security.

JOB DETAILS: There are 3 parts to this job.

Part 1: I'm looking for someone who can gather statistics on property crimes and home invasions (state level) in the US for the last 10 years. Data should include but not be limited to the following:
  • demographics.
  • time & exact location.
  • number of accomplices in a single crime committed.
  • tool(s) used for the crime.
  • point of entry.
  • how often they commit the same crime on the same house/place.
  • the estimated cost of the properties stolen.
  • stats on domestic violence committed by burglars.
  • how many burglary cases are solved each year.

Part 2: Statistics on the average spending of Americans per state when it comes to home security for the last 10 years. Data to show:
  • annual income.
  • number of family members.
  • how much is spent on home security gadgets/services/mobile applications.
  • whether they have a record of intruder reports.
  • whether they have a record of burglary reports.
And then, stats on how much they spend (per state) on luxuries like:
  • gadgets
  • cars
  • home improvement
  • home accessories
  • personal items.

Part 3: Make a list of the top 100 safest cities in the US for the last 10 years. The listing should be based on the stats below:
  • population
  • stats on violent crimes
  • stats on property crimes
  • stats on burglary & home invasion
  • number of burglary & home invasion cases solved.
The ranking should be according to the stats of the crimes recorded versus the number of cases solved over the last 10 years.

STEPS TO GET THIS DONE:
1. Unless you know the "go-to" places on the web, you'd start your search by using the following search strings: "crime statistics by _____" (enter the city, zip code, state, street address, and year); "crime statistics services" (there are services that offer free data, like http://www.neighborhoodscout.com/neighborhoods/crime-rates/top100safest/, so you already have some of the data ready to go); "NameOfCrime clearance rate in Year", e.g. "property crime clearance rate in 2005". The key here is to think of possible keyword combinations to help you find the information you need.
2. Next you'll have to record everything in a spreadsheet. To be considered for this job, I expect you to know your way around creating spreadsheets. You will fill the data into a spreadsheet we will provide.

NOTE: The first two parts of this job are at the US state level, while the third is at the city level. You may notice there are other websites offering data on the 100 safest cities out there. Our goal is to top those and offer much more data to our readers.

A couple of things: when you apply, mention "I have the expertise" in the first paragraph of your cover letter. Also, give me an overview of your process and how soon you can deliver the first set of data. I'm looking for someone who can work on this right away, as we may have future projects together if things go well. See you on the other side! Rock on!
Skills: Data scraping, Big Data, Data Entry, Data mining
Fixed-Price - Intermediate ($$) - Est. Budget: $250 - Posted
I need help scraping university websites for email addresses. This is lightweight work. The code can be written in any language, preferably Ruby. I will need a CSV file in the format I specify. I will also specify the sites and techniques to scrape, as I have done this before. There are close to 30 sites you will need to scrape. I will give you the university name, and you will need to get names from the university's Facebook group or use common names that I can provide. Find the student directory page for that university and use it for searching and fetching the email and other meta information from the results.
  • Number of freelancers needed: 2
Skills: Data scraping, HTML, JavaScript, Web Crawler
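The core of the email-scraping task above is extracting addresses from directory pages and writing them to CSV. A minimal sketch (in Python for illustration, though the client prefers Ruby; the page snippet, university name, and CSV columns are all assumptions, since the client specifies the real sites and format):

```python
import csv
import io
import re

# Simple email pattern; real directory pages may need a stricter expression.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html: str):
    """Return de-duplicated email addresses found in a page, in order of appearance."""
    seen, out = set(), []
    for match in EMAIL_RE.findall(html):
        if match not in seen:
            seen.add(match)
            out.append(match)
    return out

# Hypothetical directory-page snippet standing in for a fetched page.
page = '<li>Jane Doe - <a href="mailto:jdoe@example.edu">jdoe@example.edu</a></li>'
emails = extract_emails(page)

# Write results in a simple CSV layout (columns are placeholders).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["university", "email"])
for email in emails:
    writer.writerow(["Example University", email])
print(buf.getvalue())
```

Since student directories are often behind search forms, the per-site "techniques" the client mentions (query parameters, pagination) would wrap around a core extractor like this.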