
Data Scraping Jobs

236 jobs were found based on your criteria

Fixed-Price - Est. Budget: $75 - Posted
I need to collect the names of all professors of education in the USA (professors that train others to be teachers). A list of education professors at a given school can be found on RateMyProfessors.com this way:
- Go to http://www.ratemyprofessors.com/search.jsp -> Profs - Search by School
- Enter the school name and select the "Education" department (http://cl.ly/image/0t263R451Q3H)
- Search
- Click "Load More" until all are loaded
- Pull all names and scores
Here is a list of all schools: https://docs.google.com/spreadsheets/d/10dggcFSneSeAbTKyhGTEM7tfK2MJLzrPGpTuuN-VYr8/edit#gid=0
The job is to write a Scrapy scraper that will:
- Attempt to search RateMyProfessors.com for every school in the list
- Skip any school that RMP.com does not recognize, or that returns zero results for the Education department
- Return a list of all the teachers successfully found at each school
The returned list of teachers should have the following fields:
- professorName
- schoolName
- numberOfReviews...
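The per-school collection logic this posting describes (skip unrecognized schools, skip schools with zero Education results, emit one record per professor) could be sketched as a pure function. `collect_professors` and the `lookup` callable are hypothetical names, not from the posting; in a real Scrapy spider, `lookup` would be replaced by paged HTTP requests to the search endpoint:

```python
def collect_professors(schools, lookup):
    """Return professor records for every school the lookup recognizes.

    `lookup(school)` is assumed to return a list of
    (professorName, score, numberOfReviews) tuples, or None when
    RMP.com does not recognize the school. Unrecognized schools (None)
    and schools with zero Education results ([]) are skipped, per the
    job spec.
    """
    records = []
    for school in schools:
        results = lookup(school)
        if not results:  # None (unrecognized) or [] (zero results)
            continue
        for name, score, reviews in results:
            records.append({
                "professorName": name,
                "schoolName": school,
                "numberOfReviews": reviews,
                "score": score,
            })
    return records
```

Keeping the skip/collect logic in a function like this makes it testable separately from the network layer.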
Hourly - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
We are soliciting the DESIGN of a META SEARCH engine and dynamic reporting system which will search the VERIS COMMUNITY data and Framework (public domain) to produce search results on any search topic; allow users to add content back into our data repository in the same formats and data input fields as VERIS; and allow for ad hoc reporting. In essence, we are asking for the design of a dynamic security breach application and reporting system. The winning design will have search, membership sign-up, reporting, a data analytics dashboard, and a community feel, and will be provisioned as a website and web application with mobile extensions (the ability to be viewed on mobile devices) and alerting capabilities for any new data that is meaningful. The winning design will also take into account the ability to search additional data sources, both locally stored in the application/website and from third-party resources (a service integration layer to connect to other data repositories). Ideas,...
Hourly - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
We are soliciting the DESIGN of a META SEARCH engine which will search multiple HETEROGENEOUS repositories of genomic data across multiple web-accessible locations, and be able to search future repositories that are publicly accessible in Google's cloud computing environment and in AWS. The design can either be real-time and store no data in our own repository (e.g. a Kayak-style search), or it can electronically transfer data in batches to a central cloud repository that is then searchable. In fact, we expect a description of the trade-offs between both in any design proposal. The list of sources for the INITIAL set of data can be found at: https://gds.nih.gov/02dr2.html Concepts like downloading the repositories or transferring them to a single central repository are acceptable, but only if practical and LEGAL under the terms and conditions of the sources. A real-time meta search that only indexes the repositories is preferred. The UI design and UX is not a...
Hourly - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
I need to compile a database containing box office revenues, actors, and synopses from multiple sources and store it in a SQL database. Looking for a programmer to:
- Design the schema
- Scrape the two to three sources
- Provide all source code on GitHub
Ideally it would be programmed in a scripting language like Python, Rails, or R. The end result is a database that is updated once a week and queried by a separate Rails application. *Also looking for ML and RoR devs in conjunction with this job post.
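One possible normalized schema for the movies/actors/box-office data this posting describes, sketched with Python's built-in sqlite3 for illustration (the posting only asks for "a SQL database"; all table and column names here are assumptions):

```python
import sqlite3

# Hypothetical schema: movies, actors, and a join table so a movie can
# have many actors and an actor many movies. Column names are guesses
# based on the fields the posting mentions (revenue, actors, synopsis).
SCHEMA = """
CREATE TABLE IF NOT EXISTS movies (
    id             INTEGER PRIMARY KEY,
    title          TEXT NOT NULL,
    synopsis       TEXT,
    box_office_usd INTEGER,
    updated_at     TEXT
);
CREATE TABLE IF NOT EXISTS actors (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
);
CREATE TABLE IF NOT EXISTS movie_actors (
    movie_id INTEGER REFERENCES movies(id),
    actor_id INTEGER REFERENCES actors(id),
    PRIMARY KEY (movie_id, actor_id)
);
"""

def init_db(path=":memory:"):
    """Create the schema and return an open connection."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

The weekly refresh the posting asks for could then be a scheduled job (e.g. cron) that re-scrapes the sources and upserts into these tables, while the separate Rails application reads from the same database.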
Hourly - Est. Time: 3 to 6 months, 10-30 hrs/week - Posted
This job requires adding backend data to a web form for an education-based website, as well as web research on educational institutions in the US. You will be asked to research high school summer programs on U.S. college websites and add relevant information to a back-end web-based form. The job will also require you to compile this information into existing Google Docs if needed. Requirements:
- Excellent written and spoken English skills
- Excellent typing skills
- Team player, fast learner, and confident in his or her abilities
Hourly - Est. Time: 3 to 6 months, 10-30 hrs/week - Posted
I need someone who will be all over my spreadsheets. I have data that needs to be input into spreadsheets a few times a week. The job will require entering data into spreadsheets and then giving a report of the changes made (this will be simple). The second duty will be lead generation: someone from my team will send you a company, and I need you to produce contact information. You can also generate leads via LinkedIn, Google searches, etc. once you know what I am looking for.
Hourly - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
Overview: SEO Quotient (www.SEOQ.com) is a website that seeks to answer the big question ... why does his website rank higher than mine? Requirements: To do this, I need to crawl a website looking for specific information, then bring that back and present it in a meaningful way. I would be looking for things like keywords in the title tag, what CMS is being used, how fast or slow the site is, whether it passes Google's mobile-friendly test, etc. Skills: The front end of the system was built in WordPress, PHP, Bootstrap, HTML, and CSS, and the back end was built in Python, Django, PHP, AngularJS, etc. Some folks who have done work like this in the past think that Perl is the method to use. I'm sure there are plenty of ways to crawl and get the data but, ultimately, it needs to live in our Python/Django/MySQL back end. You will probably also need to know Git to help contribute to the repository. Knowing how to get data to/from APIs and perhaps how to build APIs...
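The simpler on-page checks this posting mentions (keywords in the title tag, which CMS is in use) could start from something like the sketch below. The function name and the regex-based approach are illustrative assumptions; site speed and Google's mobile-friendly test would need separate tooling, and a production crawler would use a real HTML parser rather than regexes:

```python
import re

def audit_page(html, keyword):
    """Extract a few of the SEO signals described above from raw HTML.

    Returns the page title, whether the given keyword appears in it,
    and a CMS hint taken from the <meta name="generator"> tag if one
    is present (e.g. WordPress advertises itself there by default).
    """
    title_m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    title = title_m.group(1).strip() if title_m else ""

    gen_m = re.search(
        r'<meta[^>]+name=["\']generator["\'][^>]+content=["\']([^"\']+)',
        html, re.I)

    return {
        "title": title,
        "keyword_in_title": keyword.lower() in title.lower(),
        "cms_hint": gen_m.group(1) if gen_m else None,
    }
```

Since the results need to live in the Python/Django/MySQL back end the posting describes, returning a plain dict like this maps cleanly onto a Django model save.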
Fixed-Price - Est. Budget: $40 - Posted
Title: Expert Data Scraper $40/h
Criteria: We only work with the absolute best. We are looking for an Expert Data Scraper. We will have more advanced tasks in data scraping if results are 10/10.
Work instructions:
1. Work with a virtual team in Asana
2. Complete every task in the project 10/10
3. From scraping the first data to uploading the finished project
Hiring process:
1. Answer the screening questions
2. Complete the interview process demo
3. Complete a $40 fixed-price 1-hour project
4. Work as a long-term hire at $40/h for 40h/week
Comment: We will look at all applications. Hope you are okay. Thanks a lot :))
Hourly - Est. Time: 1 to 3 months, Less than 10 hrs/week - Posted
We are looking for a contractor with superior knowledge of and experience with Import.io and data scraping. Experience with WordPress is also desirable. What we are trying to achieve is to use Import.io to scrape data from various sources. This scraped data, in turn, will feed into Google Sheets, which will be used as the source for various articles/posts on our website.