
Scrapy Framework Jobs

18 jobs were found based on your criteria

Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
Hi, I need a web crawling script, preferably written in Scrapy. Here is a similar system: http://www.biznesscrm.com/ Here is what it does:
1. The user enters keywords to search in Google.
2. The user selects a region (Google country search engine) from a drop-down - http://www.genealogyintime.com/GenealogyResources/Articles/genealogy_guide_to_google_country_search_engines_page3.html
3. After the search, it crawls each returned SME's site for contact information. I will need the contact information, including email addresses.
4. If a site only has a contact form, it stores the URL so the user can submit a preset message to that form with one click.
5. The user can save the results into a list.
Please register an account on biznesscrm and play with the system.
Skills: Scrapy Python
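Steps 3-4 above - pulling email addresses out of a crawled page and flagging pages that carry a contact form - could be sketched in plain Python roughly like this (the email regex and the form heuristic are illustrative assumptions, not a spec):

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_contacts(html: str) -> dict:
    """Pull email addresses out of a page, and flag whether it appears
    to carry a contact form (so the URL can be stored for the one-click
    preset-message submission described above)."""
    emails = sorted(set(EMAIL_RE.findall(html)))
    # Crude heuristic: any <form> on a page that mentions "contact"
    # is treated as a contact form.
    has_contact_form = bool(
        re.search(r"<form[^>]*>", html, re.I)
        and re.search(r"contact", html, re.I)
    )
    return {"emails": emails, "has_contact_form": has_contact_form}
```

In a Scrapy project this logic would sit inside the spider's parse callback, operating on `response.text`.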
Fixed-Price - Entry Level ($) - Est. Budget: $100 - Posted
Hi! I need info on each user from a vBulletin forum. The user URLs look like http://domain.com/member.php?u=732922 - there are about 300K users, and you can crawl each page just by changing the ID at the end of that URL. The only issue is that you'll need to be logged in as a user (but accounts are free). I've attached an image: both screenshots are of the same page, but show slightly different data depending on what people added. Here is a list of what I'm looking for:
1) User Name
2) Location
3) Occupation
4) Total Posts
5) Join Date
6) Posts per day
7) Last Activity
8) Date of Birth
9) Email (this is also a bit tricky, because the site doesn't list it but instead offers it as a vCard to be downloaded. The vCard is just a text file that includes the email. If it's too hard to extract, you could just include a zip with all the vCards and I can extract the addresses myself.)
Let me know what the cost would be for this. Thanks!
Skills: Scrapy Web scraping
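The ID-incrementing crawl and the vCard trick described above can be sketched with the standard library alone (domain.com is the placeholder from the post; the EMAIL-line regex is an assumption about typical vCard layout):

```python
import re

BASE = "http://domain.com/member.php?u={}"  # placeholder domain from the post

def member_urls(start=1, stop=300_000):
    """Yield profile URLs to crawl, simply incrementing the member ID."""
    for uid in range(start, stop + 1):
        yield BASE.format(uid)

# A vCard is plain text; the address lives on a line like
# EMAIL;PREF;INTERNET:user@example.com
VCARD_EMAIL_RE = re.compile(r"^EMAIL[^:]*:(.+)$", re.M)

def email_from_vcard(vcard_text: str):
    """Return the email from a downloaded vCard, or None if absent."""
    m = VCARD_EMAIL_RE.search(vcard_text)
    return m.group(1).strip() if m else None
```

The login requirement would be handled separately, e.g. with Scrapy's `FormRequest` posting the forum's login form before crawling.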
Hourly - Intermediate ($$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
I will provide a company name and potential job titles to look for per company. You will find the exact name and job title and add them to columns in an existing spreadsheet.
Example of what I will provide:
company name: google
company domain: google.com
title(s) to look for: ceo
You will return two columns (column A = name, column B = title), e.g.: sundar pichai, ceo
This must be done in bulk and automatically as new rows are added to the sheet. If I have a list of 100 companies and/or domains, you should be able to return accurate results for all of them in minutes or seconds. This is not a manual job - you can create a scraper, crawler, API, whatever. I just need fast, accurate, consistent results.
Skills: Scrapy Web Crawling Data scraping Web Crawler
Hourly - Entry Level ($) - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
I'm looking for an experienced Python/Scrapy developer (or whatever library or framework you think is suitable) to create robust scripts that periodically crawl specific men's clothing retailer sites and read in product data, including associated images and clothing measurements. The extracted data will need to be written either to JSON and image files or to a MongoDB database. The initial job will be to create a scraper for 1 or 2 sites. If all goes well, I'll be looking to implement similar scrapers across potentially 50 sites, and there will be a need for ongoing support and maintenance. Previous experience creating web scrapers is a must. The expectation is that the resulting code will be robust, well structured and maintainable. I'll provide a JSON model template that the data will need to be written to, along with the field definitions and specs. This is my first attempt at this, so we may need to experiment a little. Suggestions and feedback based on your scraping experience are welcome. I've indicated less than a week for this job; let me know if you think it's likely to take longer - I'm not really sure.
  • Number of freelancers needed: 3
Skills: Scrapy
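For the "JSON and image files" output option, one scraped product record could be written out roughly like this (the field names `name`, `price`, `images`, and `measurements` are placeholders; the client's JSON template would define the real model):

```python
import json
import re
from pathlib import Path

def slugify(name: str) -> str:
    """Turn a product name into a safe filename stem."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

def write_product(product: dict, out_dir: str = "products") -> Path:
    """Dump one product record to <out_dir>/<slug>.json."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    path = out / f"{slugify(product['name'])}.json"
    path.write_text(json.dumps(product, indent=2))
    return path
```

In a Scrapy project the same role is usually played by an item pipeline, which would also be the natural place to swap in a MongoDB write instead of files.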
Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
Crawl websites for content extraction and provide the extracted content in various formats such as Microsoft Excel (.xls), XML, Microsoft Access (.mdb), SQL, etc.
- Will be collecting data such as real estate properties
- Must have experience with Scrapy and the Scrapinghub platform
- Set up scraped files and images on an FTP server
- Salesforce API integration and resolving formatting issues
Skills: Scrapy Data mining Data scraping Python
Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
Looking for someone expert at scraping data from various websites and saving it to MySQL / CSV. The script has to be Python or PHP; if Python, it should work on a Linux server alongside a LAMP/PHP website. If you are really good, I don't mind offering you full-time work, as I'll need hundreds of scrapers over the next 3 months. This job is for 6 websites, but I might need some other small scrapers before I give out the big project. Please answer these:
1. Write 'ddonk' before your application.
2. Let me know whether you prefer PHP or Python.
3. Mention which websites you have scraped - Google, LinkedIn, Amazon, Yellow Pages?
4. Show me a link to any web application that does scraping, if you have built one.
5. Do you have a full-time job and freelance part-time, or are you a full-time freelancer?
  • Number of freelancers needed: 3
Skills: Scrapy Data mining Data scraping Django
Fixed-Price - Intermediate ($$) - Est. Budget: $1,500 - Posted
I'd like to scrape/extract store locations for the companies/webpage interfaces provided below:
TWC - http://www.timewarnercable.com/en/support/twc-stores.html
Charter - https://www.charter.com/browse/content/store-locations-adp
Comcast - http://customer.xfinity.com/service-center-locations
Cox - http://www.cox.com/aboutus/contact-us/cox-centers.cox
Mediacom - https://mediacomcable.com/site/about_local.html
Ideally we'd gather the following for each location: street address; city, ST; zip code; phone; location type.
- For TWC, the location type is in the divs below "Location Options".
- For Charter, it's all the info under the phone number (it's not labeled, but lives under <p class="details">).
- For Comcast, it's next to the number in an H3.
- For Mediacom, it's in the H3. This could be tricky because the site sometimes requires you to choose the closest city when you put in a zip code. Whatever we can gather, though.
- Looking at Cox, it might not be possible to scrape - the data is in a Google map. Perhaps look into this and let me know?
Skills: Scrapy Web scraping
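For the Charter case above, the unlabeled text under `<p class="details">` could be pulled out with a small helper like this (a regex-based sketch for illustration; in a real Scrapy spider you would use CSS or XPath selectors instead, and each of the five sites needs its own selector):

```python
import re

def details_text(html: str):
    """Return the inner text of the first <p class="details"> block,
    one string per line - the unlabeled info that lives under the
    phone number on the Charter store pages."""
    m = re.search(r'<p class="details">(.*?)</p>', html, re.S)
    if not m:
        return []
    inner = re.sub(r"<br\s*/?>", "\n", m.group(1))   # line breaks -> newlines
    inner = re.sub(r"<[^>]+>", "", inner)            # strip remaining tags
    return [line.strip() for line in inner.splitlines() if line.strip()]
```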
Fixed-Price - Intermediate ($$) - Est. Budget: $60 - Posted
Hi there! I'd like you to scrape https://www.epcregister.com/reportSearchAddressByPostcode.html using the 4,000 postcodes in the attached file. For each postcode, I'd like you to scrape each address, and download the PDF reports for each one. When finished, I'd like you to send me a CSV file with a row for each address, related postcode, and the filename of the PDF, plus all the PDFs themselves in a zipped directory. I don't know exactly how many addresses/PDFs this will be - maybe 30,000. I'd also like to see your code. Scrapy would be ideal, but any Python scraper is fine. Thanks! Please get in touch with any questions.
Skills: Scrapy Python Web scraping
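The deliverable CSV described above - one row per address with its postcode and PDF filename - could be assembled with the standard library like this (a minimal sketch; the column names are assumptions):

```python
import csv
import io

def build_index(rows):
    """rows: iterable of (address, postcode, pdf_filename) tuples.
    Returns CSV text with a header row, one row per scraped address."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["address", "postcode", "pdf_filename"])
    for address, postcode, pdf in rows:
        writer.writerow([address, postcode, pdf])
    return buf.getvalue()
```

The PDF downloads themselves map naturally onto Scrapy's `FilesPipeline`, which also records the saved filename for each file URL.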