
Scrapy Framework Jobs

22 jobs were found based on your criteria


Fixed-Price - Est. Budget: $100 - Posted
We need a web crawler built on the Scrapy framework to scrape the Kickstarter site and store the data in MongoDB. A detailed specification will be provided.
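
A minimal sketch of the kind of setup this post describes: a Scrapy item pipeline that writes each scraped item into MongoDB via pymongo. The collection, setting names, and connection defaults are assumptions, not part of the brief; the pipeline would be enabled through the project's ITEM_PIPELINES setting.

    # Hypothetical pipeline for storing scraped Kickstarter data in MongoDB.
    import pymongo

    class MongoPipeline(object):
        def __init__(self, mongo_uri, mongo_db):
            self.mongo_uri = mongo_uri
            self.mongo_db = mongo_db

        @classmethod
        def from_crawler(cls, crawler):
            # Read connection details from Scrapy settings (names are illustrative).
            return cls(
                mongo_uri=crawler.settings.get('MONGO_URI', 'mongodb://localhost:27017'),
                mongo_db=crawler.settings.get('MONGO_DATABASE', 'kickstarter'),
            )

        def open_spider(self, spider):
            self.client = pymongo.MongoClient(self.mongo_uri)
            self.db = self.client[self.mongo_db]

        def close_spider(self, spider):
            self.client.close()

        def process_item(self, item, spider):
            # Each scraped item becomes one document in a 'projects' collection.
            self.db['projects'].insert_one(dict(item))
            return item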

Hourly - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
I have the Scrapy framework installed. I'm looking for an expert in Scrapy spider configuration to mine multiple sources.
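
A rough illustration of one way to point a single spider at multiple sources: several start URLs handled by one parse callback. The domains and CSS selectors are placeholders.

    import scrapy

    class MultiSourceSpider(scrapy.Spider):
        name = 'multi_source'
        start_urls = [
            'http://example-source-one.com/listings',
            'http://example-source-two.com/listings',
        ]

        def parse(self, response):
            # One record per listing row; selectors are guesses, not real markup.
            for row in response.css('div.listing'):
                yield {
                    'source': response.url,
                    'title': row.css('h2::text').get(),
                    'link': response.urljoin(row.css('a::attr(href)').get()),
                }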

Hourly - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
We need an experienced Scrapy developer to help fix some bugs and build out small features for an existing project. We have an existing pipeline and need to clean up some data before it gets saved to the MySQL database. We would also like to scrape some extra fields.
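
A hedged sketch of what "clean up some data before it gets saved to MySQL" might look like: one pipeline stage normalises fields, and a second writes rows with pymysql. Table, column, and credential names are assumptions, and the items are treated as plain dicts.

    import pymysql

    class CleanItemPipeline(object):
        def process_item(self, item, spider):
            # Strip whitespace and normalise the price field before storage.
            item['title'] = (item.get('title') or '').strip()
            item['price'] = float((item.get('price') or '0').replace('$', '').replace(',', ''))
            return item

    class MySQLPipeline(object):
        def open_spider(self, spider):
            self.conn = pymysql.connect(host='localhost', user='scrapy',
                                        password='secret', database='products')

        def close_spider(self, spider):
            self.conn.close()

        def process_item(self, item, spider):
            with self.conn.cursor() as cur:
                cur.execute('INSERT INTO products (title, price) VALUES (%s, %s)',
                            (item['title'], item['price']))
            self.conn.commit()
            return item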

Hourly - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
Overview: I put a brief up a few weeks ago, but because the requirements have evolved significantly, I have had to rewrite and repost this. To give a bit of background on my client: they have a number of e-commerce websites, and the markets they work in are highly competitive. As such, they need to ensure their pricing is always as competitive as possible. Their current process involves manually reviewing their competitors by browsing Google Shopping on an ad-hoc basis, searching each of their products one by one. Any product sold cheaper by a competitor is noted down and changed on their own website. It is a very slow and rather unsustainable process, considering they want to grow the business. To support their goal of always having the most competitive pricing, they need to automate the way in which they obtain this information.

Requirements: While the client has a number of websites that they will eventually want the tool to support, for the requirements...
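
Purely illustrative: once competitor prices have been scraped, flagging which products are being undercut is a simple comparison. The data shapes below (SKU-keyed dicts) are assumptions, not part of the brief.

    def find_undercut_products(own_prices, competitor_prices):
        """own_prices: {sku: price}; competitor_prices: {sku: [price, ...]}."""
        undercut = {}
        for sku, own in own_prices.items():
            # Cheapest competitor price for this SKU, if any were scraped.
            cheapest = min(competitor_prices.get(sku, []), default=None)
            if cheapest is not None and cheapest < own:
                undercut[sku] = {'own': own, 'cheapest_competitor': cheapest}
        return undercut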

Fixed-Price - Est. Budget: $10 - Posted
Need someone to fix a Python script built on the Scrapy framework with two functions: one scrapes all the information from viralnova.com, and the other posts this info to protatype.com. The script was working well but somehow broke down.
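
A loose sketch of the two-part structure described (scrape one site, re-post to another), assuming the second site accepts a JSON POST; the endpoint, selectors, and field names are hypothetical placeholders, not taken from the original script.

    import requests
    import scrapy

    class ViralNovaSpider(scrapy.Spider):
        name = 'viralnova'
        start_urls = ['http://www.viralnova.com/']

        def parse(self, response):
            # Selectors are placeholders for whatever markup the site actually uses.
            for post in response.css('article'):
                yield {
                    'title': post.css('h2 a::text').get(),
                    'url': response.urljoin(post.css('h2 a::attr(href)').get()),
                }

    class RepostPipeline(object):
        def process_item(self, item, spider):
            # Forward each scraped item to the second site (endpoint is hypothetical).
            requests.post('http://protatype.com/api/posts', json=dict(item), timeout=10)
            return item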

Fixed-Price - Est. Budget: $500 - Posted
I need 50 directory scraping scripts built in Scrapy and Python in the next 30-45 days. You would need to begin work relatively soon. A GUI and current interface for 110 scrapers already exists. You would be taking on an existing project and adding 50 scrapers. Sometimes you will be fixing existing scrapers. Scrapers are hosted on my server, and everything would be run from an online GUI. Please apply only if you have extensive Scrapy experience and Python knowledge. If I like your application, I will send you a video with full details of this project.
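
A generic shape for one of the directory scrapers described: a CrawlSpider that follows pagination and listing detail pages. The domain, URL patterns, and selectors are placeholders only.

    from scrapy.spiders import CrawlSpider, Rule
    from scrapy.linkextractors import LinkExtractor

    class DirectorySpider(CrawlSpider):
        name = 'example_directory'
        allowed_domains = ['example-directory.com']
        start_urls = ['http://example-directory.com/listings?page=1']

        rules = (
            # Follow pagination links.
            Rule(LinkExtractor(allow=r'/listings\?page=\d+')),
            # Parse each listing detail page.
            Rule(LinkExtractor(allow=r'/business/\d+'), callback='parse_listing'),
        )

        def parse_listing(self, response):
            yield {
                'name': response.css('h1::text').get(),
                'phone': response.css('.phone::text').get(),
                'address': ' '.join(response.css('.address ::text').getall()).strip(),
            }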

Hourly - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
We are seeking Python developers to code spiders for crawling e-commerce websites. See the attached document for details on the work. There will be a test job with a payment of 25 USD (15 USD for partial but acceptable functionality). Example code will also be provided. Our usual payment is per spider, but you can bill it hourly on oDesk.
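
A hedged guess at the deliverable's shape: a Scrapy Item defining the product fields an e-commerce spider would populate. The actual fields would come from the attached document, which is not reproduced here.

    import scrapy

    class ProductItem(scrapy.Item):
        name = scrapy.Field()
        sku = scrapy.Field()
        price = scrapy.Field()
        currency = scrapy.Field()
        in_stock = scrapy.Field()
        url = scrapy.Field()
        image_urls = scrapy.Field()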

Fixed-Price - Est. Budget: $300 - Posted
Guy Gunter Home is seeking an experienced Python programmer to scrape all of the product data from the website for Grohe (http://www.grohe.com/us). Please read the prerequisites carefully, as they are a minimum requirement. Full compensation depends on the completeness of the data scraped, which will take a moderate to above-average skill level to acquire. The output product data must follow fairly stringent standards, and you will be given a tool to check the quality of your work prior to submitting for payment. We will also check the data using the same tool, and if it is up to standards, we will pay the full amount for the scraper. To be clear, it is the scraper we want, but we need to validate that the data it produces fits within our guidelines.

Prerequisites:
- Scrapy. All of our scrapers must be done in Scrapy.
- Python. PEP-8 or bust.
- Git
- The ability to check one's own work

Notes:
- Grohe uses a custom web framework made by a German company. All of the products...
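
One way to keep scraper output consistent enough to pass a validation tool, sketched under assumptions: an ItemLoader whose processors normalise every field before it is emitted. The field names and selectors are guesses, not the client's schema.

    import scrapy
    from scrapy.loader import ItemLoader
    from itemloaders.processors import MapCompose, TakeFirst

    class GroheProduct(scrapy.Item):
        name = scrapy.Field()
        model_number = scrapy.Field()
        description = scrapy.Field()

    class GroheProductLoader(ItemLoader):
        default_item_class = GroheProduct
        default_input_processor = MapCompose(str.strip)   # trim whitespace on the way in
        default_output_processor = TakeFirst()            # emit one clean value per field

    # Illustrative use inside a spider callback:
    #   loader = GroheProductLoader(response=response)
    #   loader.add_css('name', 'h1.product-title::text')
    #   loader.add_css('model_number', '.sku::text')
    #   yield loader.load_item()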