Scrapy Framework Jobs

9 were found based on your criteria


Fixed-Price - Est. Budget: $25 - Posted
I need to program a cron job in Python against http://www.avis.com/car-rental/avisHome/home.ac. Given 5 initial variables (dates and times for pick-up and return, and the pick-up location), and leaving all the other default values in the form, I need to retrieve a list of all available rental cars (category, model, 'pay now' price, and 'pay later' price).
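The posting boils down to submitting a search form on a schedule with five inputs fixed and everything else left at its defaults. A minimal sketch of the payload-building half follows; note the form field names here are invented placeholders, and the real names would have to be copied from the Avis page's HTML before any request is made.

```python
# Sketch of assembling the five-variable search payload.
# All field names below are hypothetical placeholders, NOT the real
# Avis form fields -- those must be read from the page source.
from datetime import date, time

def build_search_payload(pickup_date, pickup_time, return_date, return_time, location):
    """Build the form payload, leaving every other field at its default."""
    return {
        "pickup_date": pickup_date.isoformat(),        # hypothetical field name
        "pickup_time": pickup_time.strftime("%H:%M"),  # hypothetical field name
        "return_date": return_date.isoformat(),        # hypothetical field name
        "return_time": return_time.strftime("%H:%M"),  # hypothetical field name
        "pickup_location": location,                   # hypothetical field name
    }

payload = build_search_payload(date(2015, 6, 1), time(9, 0),
                               date(2015, 6, 5), time(17, 0), "JFK")
```

The cron half is one crontab line, e.g. `0 6 * * * python avis_check.py` to run the script daily at 06:00.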

Fixed-Price - Est. Budget: $300 - Posted
### Expert Level ### Please reply with real-world examples. If you have experience with Bloom filters, please let me know. Hi, I have a version of the application already built, but it isn't performing as I had hoped. I need a developer to help me improve the application's performance and accuracy. Below is what I want to happen, so be clear in your reply that you can perform this type of work. - The web scraping sort of works, but it really needs to become multi-threaded and a lot more robust (it currently breaks often); the basic stack is below, but I'm not averse to other technologies being used. - The crawler aims to collect hundreds of millions of rows of data from numerous content networks, so it needs to be able to manage that number of rows and the complications that brings to the table. - Post content to social networks: the current app doesn't post in the correct format (weird categories), so that needs to be fixed. Basic headlines - Create multi-threaded...
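The Bloom filter the poster asks about is the standard trick for de-duplicating hundreds of millions of crawled URLs in bounded memory: it answers "probably seen" or "definitely not seen" with no false negatives and a tunable false-positive rate. A minimal pure-Python sketch for orientation (a production crawler would use a tuned, C-backed library rather than this):

```python
# Minimal Bloom filter sketch for URL de-duplication at crawl scale.
# Sizing (2^20 bits, 7 hashes) is illustrative, not tuned for 100M+ rows.
import hashlib

class BloomFilter:
    """Probabilistic set: never misses an added item, rarely claims
    an item it has not seen."""

    def __init__(self, size_bits=1 << 20, num_hashes=7):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive k bit positions by salting a single hash function.
        for salt in range(self.num_hashes):
            digest = hashlib.md5(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

seen = BloomFilter()
seen.add("http://example.com/page1")
```

Re-crawling a URL already in the filter can then be skipped before any request is made, which is where the memory savings over a plain set of strings matter at this row count.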

Fixed-Price - Est. Budget: $10 - Posted
Need someone to fix a Python script built on the Scrapy framework with 2 functions: one scrapes all the information from viralnova.com, and the other posts this info to protatype.com. The script was working well, but it somehow broke down.

Fixed-Price - Est. Budget: $500 - Posted
I need 50 directory scraping scripts built in Scrapy and Python in the next 30-45 days. You would need to begin work relatively soon. A GUI and current interface for 110 scrapers already exists. You would be taking on an existing project and adding 50 scrapers; sometimes you will be fixing existing scrapers. The scrapers are hosted on my server, and everything is run from an online GUI. Please apply only if you have extensive Scrapy experience and Python knowledge. If I like your application, I will send you a video with full details of this project.

Fixed-Price - Est. Budget: $80 - Posted
I need to scrape newly added business listings from www.gumtree.com.au on a daily basis. The daily data must be provided to me in a CSV/Excel file. Data must be unique and must include phone numbers. Please go through the attached sample output structure. The job has the potential to go on for several years, provided quality and consistency are maintained. Note: apply only if you have done this previously and have a credible solution to overcome the obvious challenges of blocking as well as AU IP restrictions. If you wish to be considered, I would suggest trying out for 2-3 days by giving me daily sample records; this will give me an idea of your capability. Whoever bids $80 or below will be hired immediately. This is a long-term job, and if it succeeds and the cost is reasonable, it will go on for years. Regards, Wantro
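The "unique, with phone numbers" requirement implies tracking which phone numbers have already been delivered across days. A small sketch of that daily de-duplication step, with column names assumed for illustration (the real columns would come from the poster's sample attachment):

```python
# Daily de-dup sketch: drop listings with no phone number or a phone
# number already delivered on a previous day. Column names ("business",
# "phone", "suburb") are assumptions, not the poster's real schema.
import csv

def dedupe_listings(listings, seen_phones):
    """Keep only listings whose phone is present and not yet delivered."""
    fresh = []
    for row in listings:
        phone = (row.get("phone") or "").strip()
        if phone and phone not in seen_phones:
            seen_phones.add(phone)
            fresh.append(row)
    return fresh

def write_daily_csv(path, listings, fields=("business", "phone", "suburb")):
    """Emit one day's unique listings as the requested CSV file."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(fields))
        writer.writeheader()
        writer.writerows(listings)

seen = set()  # in practice, persisted between daily runs
today = dedupe_listings(
    [{"business": "A", "phone": "0400 000 001"},
     {"business": "B", "phone": ""},
     {"business": "C", "phone": "0400 000 001"}],
    seen,
)
```

The `seen` set would have to be persisted (file or database) between cron runs so uniqueness holds across the whole engagement, not just within one day.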

Fixed-Price - Est. Budget: $150 - Posted
Scrapy data collection for 5 sites, 1 JSON file as output per website:
www.bigw.com.au
www.kogan.ccom/au
www.myer.com
www.target.com.au
www.oo.com.au
Items to collect per product:
Product URL
Price
Product image URL
Product description
SKU
Category description
Product sizes
Sale price
Normal price
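Since all five spiders must emit the same record shape, it helps to pin that shape down once. A framework-agnostic sketch of the item schema, with field names paraphrased from the list above; Scrapy's built-in feed export (e.g. `scrapy crawl bigw -o bigw.json`) would then produce the one-JSON-file-per-site output the posting asks for.

```python
# Common item shape every spider would emit. Field names are paraphrased
# from the posting's list; sample values below are made up.
ITEM_FIELDS = [
    "product", "product_url", "price", "image_url", "description",
    "sku", "category", "sizes", "sale_price", "normal_price",
]

def make_item(**kwargs):
    """Build an item dict, defaulting missing fields to None so every
    site's JSON output has an identical schema."""
    unknown = set(kwargs) - set(ITEM_FIELDS)
    if unknown:
        raise ValueError(f"unexpected fields: {unknown}")
    return {field: kwargs.get(field) for field in ITEM_FIELDS}

item = make_item(product="Example Kettle", sku="ABC123", price="49.00")
```

Rejecting unknown field names up front keeps a typo in one of the five spiders from silently producing a divergent schema in that site's file.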

Fixed-Price - Est. Budget: $40 - Posted
We need someone who can scrape a website for us and provide the script for it. Candidate MUST:
- have great English communication skills
- deliver within 3 days
- provide clean code and a clean, meaningful MySQL database or .csv
- use http://scrapy.org/ or a similar tool, or convince us there is a better method to scrape
You will:
- scrape one or more sites for us and provide all the data and an automated script to duplicate the work
- make the script work under any and all conditions: it should take care of delays, IP suspensions, etc.
- comment your script properly and create very clean code
- assign meaningful variable names
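The "delays, IP suspensions" requirement maps directly onto Scrapy's throttling and retry settings. A starting-point sketch of the relevant `settings.py` fragment; the numbers are illustrative and would need tuning per target site, and the commented proxy line refers to the third-party scrapy-rotating-proxies middleware, not core Scrapy.

```python
# Illustrative Scrapy settings.py fragment for polite, fault-tolerant
# crawling. Values are starting points, not tuned for any specific site.
DOWNLOAD_DELAY = 2.0                  # seconds between requests to a domain
RANDOMIZE_DOWNLOAD_DELAY = True       # jitter the delay (0.5x-1.5x)
AUTOTHROTTLE_ENABLED = True           # back off automatically under load
CONCURRENT_REQUESTS_PER_DOMAIN = 1    # one request at a time per site
RETRY_ENABLED = True
RETRY_TIMES = 5                       # retry transient failures
RETRY_HTTP_CODES = [429, 500, 502, 503, 504]
ROBOTSTXT_OBEY = True
# ROTATING_PROXY_LIST = [...]         # scrapy-rotating-proxies, for IP bans
```

AutoThrottle plus retries covers ordinary slowdowns and transient errors; genuine IP suspensions need a proxy-rotation middleware layered on top, which is why that line is left as a labeled placeholder.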