We are looking for someone to collect a large number of data fields (e.g. name, title (shortened), description, price, discount price, categories, and others) from different websites and/or APIs.
The scrapers must be written in Scrapy (Python >= 3.5) and store their results in PostgreSQL.
We need the scrapers to collect as many pages / results as possible without overwhelming any server or raising concern. The scrapers must be prepared to run 24/7, and, if they need to be restarted for any reason, we want to avoid re-scraping that day's source from scratch.
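One common way to meet the restart requirement is to record a fingerprint of every page already stored and skip it on the next run. The sketch below illustrates that idea; SQLite stands in for the PostgreSQL table purely so the example is self-contained, and the table and function names are illustrative, not part of the spec.

```python
import hashlib
import sqlite3

# Resume-without-rescraping sketch: keep a fingerprint of every URL already
# stored, so a restarted scraper can skip work it has done. SQLite stands in
# for PostgreSQL here; the SQL is the same shape ("INSERT ... ON CONFLICT DO
# NOTHING" in PostgreSQL, "INSERT OR IGNORE" in SQLite).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scraped (fingerprint TEXT PRIMARY KEY, url TEXT)")

def seen_before(url: str) -> bool:
    """Return True if this URL was already scraped; otherwise record it."""
    fp = hashlib.sha1(url.encode("utf-8")).hexdigest()
    cur = conn.execute(
        "INSERT OR IGNORE INTO scraped (fingerprint, url) VALUES (?, ?)",
        (fp, url),
    )
    conn.commit()
    # rowcount == 0 means nothing was inserted: the fingerprint already existed.
    return cur.rowcount == 0

first = seen_before("https://example.com/product/42")   # new URL -> False
second = seen_before("https://example.com/product/42")  # repeat  -> True
```

For Scrapy specifically, the built-in job persistence (running the crawl with a `JOBDIR` setting) also lets a spider pause and resume its request queue, which pairs well with a database-side check like the one above.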
The successful candidate must collect the data with speed and be highly accurate.
There are several sources to be scraped; the fixed price is per source.
You will only be paid once we have tested the output on our end.
Data Scraping, Web Extraction, Web Scraping
February 16, 2018
I am willing to pay higher rates for the most experienced freelancers