Web Scraping Jobs

375 jobs were found based on your criteria

Fixed-Price - Entry Level ($) - Est. Budget: $20 - Posted
I need a number of content items scraped from PDFs. Ideally someone with a background in the education sector (preferred, but not required) who is fluent in Filipino. This is a one-time project, but you can split it into weekly submissions as long as you meet the quota.
Skills: Web scraping, Data Entry, Data scraping
Fixed-Price - Entry Level ($) - Est. Budget: $20 - Posted
Hi, I need someone to find and fill in information for 6 fields in the Excel file I have uploaded. We require: Industry, Company Name, Phone, First Name, Surname, Role. To guide you, 44 industries have been listed, and other example industries are listed in red. The contact person and role I'm looking for will be either an e-commerce Manager, Online Manager, or Marketing Manager. Thanks
Skills: Web scraping, Data Entry, Research
Fixed-Price - Intermediate ($$) - Est. Budget: $300 - Posted
I need you to develop some software for me, for Windows or Mac. Hi, I want an adidas.com software made. The software would add the shoes to cart with this link (backdoor link): http://www.adidas.com/on/demandware.store/Sites-adidas-US-Site/en_US/Cart-MiniAddProduct?layer=Add+To+Bag+overlay&pid=S77510_600&Quantity=1...... (S77510 is the SKU for the product; 600 is the size code; message me for the full link and specifics.)
The software would have to harvest the reCAPTCHA token automatically from the adidas server, or use a third-party service such as 2captcha to solve the captchas and get the tokens. Or you can use the site key (which I have) to get the captcha. Once you solve the captcha, right-click and inspect element, go to Network, right-click on "useverify" and copy the response; that will get you the captcha token. (I need this automated.) Once the software has the captcha token, it then pastes it into the backdoor link above.
IF YOU UNDERSTAND THIS PROCESS SO FAR, PLEASE MESSAGE ME AND PUT A QUOTE DOWN. I'm not accepting any developers with under 100 feedbacks and/or who don't want to do the transaction directly through freelancer.com. Please message me and I can explain everything in full detail over the phone, email, or the Freelancer messaging system.
Here's what the software needs to do:
- add the shoes through the backdoor link
- be able to change the pid and size SKU within the bot
- multithreaded
- proxy support
Optionally:
- automatically check out and complete the order (checkout with credit card and PayPal)
Lastly, I want highly motivated and productive developers THAT ARE NOT WORKING ON MULTIPLE PROJECTS AT ONCE. I want my project to be the priority, and I appreciate it. If you are not serious about completing this then please move along. I AM LOOKING TO HAVE A LONG-TERM BUSINESS RELATIONSHIP WITH YOU AND I WILL DO EVERYTHING IN MY POWER TO HAVE THE BEST WORKING RELATIONSHIP. Thank you
Skills: Web scraping, C, C++, HTML
Fixed-Price - Expert ($$$) - Est. Budget: $100 - Posted
I want to hire a Python/Scrapy expert to build, and teach me how to use, a Scrapy bot that does the following: read a text file with a seed list of around 100k URLs, visit each URL, extract all external URLs (URLs of other sites) found on each seed URL, and export the results to a separate text file. Scrapy should only visit the URLs in the text file, not spider out and follow any other URLs. I want Scrapy to work as fast as possible; I don't need proxy support. Domains that return 403 errors should be exported to a separate text file. I also want to be told how I could scale my link extraction for more speed, so I can parse millions of URLs per day.
Skills: Web scraping, Web Crawling, Python, Scrapy
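The core of a job like this is the external-link filter: resolve each href against the page URL and keep only links whose host differs from the page's. A minimal standard-library sketch of that filter is below; in a Scrapy spider the same check would sit in the parse callback, with the seed file fed in via start_requests and no link-following enabled (so only seed URLs are visited).

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags as the page is parsed."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def external_links(page_url, html):
    """Return absolute URLs on the page whose host differs from page_url's."""
    parser = LinkExtractor()
    parser.feed(html)
    page_host = urlparse(page_url).netloc
    out = []
    for href in parser.hrefs:
        absolute = urljoin(page_url, href)  # resolve relative hrefs
        if urlparse(absolute).netloc not in ("", page_host):
            out.append(absolute)
    return out
```

For the scale the posting asks about (millions of URLs/day), throughput comes less from the parser than from concurrency settings: in Scrapy that means raising CONCURRENT_REQUESTS, disabling robots/middleware you don't need, and sharding the seed file across processes.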
Hourly - Expert ($$$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
Need a small utility to monitor public filings every [x] minutes for new filings, then to find the last [3] filings of the same type posted by a related entity. To do this, the utility will have to query sec.gov periodically, and when a new filing is found it will need to take a number of subsequent steps, where each step generally involves parsing HTML tables and text to extract pieces of data, using that data to build a new query, then submitting the query (and repeating). For each new filing found, the utility should simply create a file that lists the URL of the new filing followed by the URLs of the prior related filings.
Skills: Web scraping
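EDGAR can return filing lists as Atom feeds, which are easier to parse than the HTML pages; the exact feed URL and parameters are left out here as an assumption for the poller to fill in. The two testable pieces, parsing a feed into (title, URL) pairs and diffing against already-seen filings, can be sketched with the standard library:

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom XML namespace


def parse_filings(atom_xml):
    """Extract (title, link) pairs from an EDGAR-style Atom feed."""
    root = ET.fromstring(atom_xml)
    filings = []
    for entry in root.iter(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="")
        link = entry.find(ATOM + "link")
        href = link.get("href") if link is not None else ""
        filings.append((title, href))
    return filings


def new_filings(filings, seen):
    """Return filings whose URL has not been seen yet; updates `seen` in place."""
    fresh = [f for f in filings if f[1] not in seen]
    seen.update(f[1] for f in fresh)
    return fresh
```

The outer loop, fetching the feed every [x] minutes, diffing with new_filings, then issuing the follow-up queries for prior related filings, wraps around these two functions.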
Fixed-Price - Intermediate ($$) - Est. Budget: $500 - Posted
I need a data researcher to search the internet and prepare a list of tutoring centers and private tutors for major US cities such as New York, New Jersey, Austin, Houston, Los Angeles, etc. The typical data I need: Name, Email, Website, Contact Number, City. I am looking for at least 1,000 records for each of ten major cities.
Skills: Web scraping, Data mining, Data scraping, Google search
Hourly - Entry Level ($) - Est. Time: 1 to 3 months, 30+ hrs/week - Posted
Hi! I'm looking for someone familiar with Python, Django, Celery, Big Data, and Postgres. We are currently working on a SaaS product and need existing code improved and enhanced. You'll be dealing with a SaaS product (not yet in production nor set up on a live server) for lead generation: company and contact information shall be obtained automatically by the system, giving the user the ability to find prospective customers. The following things need to be done and integrated into the existing code:
- improve/extend features of the existing code, which includes a crawler/scraper
- check and debug the Celery workers/tasks so they work properly again (maybe separate different tasks onto different workers; check why saving problems occur)
- improve the code and make it more efficient and faster; consider scalability
- improve the regexes
- if multiple addresses have been found for a company, the one with the highest identity factor should be chosen and shown as the main address
- complete sites and all related subpages of websites should be downloaded and stored in the DB (corporate website, e.g. 1) Home 2) About Us 3) News 4) Team 5) Customers 6) Products 6a) Product A 6b) Product B ... etc.)
- subsequently, all the important text (about us / home / product texts) shall be extracted and saved in the main database, directly associated with the company
- optional: possibly being able to include the front end
Skills: Web scraping, Web Crawling, Data mining, Data scraping
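One of the bullets, choosing the main address by highest identity factor, isolates cleanly into a pure function. The field name identity_factor and the dict shape are assumptions here (the posting doesn't define how the factor is stored), but the selection logic is just a keyed max:

```python
def pick_main_address(candidates):
    """Pick the main address from scraped candidates.

    Each candidate is assumed to look like
    {"address": "...", "identity_factor": float}; returns the candidate
    with the highest identity factor, or None if the list is empty.
    """
    if not candidates:
        return None
    return max(candidates, key=lambda c: c.get("identity_factor", 0.0))
```

Keeping this as a standalone function (rather than inline in a Celery task) makes the "check why saving problems occur" debugging easier, since the selection can be unit-tested without a worker running.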
Fixed-Price - Entry Level ($) - Est. Budget: $100 - Posted
I need someone to write a script that will:
1) navigate to a web page,
2) select a class from a drop-down menu,
3) insert a date from a text list in another field,
4) click a "Submit Search" button,
5) navigate to a new window containing the search results,
6) save the results page as an HTML file,
7) click a link in each of the multiple rows of the results page,
8) save each new results page as an HTML file,
9) move on to the next row and repeat the process until the last row on the page,
10) check for additional pages listed in comb,
11) close the results window,
12) return to the search page, verify the same class is selected, insert a new date, press "Submit Search", and repeat the process until done.
The chosen applicant will write the necessary script and perform the web scraping operation for approximately 115 dates. Complete specifications will be provided to selected qualified applicants.
Skills: Web scraping, Data scraping, Python
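The twelve steps reduce to a nested loop: per date, save the results page, save each row's detail page, then move to any additional results pages. The posting doesn't name a browser tool, so a driver such as Selenium is an assumption; the sketch below keeps the browser actions behind injected callables so the control flow itself runs (and is testable) without one.

```python
def scrape_dates(dates, search, open_row, save_page, next_page=None):
    """Drive the search-and-save loop from steps 1-12.

    search(date)          -> first results page for that date (steps 1-5)
    save_page(page, name) -> persist a page as an HTML file (steps 6, 8)
    open_row(page, i)     -> detail page for row i, or None past the last row (steps 7, 9)
    next_page(page)       -> following results page, or None (step 10)
    """
    saved = []
    for date in dates:                       # step 12: one fresh search per date
        page = search(date)
        pno = 0
        while page is not None:
            name = f"results_{date}_p{pno}"
            save_page(page, name)            # step 6: save the results page
            saved.append(name)
            i = 0
            while True:
                detail = open_row(page, i)   # step 7: click the link in row i
                if detail is None:
                    break                    # step 9: past the last row
                row_name = f"{name}_row{i}"
                save_page(detail, row_name)  # step 8: save the detail page
                saved.append(row_name)
                i += 1
            pno += 1
            page = next_page(page) if next_page else None  # step 10
    return saved
```

With the loop factored out this way, the ~115 dates become just the `dates` argument, and the four callables are the only place browser-specific code lives.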
Fixed-Price - Intermediate ($$) - Est. Budget: $60 - Posted
Thank you. I would like to start with the following task.
TASK: I need to find the best car repair shop or car mechanic in either Los Angeles County or Orange County, California, to repair or replace a catalytic converter and a front header pipe. To determine which shop or mechanic is the "best", use these three (3) criteria: 1) Price (lowest/cheapest price), 2) Quality (trustworthiness: are lots of previous customers very satisfied with their work?), and 3) Speed (fast: must complete the full repair/replacement in 1 day, assuming all necessary parts are ordered). Attached is a document explicitly explaining how to measure these 3 criteria, along with negotiation tactics to reduce the price as much as possible.
Be creative with the sources you use to find these repair shops and mechanics; you are the web research expert. As an example, you can search yellowpages.com for "Car Repair & Service" near Los Angeles, CA or Anaheim, CA, then call different shops with 5-star, 4.5-star, and 4-star ratings, find out how much they charge, and generate a custom price. The repair shop or mechanic can be anywhere in Los Angeles County or Orange County, CA. Look at a map on the internet to see where the county boundary lines are, so you know which cities fall inside them and where you can research. Try lots of different cities to get an idea of which ones will provide the lowest prices with consistently great quality and speed.
When you find a qualified mechanic or repair shop, please enter them into a spreadsheet. Include 1) name, 2) address, 3) price, 4) quality rating, 5) a description of how you found this price / how the phone discussion went (a few words is enough unless there were complications), 6) a description of any recent negative reviews/ratings you found, and 7) any other comments you may have.
REMEMBER: THEY ARE ONLY QUALIFIED IF THEY CAN COMPLETE THE REPAIR/REPLACEMENT IN ONE DAY'S TIME. I need 10-20 qualified mechanics/repair shops, with prices under $1,700 for the job. This assignment must be completed within 10 hours. There are bonuses, so aim for more than 10-20 mechanics/shops if you start to get close to the BONUS levels. Please call me beforehand for an interview. P.S. There are bonuses of $100 up to $250 if you perform above and beyond the call of duty and find a truly amazing and unbeatable deal. Details are in the attached worksheet.
Skills: Web scraping, Administrative Support, Data Entry, Data mining