Data Scraping Jobs

351 jobs were found based on your criteria

Hourly - Intermediate ($$) - Est. Time: Less than 1 month, Less than 10 hrs/week - Posted
We're not entirely sure how you would go about this, so we're open to ideas. We would also need someone with blasting software for texts / emails / voicemail drops once we get the data. Please respond with the kind of results you think you can achieve and a general overview of how you plan to accomplish it. Thanks!
Skills: Data scraping, Data mining, Email Marketing, Lead generation
Fixed-Price - Intermediate ($$) - Est. Budget: $50 - Posted
GOAL: Podcast subscribers, downloaders, and reviewers in iTunes.
SKILLS: Mechanical Turk API, web hooks, HTML, CSS (and other knowledge required for HITs on Amazon Mechanical Turk / MTurk).
OUTCOME: Someone downloads all 5 podcast episodes to their computer, subscribes to the podcast, and leaves either a 5-star rating or a custom written review (written reviews are valued highest).
The Mechanical Turk (MTurk) setup will need to enforce the following:
- Workers MUST already have an iTunes account
- Workers MUST be located IN the United States
- Workers MUST provide a screenshot of the 5 downloaded podcast files
- Workers MUST provide a screenshot of their 5-star review in the iTunes application with the subscription selected (to show they have already subscribed)
IMPORTANT NOTES:
- We will need to be able to limit the requests to between 10 and 100
- The results must automatically populate a document (or web page); your decision
Skills: Data scraping, Mechanical Turk API, Microsoft Excel
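A minimal sketch of how the HIT side of this posting might be set up with the MTurk API via boto3, treating the 10-100 limit as MaxAssignments and the US-only rule as the built-in locale qualification. The external form URL is a placeholder; the iTunes-account requirement and the two screenshots cannot be enforced by a qualification, so that hypothetical form would have to collect and verify them before assignments are approved.

    import boto3

    # MTurk client (sandbox endpoint shown; drop endpoint_url for the live marketplace).
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # ExternalQuestion pointing at a hypothetical form that collects the two
    # required screenshots (downloaded episodes + 5-star review with subscription).
    external_question = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://podcast-task.example.com/hit</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>"""

    response = mturk.create_hit(
        Title="Download, subscribe to, and review a podcast in iTunes",
        Description="Download all 5 episodes, subscribe, and leave a 5-star rating or written review.",
        Keywords="podcast, itunes, review",
        Reward="1.00",                     # per-assignment payment (assumption)
        MaxAssignments=100,                # caps the requests between 10 and 100
        LifetimeInSeconds=7 * 24 * 3600,   # how long the HIT stays available
        AssignmentDurationInSeconds=3600,  # time a worker has to finish one assignment
        Question=external_question,
        QualificationRequirements=[
            {   # restrict to workers located in the United States
                "QualificationTypeId": "00000000000000000071",  # built-in Locale qualification
                "Comparator": "EqualTo",
                "LocaleValues": [{"Country": "US"}],
            },
        ],
    )
    print("HIT created:", response["HIT"]["HITId"])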
Hourly - Expert ($$$) - Est. Time: Less than 1 week, 10-30 hrs/week - Posted
Looking for someone who is SSH-savvy and can automate an SSH script to pull report data from several Ubiquiti Rocket M5 and Rocket AC access points and post it to a website. It needs to be either fully automated or able to run with one command to get the results. We would like to see the data gathered every 24 hours in one place for all access points. This will help us more easily keep track of how many connections/clients are on each access point.
Skills: Data scraping, HTML, JavaScript, Microsoft Excel
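One way a bidder might approach this, sketched with paramiko and a daily cron job: SSH into each access point, count associated stations, and append the results to a shared CSV that a simple web page can render. The host list, the credentials, and the AirOS `wstalist` command (which prints the station list as JSON on many Rocket firmware builds) are assumptions to adjust for the actual devices.

    import csv
    import datetime
    import json

    import paramiko

    # Hypothetical list of access points to poll; replace with the real hosts.
    ACCESS_POINTS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]
    USERNAME = "ubnt"
    PASSWORD = "change-me"

    def client_count(host):
        """SSH into one AP and count associated stations via `wstalist` (assumed command)."""
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(host, username=USERNAME, password=PASSWORD, timeout=10)
        try:
            _, stdout, _ = ssh.exec_command("wstalist")
            stations = json.loads(stdout.read().decode() or "[]")
            return len(stations)
        finally:
            ssh.close()

    def main():
        now = datetime.datetime.now().isoformat(timespec="seconds")
        with open("ap_clients.csv", "a", newline="") as f:
            writer = csv.writer(f)
            for host in ACCESS_POINTS:
                try:
                    writer.writerow([now, host, client_count(host)])
                except Exception as exc:  # keep polling the rest if one AP is down
                    writer.writerow([now, host, "error: " + str(exc)])

    if __name__ == "__main__":
        main()

Running this from cron once a day (e.g. at 03:00) and serving the CSV, or a page that renders it, would give the requested single view across all access points.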
Fixed-Price - Intermediate ($$) - Est. Budget: $50 - Posted
Hello, I need some code that will help me mine data from a website. See the attached PowerPoint for a sample of what needs to be done. I could run the code from my computer; it does not need to be automated. I would provide a table with, say, 100 base data points. The code would enter the first line of base data into an input box, hit enter, scrape the resulting value returned by the website, store it back in the table, and then move on to the next record in the table and repeat. There are 30 categories, and the code would repeat this logic in each category. I could manually select the category and then run the code if that makes it easier. See the sample at this link: https://dl.dropboxusercontent.com/u/5787670/Scraper.ppt
Skills: Data scraping, Web scraping
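A minimal Selenium sketch of the loop described above. Since the actual site and element names are only in the attached PowerPoint, the URL, the `query` input, and the `result-value` element are all placeholders, and the category is assumed to be selected manually before each run.

    import csv

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.common.keys import Keys
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait

    # Placeholder base data; in practice these are the ~100 rows from the provided table.
    base_data = ["point-001", "point-002", "point-003"]

    driver = webdriver.Chrome()
    driver.get("https://example.com/lookup")   # placeholder URL from the PowerPoint

    # The category is assumed to be selected manually in the browser before this runs;
    # repeat the run once per category (30 in total).
    rows = []
    for point in base_data:
        box = WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.NAME, "query"))       # placeholder selector
        )
        box.clear()
        box.send_keys(point, Keys.ENTER)
        value = WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, "result-value"))  # placeholder selector
        ).text
        rows.append([point, value])

    driver.quit()

    with open("results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["input", "result"])
        writer.writerows(rows)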
Hourly - Entry Level ($) - Est. Time: Less than 1 week, Less than 10 hrs/week - Posted
Looking for someone to: visit popular job sites in cities I provide, use keywords I provide, and get all the individual phone numbers from each individual ad in the search results (thousands of ads). Automated or manual, it matters not; the lowest rate wins... Looking for a thousand records, a quick turnaround time, and basically an audition for ongoing work...
Skills: Data scraping, Data Entry, Data mining, Internet research
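If done with a script, the extraction step reduces to pulling US-format phone numbers out of each ad page. A rough sketch with requests, BeautifulSoup, and a regex; the ad URLs are placeholders, since in practice they would come from the search-result pages for the supplied cities and keywords.

    import csv
    import re

    import requests
    from bs4 import BeautifulSoup

    # US-style phone numbers: 555-123-4567, (555) 123-4567, 555.123.4567, etc.
    PHONE_RE = re.compile(r"\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}")

    # Placeholder ad URLs; replace with the links collected from the search results.
    ad_urls = [
        "https://example-jobsite.com/ad/1001",
        "https://example-jobsite.com/ad/1002",
    ]

    with open("phones.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ad_url", "phone"])
        for url in ad_urls:
            html = requests.get(url, timeout=15).text
            text = BeautifulSoup(html, "html.parser").get_text(" ")
            for phone in sorted(set(PHONE_RE.findall(text))):
                writer.writerow([url, phone])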
Fixed-Price - Intermediate ($$) - Est. Budget: $75 - Posted
I am looking for someone to scrape text from links and H1s for specific pages:
- States -> all on one page (text)
- Cities -> of each state (text)
- Facilities -> of each city (text)
- H1s -> of each school (text)
You should be familiar with webscraper.io, import.io, or something similar. This is not a manual labor job but a script-and-go job. You should be able to set up the script based on the same actions, since the pages all follow the same structure. If you write a script to gather one exact chain (state -> all the cities in the state -> all the facilities in those cities -> H1 of each facility page), you should be able to duplicate it for all states and run it based on the clean HTML structure of the page we are scraping. Here are my rules for the scraped data:
- States: all the states
- Cities: all the cities in those states
- Facilities: all the facilities in those cities and states
- H1 for the Facility page: all the page titles of the facilities in each city of each state
Skills: Data scraping, JavaScript, Web scraping
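A compact sketch of that state -> city -> facility -> H1 chain with requests and BeautifulSoup, writing one row per facility. The base URL and the three CSS selectors are placeholders standing in for the site's repeated list markup.

    import csv
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    BASE = "https://example-directory.com"   # placeholder for the actual site

    def links(url, selector):
        """Return (link text, absolute URL) pairs for one CSS selector on one page."""
        soup = BeautifulSoup(requests.get(url, timeout=15).text, "html.parser")
        return [(a.get_text(strip=True), urljoin(url, a["href"]))
                for a in soup.select(selector)]

    rows = []
    # Placeholder selectors; swap in the real ones from the site's list structure.
    for state, state_url in links(BASE + "/states", "ul.states a"):
        for city, city_url in links(state_url, "ul.cities a"):
            for facility, facility_url in links(city_url, "ul.facilities a"):
                page = BeautifulSoup(requests.get(facility_url, timeout=15).text, "html.parser")
                h1 = page.find("h1")
                rows.append([state, city, facility, h1.get_text(strip=True) if h1 else ""])

    with open("facilities.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["state", "city", "facility", "h1"])
        writer.writerows(rows)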
Hourly - Intermediate ($$) - Est. Time: More than 6 months, Less than 10 hrs/week - Posted
We are seeking an individual who is excellent at mining data in the luxury market as well as the luxury travel industry; the contacts must be high-net-worth individuals in select zip codes. The goal is to generate a constant supply of personal contacts (name, email, address) scraped from zip codes, Instagram, Twitter, LinkedIn, organizations, websites, etc. Luxury market in the USA only, starting with the East Coast. This is an ongoing job: we will need you to deliver a constant supply of emails on a monthly basis. We will need some samples to test.
Skills: Data scraping, Data mining, Web scraping
Fixed-Price - Intermediate ($$) - Est. Budget: $80 - Posted
I need someone to do some copy-and-paste work for a PDF file that needs to be converted into Excel. I also need someone with great experience in internet research who can find names, businesses, and contact info and paste them into Excel, a spreadsheet, or an alternative. I will pay extra if you can email those leads with the template I will provide. This is going to be long-term work if you show results fast.
Skills: Data scraping, Administrative Support, Data Entry, Internet research
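If the PDF contains real text rather than scanned images, a short script can handle most of the copy-and-paste step. A sketch with pdfplumber and pandas, assuming the tables are detectable and the first row of each table is a header; the file names are placeholders.

    import pandas as pd
    import pdfplumber

    tables = []
    with pdfplumber.open("source.pdf") as pdf:          # placeholder input file
        for page in pdf.pages:
            for table in page.extract_tables():
                if table and len(table) > 1:
                    # Treat the first row as the header row (assumption).
                    tables.append(pd.DataFrame(table[1:], columns=table[0]))

    if tables:
        # Writing .xlsx requires the openpyxl package to be installed.
        pd.concat(tables, ignore_index=True).to_excel("converted.xlsx", index=False)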
Fixed-Price - Expert ($$$) - Est. Budget: $300 - Posted
Looking for an experienced scraper to scrape data from an e-commerce website. The extracted data should be collected into a Google Spreadsheet document. To be considered, please provide your hourly rate and the estimated number of hours to complete the job. Please see the detailed job description below.
Job Description: Scrape email addresses and other data from the Shopify Commerce University website (all different URLs) and collect them into a Google Spreadsheet.
Requirements:
1) Run through all URLs like the following: https://ecommerce.shopify.com/users/456777, where every number (in this case 456777) is a different user.
2) For every URL, with numbers from 000000 to 666666, extract: name, email address, homepage (website).
3) Collect all the data into a Google Spreadsheet: one row for each URL; the column titles must be Name, Email address, Homepage (website).
4) Lastly, check whether the websites found link to Instagram accounts. If yes, add the username of the Instagram account and its number of followers to the spreadsheet.
Milestones:
- 1st milestone: deliver the first 100 extracted URLs, in order to verify correct understanding and execution of the job.
- 2nd and last milestone: deliver all the scraped data. If everything is OK, payment will be settled immediately.
Skills: Data scraping, Google Spreadsheets, Web scraping
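A rough sketch of the enumeration loop, writing to a CSV that can be imported into (or pushed to) a Google Spreadsheet. The profile-page selectors are placeholders, the Instagram follower check would be a second pass over the collected homepages, and iterating 000000-666666 means roughly 667,000 requests, so rate limiting and the ability to resume matter in practice.

    import csv
    import re
    import time

    import requests
    from bs4 import BeautifulSoup

    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    def scrape_profile(user_id):
        """Fetch one profile URL and return [url, name, email, homepage], or None if missing."""
        url = "https://ecommerce.shopify.com/users/{:06d}".format(user_id)
        resp = requests.get(url, timeout=15)
        if resp.status_code != 200:
            return None                       # no such user
        soup = BeautifulSoup(resp.text, "html.parser")
        # Placeholder selectors; the real ones depend on the profile page markup.
        name = soup.select_one("h1.profile-name")
        homepage = soup.select_one("a.profile-website")
        email_match = EMAIL_RE.search(soup.get_text(" "))
        return [
            url,
            name.get_text(strip=True) if name else "",
            email_match.group(0) if email_match else "",
            homepage["href"] if homepage else "",
        ]

    with open("shopify_users.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["URL", "Name", "Email address", "Homepage (website)"])
        for user_id in range(0, 666667):
            row = scrape_profile(user_id)
            if row:
                writer.writerow(row)
            time.sleep(0.5)                   # polite rate limiting (assumption)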
Fixed-Price - Intermediate ($$) - Est. Budget: $100 - Posted
I am looking to scrape contact information for sneaker boutique stores in the United States. Please see the attached Excel sheet to see how I'd like the data organized. To be eligible for this job, please download the attached Excel file, add 5 entries to the list, and return it to me. Data I need:
- Business Name
- Business Address
- Business Website (URL)
- Business Phone
- Business Email
Skills: Data scraping, Web scraping