1. Scraper ability to read a custom input CSV where each row equals an individual query (example: A1 would have "music venues near los angeles" and A2 would have "mechanics in knoxville"). Note that the scraper must maintain the Boolean logic of "in", "near", "los+angeles", etc.
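A minimal sketch of the input side, assuming the queries live in the first column of the CSV (file name and layout are my assumptions, not specified above):

```python
import csv

def load_queries(path):
    """Read one query per row from the first column of the input CSV,
    e.g. A1 = 'music venues near los angeles', A2 = 'mechanics in knoxville'."""
    queries = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row and row[0].strip():  # skip blank rows
                queries.append(row[0].strip())
    return queries
```

Each returned string is passed to the scraper exactly as written, so the "in"/"near" wording is preserved.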
2. Scraper uses the input CSV list to query Google Maps. Logic would need to be in place to page through the result pages that Google Maps offers.
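One way to sketch the query step: build the search URL with `quote_plus`, which keeps the "in"/"near" phrasing intact and encodes spaces as "+" (so "los angeles" becomes "los+angeles"). The URL pattern below is an assumption based on how Maps search links commonly look, and actually walking the result pages would likely need a browser-automation tool (Selenium/Playwright), since Maps loads results dynamically rather than via numbered pages:

```python
from urllib.parse import quote_plus

def maps_search_url(query):
    """Turn a raw query string into a Google Maps search URL.
    quote_plus preserves the query's wording and plus-encodes spaces."""
    return "https://www.google.com/maps/search/" + quote_plus(query)
```

For example, `maps_search_url("music venues near los angeles")` yields a URL ending in `music+venues+near+los+angeles`.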
3. For each row in the input query CSV, it will output a new CSV of result data in a directory of my choosing (so when it moves to the query in row 2, it will create a brand-new output CSV titled with the name of that query). Perhaps add logic to time out after 5 minutes on each query row, then move on to the next query row in the input CSV.
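The file-naming and 5-minute budget could look something like this (the `scrape_page` callable is hypothetical; it stands in for whatever fetches one page of results and reports whether there are more):

```python
import re
import time

def output_csv_name(query):
    """Turn a query into a safe output file name, e.g.
    'mechanics in knoxville' -> 'mechanics_in_knoxville.csv'."""
    safe = re.sub(r"[^A-Za-z0-9]+", "_", query).strip("_")
    return safe + ".csv"

def collect_with_deadline(scrape_page, deadline_seconds=300):
    """Keep pulling result pages until the scraper says it is done
    or the 5-minute budget for this query row runs out, then return
    whatever was collected so the run can move to the next row."""
    start = time.monotonic()
    collected = []
    done = False
    while not done and time.monotonic() - start < deadline_seconds:
        rows, done = scrape_page()  # hypothetical: returns (rows, done_flag)
        collected.extend(rows)
    return collected
```

The timeout is a soft deadline checked between pages; a hard kill mid-request would need threading or the automation tool's own timeout settings.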
4. In each output CSV, there will be the columns outlined below, populated with the appropriate data when it was available during the query:
A. Result Number (basically, if it was the first, second, third result, etc)
B. Business Name
C. Email Address
D. Phone Number
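The output schema above could be written with `csv.DictWriter`, leaving a field blank when it wasn't available during the query (the dict keys for each scraped business are my assumed shape):

```python
import csv

OUTPUT_COLUMNS = ["Result Number", "Business Name", "Email Address", "Phone Number"]

def write_results(path, results):
    """Write one output CSV for a query. `results` is a list of dicts
    in result order; missing fields are written as empty cells."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=OUTPUT_COLUMNS, restval="")
        writer.writeheader()
        for i, biz in enumerate(results, start=1):
            row = {"Result Number": i}  # position in the results (1st, 2nd, ...)
            row.update({k: v for k, v in biz.items() if k in OUTPUT_COLUMNS})
            writer.writerow(row)
```

`restval=""` is what keeps an absent email or phone number as an empty cell instead of raising an error.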
Please let me know if you have any questions!