I am looking to have a well-made, fully commented script put together as an example of the latest asyncio socket APIs, along with a basic example of MongoDB usage. Using a list of 20,000+ URLs I can provide, the script should be able to spawn as many concurrent requests as I desire, fetch the result from each URL, and store it in a MongoDB collection. Essentially, a basic HTTP/HTTPS client needs to be created to fetch and handle the requests, as I would like to send raw headers and receive the raw response from the server. The client should handle any errors or timeouts (DNS names that cannot resolve, or requests that simply time out) and retry up to 3 times before marking the URL as bad and updating the database accordingly. I would also like the client to have the option of using a proxy for the requests (which I can also provide if needed). The goal is to be as fast and lightweight as possible, so the lowest-level objects available should be used (Python standard library only).
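A minimal sketch of the fetch-and-retry piece described above, using only stdlib asyncio streams. The function name `fetch_raw`, the default timeout, and the parameter layout are my assumptions, not part of the spec; proxy support and the MongoDB write are omitted here (note that MongoDB access itself is not in the standard library and would need a driver such as pymongo):

```python
import asyncio

async def fetch_raw(host, port, path="/", timeout=5.0, retries=3, use_ssl=False):
    """Send a raw HTTP request over an asyncio socket and return the raw
    response bytes. Retries up to `retries` times on connection errors,
    DNS failures, or timeouts; returns None if every attempt fails, so
    the caller can mark the URL as bad in the database.
    (Illustrative sketch -- names and defaults are assumptions.)"""
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode()
    for attempt in range(retries):
        try:
            reader, writer = await asyncio.wait_for(
                asyncio.open_connection(host, port, ssl=use_ssl), timeout)
            writer.write(request)          # raw headers go out unmodified
            await writer.drain()
            raw = await asyncio.wait_for(reader.read(), timeout)  # read to EOF
            writer.close()
            await writer.wait_closed()
            return raw                      # raw status line + headers + body
        except (OSError, asyncio.TimeoutError):
            # socket.gaierror (unresolvable DNS) subclasses OSError,
            # so DNS failures, refused connections, and timeouts all retry
            continue
    return None
```

For the concurrency cap, each worker could acquire an `asyncio.Semaphore` sized to the desired request count before calling `fetch_raw`, and the caller would upsert the returned bytes (or a "bad" flag on `None`) into the MongoDB collection.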
I will be using the script as a reference while building something a little different, so I could also pay you for your time, if interested, to answer any questions I may have about modifying the logic.