I have a favorite website, and I want to save all the pages under the domain.com/scripts directory. Saving each page is achievable with software, so what I really need is a list of the URLs. An ordinary user can obtain a few hundred URLs through a Google search such as `site:domain.com inurl:scripts`, but that is not how I want to extract the data: it is inefficient and unstable, and Google will probably challenge you with robot-proof tests like reCAPTCHA. If Google, or any other third party, provides a way to get the list of URLs for a website, that would suit me better. So, is there a way?
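For context, what I have tried so far is parsing the site's sitemap instead of scraping Google. This is only a sketch: it assumes the site publishes a standard sitemap (often at `/sitemap.xml`, or at a location listed in `robots.txt`), and `domain.com` plus the helper `urls_under` are placeholder names. Here an inline XML sample stands in for the downloaded file:

```python
import xml.etree.ElementTree as ET

# A real run would fetch https://domain.com/sitemap.xml first (the path
# is an assumption -- check robots.txt for the actual sitemap location).
# This inline sample stands in for the downloaded XML.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://domain.com/scripts/page1.html</loc></url>
  <url><loc>https://domain.com/scripts/page2.html</loc></url>
  <url><loc>https://domain.com/about.html</loc></url>
</urlset>"""

# Sitemaps use a fixed XML namespace, so queries must be qualified.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_under(sitemap_xml: str, prefix: str) -> list[str]:
    """Return every <loc> URL in the sitemap that starts with `prefix`."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text
            for loc in root.iterfind(".//sm:url/sm:loc", NS)
            if loc.text and loc.text.startswith(prefix)]

print(urls_under(SITEMAP_XML, "https://domain.com/scripts/"))
```

The drawback is that a sitemap lists what the site owner chose to publish, which is not necessarily the same set of pages Google has indexed, so I am still looking for a way to get the indexed list itself.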
How to get a list of all of a site's page URLs indexed by Google?