I wasn’t really sure where to post this query, but thought it might lend itself to scripting. I want to cycle through a series of URLs and download/save each of the web pages. The links are CGI links, with the pages generated at the server. I know there are about 70 pages in the series, and their URLs vary only by one number. For example:
http://www.somesite.com/cgi-bin/showlist.p…s&Showpage=*&Listings=30
The * would take on any value from 0 to (about) 70, for the pages of interest.
I know I can type this into a browser (say, with *=34) and get the correct page, but my attempts to cycle through the series with a download manager haven’t worked, for whatever reason. I’d like a script to do the looping and downloading. I only need to save the HTML of each page (graphics and anything else aren’t needed).
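Something along these lines is what I have in mind — a minimal Python sketch using only the standard library. Note that the base URL below is a hypothetical stand-in (the real query string is partly elided above), so the actual cgi-bin path and parameters would need to be filled in, with {} marking where the page number goes:

```python
import urllib.request

# Hypothetical stand-in for the real URL -- the actual query string is
# elided in the post above. Put {} where the page number belongs.
BASE_URL = "http://www.somesite.com/cgi-bin/showlist.pl?Showpage={}&Listings=30"

def page_url(n):
    """Build the URL for page n of the series."""
    return BASE_URL.format(n)

def save_pages(first=0, last=70):
    """Fetch each page in the series and save its HTML to a local file."""
    for n in range(first, last + 1):
        with urllib.request.urlopen(page_url(n)) as resp:
            html = resp.read()
        # Save only the raw HTML -- no graphics or other assets.
        with open(f"page_{n:02d}.html", "wb") as out:
            out.write(html)

if __name__ == "__main__":
    save_pages()
```

A one-line shell loop around wget or curl would do the same job, but a script like this makes it easy to adjust the page range or filenames.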
Any help with this would be appreciated.
Alan