07-14-2001, 12:22 PM  
Wilber
Confirmed User
 
Join Date: May 2001
Location: De,Oh,Lei
Posts: 1,295
If I had a list of URLs I had to visit and I didn't want to get
auto'paged or auto'marked, I'd download each page instead
of going to the URL in a browser. Then just use BKReplace or another
text searcher and look for "JScript.Encode", "*.js", or
"com.ms.activeX.ActiveXComponent".

You'd be able to combine a harvester and a dir-search script
so all you'd actually have to do is plug in the URLs... or a
Perl or PHP script could create a flatfile of URLs for the harvester
to work from, so you wouldn't even have to plug them in yourself.
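
And the batch side could be as simple as looping over whatever flatfile the harvester spits out. Again just a sketch; "urls.txt" and "checker.pl" are made-up names:

[code]
#!/usr/bin/perl
# checker.pl - sketch of the batch run: the harvester (or a little
# Perl/PHP script) dumps one URL per line into a flatfile and this
# loop checks them all. "urls.txt" is a made-up name for that flatfile.
use strict;
use LWP::Simple;

my @suspect = ('JScript.Encode', '.js', 'com.ms.activeX.ActiveXComponent');

open(my $list, '<', 'urls.txt') or die "can't open urls.txt: $!\n";
while (my $url = <$list>) {
    chomp $url;
    next unless $url;
    my $page = get($url);
    if (!defined $page) {
        print "$url : fetch failed\n";
        next;
    }
    foreach my $string (@suspect) {
        print qq($url : found "$string"\n) if index(lc($page), lc($string)) >= 0;
    }
}
close($list);
[/code]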

It would all be automatic and all you would have to do as a
gallery reviewer is execute the "checker".
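
The reviewer's whole job would then be something like this (checker.pl and suspect.txt being whatever you call the script and its output):

[code]
perl checker.pl > suspect.txt
[/code]

and then just eyeball suspect.txt for hits.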
