As stated in the original post -- I'm specifically focused on IMAGE SCRAPER BOTS. I'm not at all worried about the average user who wants to save individual images to his hard drive.
Of course I know there's no bullet-proof method to prevent someone saving images. That's not the point. (!)
I just want to prevent robo site-ripping. Or at least make it such a pain in the ass that the ripper will move on to some other site instead.
I'm looking into throttling page views -- allowing each visitor only 30 per minute. The average surfer would never hit that threshold, but it would frustrate a lot of scrapers.
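Something like this is what I have in mind (a rough Node/Express-style sketch -- the in-memory per-IP counter, the 30/minute number, and the middleware hookup are all placeholders; in practice this could just as well live in the web server itself with something like nginx's limit_req or Apache's mod_evasive):

```
// Rough sketch: per-IP page-view throttle as Express-style middleware.
// Everything here (the Map, the 30/min cap, the 429 response) is illustrative --
// a real deployment would probably use the web server or a shared store
// so the counters survive restarts and multiple app processes.
const hits = new Map();                      // ip -> page views in current window

setInterval(() => hits.clear(), 60 * 1000);  // reset the window every minute

function throttle(req, res, next) {
  const count = (hits.get(req.ip) || 0) + 1;
  hits.set(req.ip, count);
  if (count > 30) {
    return res.status(429).send('Slow down.');   // or serve a CAPTCHA page here
  }
  next();
}

// app.use(throttle);   // apply it to the gallery pages that matter
```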
Also... serving the largest-sized images with JavaScript instead of the usual <img src=""> would discourage most scrapers.
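Roughly along these lines (just a sketch -- the /api/image-url endpoint, the #photo container, and its data-id attribute are made-up names; a headless-browser scraper would still get the image, but the full-size URL never sits in the raw HTML for a dumb ripper to grep out):

```
// Rough sketch: fetch the full-size image URL with JavaScript after the page
// loads, so it never appears in the static HTML at all. The endpoint and
// element names are placeholders for whatever the backend actually exposes.
document.addEventListener('DOMContentLoaded', async () => {
  const box = document.getElementById('photo');           // <div id="photo" data-id="1234">
  const resp = await fetch('/api/image-url?id=' + encodeURIComponent(box.dataset.id));
  const { url } = await resp.json();

  const img = document.createElement('img');
  img.src = url;                                           // big image injected client-side
  box.appendChild(img);
});
```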
I'd also love to build a killer bot trap... I'm reading up on http://projecthoneypot.org right now.
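As I understand it, the classic DIY version of this is a URL that robots.txt disallows and that no human ever sees (a CSS-hidden link in the page template), so anything that requests it is by definition a misbehaving bot. A rough sketch, again Express-style with made-up paths and a bare-bones ban list:

```
// Rough sketch of a DIY bot trap, assuming the same Express-style `app` as
// in the throttling sketch. The "/trap" path and the in-memory ban set are
// placeholders -- real setups would persist bans and expire them eventually.
const banned = new Set();

// Block anything we've already flagged.
app.use((req, res, next) => {
  if (banned.has(req.ip)) return res.status(403).send('Forbidden');
  next();
});

// The trap itself: polite crawlers never request this because robots.txt
// disallows it, and humans never see the hidden link that points here.
app.get('/trap', (req, res) => {
  banned.add(req.ip);
  console.log('Bot trapped:', req.ip, req.get('User-Agent'));
  res.status(403).send('Forbidden');
});
```

Paired with a `Disallow: /trap` line in robots.txt and an invisible link to /trap somewhere in the templates, the only clients that ever land on it are the ones ignoring the rules.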