Quote:
Originally Posted by borked
well, to be fair, this is running on a dev server that is about 5 years old. Shit, it takes me about 15 minutes to compile ffmpeg whereas on a new quad core, it's compiled in ~15 secs.
Plus, there could be many instances of the app running - no need for a single fork to be doing all the work...
And of course, you would only search for comparisons after searching for pages that flag your keywords... you wouldn't have to search ALL photos on ALL sites
Is it possible to estimate its maximum throughput on a good modern server, taking into account both server and script optimization (multi-threaded mode or other tricks)?
I understand that you need to narrow down where you search and which matches you actually compare, but still: we have a database of about one million pictures to protect, and comparing against this DB at 1K pics/hour is not going to work. Even 100K/hour is not going to work.
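For what it's worth, here is a rough sketch of the kind of multi-process trick I mean. It assumes the expensive per-image work (hashing) is done once up front, so the hot loop is just comparing precomputed 64-bit perceptual hashes by Hamming distance, split across worker processes. The function names, the 8-bit threshold, and the hash scheme are all my assumptions for illustration, not anyone's actual script:

```python
# Hypothetical sketch: parallel matching over precomputed 64-bit
# perceptual hashes. All names/thresholds here are assumptions.
import random
from multiprocessing import Pool

THRESHOLD = 8  # max differing bits to call two hashes "similar" (assumed)

def hamming(a, b):
    """Number of differing bits between two integer hashes."""
    return bin(a ^ b).count("1")

def scan_chunk(args):
    """Return DB indices in this chunk whose hash is near the probe."""
    probe, chunk = args
    return [i for i, h in chunk if hamming(probe, h) <= THRESHOLD]

def find_matches(probe, hashes, workers=4):
    """Split the hash DB into chunks and scan them in parallel."""
    indexed = list(enumerate(hashes))
    size = max(1, len(indexed) // workers)
    chunks = [indexed[i:i + size] for i in range(0, len(indexed), size)]
    with Pool(workers) as pool:
        results = pool.map(scan_chunk, [(probe, c) for c in chunks])
    return sorted(i for part in results for i in part)

if __name__ == "__main__":
    random.seed(1)
    db = [random.getrandbits(64) for _ in range(100_000)]
    probe = db[1234] ^ 0b101  # plant a near-duplicate: 2 bits flipped
    matches = find_matches(probe, db)
    print(matches)  # should contain index 1234
```

On integer hashes the comparison itself is cheap, so a single modern box scanning a million-entry hash table per probe is a very different workload from re-analyzing a million images; the expensive part only has to happen once per new photo. Whether that gets you past 100K/hour depends on how many probes per hour you feed it, which is exactly the estimate I'm asking about.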