Quote:
Originally Posted by Nautilus
Is it possible to estimate its maximum productivity on a good modern server, taking into account both server and script optimization (multi-threaded mode or some other tricks)?
I understand that you need to optimize where you search and which finds you're going to compare, but still - we have a database of about one million pictures to protect, and comparing against this db at 1K pics/hour is not going to work. Even 100K/hour is not going to work.
brandonstills gave me an idea in another thread...
and yes, I've sped things up by 1000x
The trick is pre-hashing all your images (which is kinda slow at ~0.1/sec, but new images can easily be added to the db; only the initial compile is going to be slow) and then comparing 1 image against this hash db (see the rough sketch below the timing quote):
Quote:
time taken to compare 55 pre-hashed images: 0.120398044586 seconds (~1644544 images/hr)
And that is on my crappy PowerEdge 1850.
About as good as it's going to get.
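For the curious, here's a minimal sketch of what the pre-hash-then-compare approach can look like in PHP, assuming a simple average-hash (aHash) style perceptual hash built with GD. The function names and the 64-char bit-string format are my own illustration, not the actual code from the thread:

Code:
<?php
// Compute a 64-bit average hash for one image: the slow, one-time step.
// Pre-compute this for every image in the db and store the bit strings.
function ahash(string $path): string {
    $src = imagecreatefromstring(file_get_contents($path));
    $img = imagescale($src, 8, 8);  // shrink to an 8x8 grid
    $pixels = [];
    for ($y = 0; $y < 8; $y++) {
        for ($x = 0; $x < 8; $x++) {
            $rgb = imagecolorat($img, $x, $y);
            // cheap luminance: average the R, G and B channels
            $pixels[] = ((($rgb >> 16) & 0xFF) + (($rgb >> 8) & 0xFF) + ($rgb & 0xFF)) / 3;
        }
    }
    $avg = array_sum($pixels) / 64;
    $bits = '';
    foreach ($pixels as $p) {       // one bit per pixel: above/below the average
        $bits .= ($p > $avg) ? '1' : '0';
    }
    return $bits;
}

// Hamming distance between two pre-computed hashes: the fast step.
// A small distance (say, under 10) suggests the images are near-duplicates.
function hash_distance(string $a, string $b): int {
    $d = 0;
    for ($i = 0, $n = strlen($a); $i < $n; $i++) {
        if ($a[$i] !== $b[$i]) $d++;
    }
    return $d;
}

Matching a new image is then one ahash() call plus a cheap hash_distance() against every stored hash, which is why the compare step flies even on old hardware.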
--edit
if someone wants to verify that calculation of images/hr, it's based on:
intval( 3600 / ( (1 / 55) * $time ) )
cos my brain is really fried!
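For anyone checking: (1 / 55) * $time is the average time per comparison, so 3600 divided by that is comparisons per hour. Plugging in the quoted numbers:

Code:
<?php
// Sanity check of the images/hr formula using the timing quoted above.
$time = 0.120398044586;            // seconds to compare 55 pre-hashed images
$per_image = (1 / 55) * $time;     // ~0.00219 s per comparison
echo intval(3600 / $per_image);    // prints 1644544, matching the quoted figure

The math checks out: (3600 * 55) / 0.120398044586 is about 1,644,544.98, which intval() truncates to 1644544.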