09-21-2011, 06:50 AM
idogfy
Registered User
 
Join Date: Sep 2011
Posts: 12

Quote:
Originally Posted by Babaganoosh View Post
Log clicks to a flat file then execute a cron job every x minutes to import data to mysql. To open a connection, execute a query and then close the connection just to log a click would be a colossal waste of resources. Let's say you have a modest 500k clicks a day, that would be almost 6 connections per second. With a million clicks you'd be looking at just over 11 connections per second.
Well, good point about the DB connection.

Nevertheless, I wanted to add:

If you are on low traffic, it won't hurt you to do the query on every click,

but it can add up quickly, as pointed out by Babaganoosh.

But if it does add up, hitting and writing to a flat file (and locking that file if you don't want it to get corrupted) will probably (meaning: for SURE) hurt your server performance way more than using a DB in the first place.
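Just to illustrate where that lock bites, here is a minimal sketch of the flat-file approach (Python, assuming a single log file shared by all web processes; the path and function name are made up):

Code:
# Every click pays for an open, an exclusive lock, a write and a close.
import fcntl
import time

LOG_FILE = '/tmp/clicks.log'  # assumed path, adjust to taste

def log_click(link_id):
    with open(LOG_FILE, 'a') as f:
        fcntl.flock(f, fcntl.LOCK_EX)   # concurrent requests serialize here
        f.write('%d\t%s\n' % (int(time.time()), link_id))
        fcntl.flock(f, fcntl.LOCK_UN)

Under load, every request queues up on that exclusive lock, which is exactly the contention I'm talking about.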

Neither solution is really scalable.

If you want to host it yourself / do it yourself / don't want to use a 3rd party:
=> you need memcache (or another NoSQL store) and either a cron job to save the memcache counters into the DB, or a 'garbage collector'-like mechanism: e.g. every 100 clicks, save what is in memcache to the DB (see the sketch below)...
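Something like this (a minimal sketch, assuming a local memcached instance and the python-memcached client; flush_to_db() is a hypothetical stand-in for your real INSERT logic):

Code:
import memcache

FLUSH_EVERY = 100  # write to the DB only once every 100 clicks

mc = memcache.Client(['127.0.0.1:11211'])

def record_click(link_id):
    key = 'clicks:%s' % link_id
    mc.add(key, 0)           # no-op if the counter already exists
    count = mc.incr(key)     # atomic increment in memcache
    if count is not None and count % FLUSH_EVERY == 0:
        flush_to_db(link_id, count)

def flush_to_db(link_id, count):
    # Hypothetical: replace with one INSERT ... ON DUPLICATE KEY UPDATE
    # so the DB sees 1 write per FLUSH_EVERY clicks instead of 100.
    print('link %s hit %d clicks, flushing to DB' % (link_id, count))

That way, the connections-per-second from Babaganoosh's math get divided by 100.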

If you feel adventurous enough to trust an external party with your important data:
=> perhaps even better, use a URL shortener and get the stats from them.

If you feel adventurous, but not too much, and still trust big Google:
=> or, if you know how to do it, use Google Analytics events.
I can't post URLs, so do a Google search and you should find how to do it:
"google analytics How do I manually track clicks on outbound links?"


Hope that helps, and that you don't mind me jumping into the thread to add my two cents when nobody asked.