PHP Question
I've written a sponsor click tracking script with a MySQL database. It works so far, but is this code efficient (least strain on the server, etc.)?
It will be called about 10,000 times a day. go.php Code:
<?
If you just want to count clicks to sponsors, I would use plain text files instead of SQL. :winkwink:
|
Or use any of the countless scripts already out there that do this.
|
Thanks for answering my question....not..
|
Not bad... the server can handle it. The best way to do it is to store everything in a text file and run a cron job every 3-4 minutes that reads it, updates the DB, and cleans the text file. That way you reduce the number of queries per day drastically.
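The text-file batching described above can be sketched roughly like this (a minimal, hypothetical example; the log path, function names, and the idea of one UPDATE per sponsor are my assumptions, not code from the thread):

```php
<?php
// go.php side: append one line per click instead of hitting MySQL.
// An exclusive lock keeps two concurrent hits from interleaving writes.
function log_click($domain) {
    $fp = fopen('/tmp/clicks.log', 'a');
    if ($fp) {
        flock($fp, LOCK_EX);
        fwrite($fp, $domain . "\n");
        flock($fp, LOCK_UN);
        fclose($fp);
    }
}

// Cron side: tally the log, then truncate it.
// Returns an array of domain => click count; the real cron script
// would turn each entry into a single UPDATE against the DB.
function flush_clicks($file = '/tmp/clicks.log') {
    if (!file_exists($file)) {
        return array();
    }
    $counts = array();
    foreach (file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $domain) {
        $counts[$domain] = isset($counts[$domain]) ? $counts[$domain] + 1 : 1;
    }
    file_put_contents($file, ''); // clean the txt file, as suggested
    return $counts;
}
```

With this kind of batching, 10,000 hits collapse into a handful of UPDATE queries every few minutes.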
|
Quote:
I ripped the code out of TGP Rotator and edited it... I think this part of the code is slightly wrong and could be done another way? Code:
$result = mysql_query("SELECT URL FROM Sponsors WHERE Domain='$domain'") or die ("Invalid entry!");
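One real weak spot in that line is interpolating $domain straight into the SQL string. Here is a hedged sketch of guarding it first; mysql_real_escape_string() needs an open connection, so a simple whitelist check is shown instead, and the helper name is my own invention:

```php
<?php
// Hypothetical helper: only accept strings that look like hostnames
// before they go anywhere near the SQL string.
function is_safe_domain($domain) {
    return (bool) preg_match('/^[a-z0-9.-]+$/i', $domain);
}

// In go.php this would guard the query from the thread:
// if (is_safe_domain($domain)) {
//     $result = mysql_query("SELECT URL FROM Sponsors WHERE Domain='$domain'");
// }
```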
Quote:
Well, the code itself is good! But it depends... not sure what you want to do ;)
As long as you're using an index on the Domain field, all should be good
|
Quote:
PHP Code:
|
Yeah, I was going to say: use mysql_result().
|
Thanks peeps... :thumbsup
|
At 10k hits per day, you shouldn't have any problems with that script...
|
Change mysql_connect() to mysql_pconnect(). This will enable persistent connections and let you bypass the setup/teardown for each MySQL connection, increasing the speed of your script significantly.
|
Quote:
And the speed increase is not significant at all; the bottleneck is the write operation for each hit. You can still do a million hits a day with a beefy server, though, so why fix it if it ain't broken? If you want real optimization for high load, you're going to need a few more lines of code than that. |
10,000 transactions per day should not be a problem for MySQL.
1) Make sure you put an index on the key search field. This will improve performance more than anything else.
2) Performance can also be enhanced by tuning server parameters (see the MySQL docs).
3) The new MySQL 5.0 release has a performance feature called stored procedures. This lets you pre-compile your MySQL query so MySQL does not have to compile the same query over and over again.
As this is my first posting on GFY, please forgive any glaring errors. Kind regards, Bill |
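Bill's first and third points can be expressed directly in MySQL (the table and column names are taken from the query earlier in the thread; the index name, procedure name, and column length are my assumptions):

```sql
-- Point 1: index the search field so the SELECT in go.php
-- no longer scans the whole Sponsors table.
ALTER TABLE Sponsors ADD INDEX idx_domain (Domain);

-- Point 3 (MySQL 5.0+): a stored procedure for the lookup.
DELIMITER //
CREATE PROCEDURE get_sponsor_url(IN p_domain VARCHAR(255))
BEGIN
    SELECT URL FROM Sponsors WHERE Domain = p_domain;
END //
DELIMITER ;
```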
With the suggestions already made regarding persistent connections and an index on the "Domain" column, I have nothing to add except a note about the glaring lack of error handling in your script. It shouldn't be a problem for the diligent webmaster, but the missing safety checks irk me anyway.
I always use: if(@...){ // continue } else { // handle as appropriate, whether by continuing under alternative means or by throwing a custom error } |
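That pattern can be made concrete. Here is a minimal, self-contained sketch using a file read so it runs on its own (the function name and fallback value are mine); in go.php the same shape would wrap the mysql_connect() and mysql_query() calls:

```php
<?php
// The @ suppresses PHP's warning on failure; the if() catches the
// failure so the script can fall back instead of dying mid-request.
function read_config($path) {
    if ($data = @file_get_contents($path)) {
        return $data;      // continue with the real value
    } else {
        return 'default';  // handle as appropriate: fall back
    }
}
```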
On one of my bigger boxes, I do about 556.23 SQL queries per minute right now... no problems.
|
Quote:
OK... maybe not the "best" way, but it still works pretty well, and you can process a lot of info with very few queries... and you don't lose any info at all if you do it right. |
All times are GMT -7. The time now is 06:38 PM.
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc