
GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   PHP Question (https://gfy.com/showthread.php?t=457288)

nmcog 04-18-2005 03:05 PM

PHP Question
 
I've written a sponsor click-tracking script with a MySQL database. It works so far, but is this efficient code (least strain on the server, etc.)?

It will be called about 10,000 times a day.

go.php
Code:

<?php

$username = 'username';
$password = 'password';
$database = 'database';
$hostname = 'localhost';

// connect and select the database
mysql_connect($hostname, $username, $password);
mysql_select_db($database);

// escape the user-supplied value before using it in the queries
$domain = mysql_real_escape_string($_GET['site']);

// count the click
mysql_query("UPDATE Sponsors SET Clicks=Clicks+1 WHERE Domain='$domain'");

// look up the sponsor URL for the redirect
$result = mysql_query("SELECT URL FROM Sponsors WHERE Domain='$domain'") or die ("Invalid entry!");
$output = mysql_fetch_row($result);
$url = ($output[0]);

mysql_close();

// send the visitor to the sponsor and stop
header("Location: $url");
exit;

?>


ezey 04-18-2005 03:10 PM

If you just want to count clicks to sponsors I would use plain txt files instead of SQL :winkwink:

Fuckin Bill 04-18-2005 03:11 PM

Or use any of the countless scripts already out there that do this.

nmcog 04-18-2005 03:29 PM

Thanks for answering my question....not..

SilverTab 04-18-2005 03:32 PM

Not bad...the server can handle it......best way to do it is to store everything in a txt file and run a crontab every 3-4 minutes that takes it, updates the DB, and cleans the txt file....so you reduce the # of queries/day drastically
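
Roughly like this -- just a sketch, the file/script names are only an example:

Code:

<?php
// cron_flush.php -- rough sketch only.
// In go.php you'd append one line per click instead of running the UPDATE:
//   file_put_contents('clicks.log', $domain."\n", FILE_APPEND);
// Cron then runs this every few minutes to roll the log into the DB.

mysql_connect('localhost', 'username', 'password');
mysql_select_db('database');

$log = 'clicks.log';
if (!file_exists($log)) exit;

// rename first so new clicks keep logging while we work on the old file
rename($log, $log.'.work');

// tally clicks per domain
$counts = array();
foreach (file($log.'.work') as $line) {
    $d = mysql_real_escape_string(trim($line));
    if ($d == '') continue;
    $counts[$d] = isset($counts[$d]) ? $counts[$d] + 1 : 1;
}

// one UPDATE per domain instead of one per click
foreach ($counts as $d => $n) {
    mysql_query("UPDATE Sponsors SET Clicks=Clicks+".(int)$n." WHERE Domain='$d'");
}

unlink($log.'.work');
mysql_close();

?>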

nmcog 04-18-2005 03:39 PM

Quote:

Originally Posted by SilverTab
Not bad...the server can handle it......best way to do it is to store everything in a txt file and run a crontab every 3-4 minutes that takes it, updates the DB, and cleans the txt file....so you reduce the # of queries/day drastically

I'd do that if I knew how :)
I ripped out code from TGP Rotator and edited it...

This part of the code I think is slightly wrong and could be done another way?

Code:

$result = mysql_query("SELECT URL FROM Sponsors WHERE Domain='$domain'") or die ("Invalid entry!");
$output = mysql_fetch_row($result);
$url = ($output[0]);


SilverTab 04-18-2005 03:41 PM

Quote:

Originally Posted by nmcog
I'd do that if I knew how :)
I ripped out code from TGP Rotator and edited it...

This part of the code I think is slightly wrong and could be done another way?

Code:

$result = mysql_query("SELECT URL FROM Sponsors WHERE Domain='$domain'") or die ("Invalid entry!");
$output = mysql_fetch_row($result);
$url = ($output[0]);



Well, the code itself is good! But it depends...not sure what you want to do ;)

pstation 04-18-2005 03:43 PM

As long as you're using an index on the Domain field, all should be good
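
For example (using the table/column from your script):

Code:

ALTER TABLE Sponsors ADD INDEX (Domain);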

pstation 04-18-2005 03:45 PM

Quote:

Originally Posted by nmcog
I'd do that if I knew how :)
I ripped out code from TGP Rotator and edited it...

This part of the code I think is slightly wrong and could be done another way?

Code:

$result = mysql_query("SELECT URL FROM Sponsors WHERE Domain='$domain'") or die ("Invalid entry!");
$output = mysql_fetch_row($result);
$url = ($output[0]);


you might benefit from using
PHP Code:

$url = mysql_result($result, 0);


Alky 04-18-2005 03:59 PM

Yeah, I was gonna say use mysql_result.

nmcog 04-18-2005 04:10 PM

Thanks peeps... :thumbsup

woj 04-18-2005 05:00 PM

At 10k hits per day, you shouldn't have any problems with that script...

zagi 04-19-2005 12:24 AM

change mysql_connect to mysql_pconnect -- this will enable persistent connections and allow you to bypass the setup/teardown for mysql connections increasing the speed of your script significantly.
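
i.e. just swap the connect line:

Code:

mysql_pconnect($hostname, $username, $password);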

Lane 04-19-2005 12:38 AM

Quote:

Originally Posted by SilverTab
Not bad...the server can handle it......best way to do it is to store everything in a txt file and run a crontab every 3-4 minutes that takes it, updates the DB, and cleans the txt file....so you reduce the # of queries/day drastically

that's certainly not the 'best way'

Lane 04-19-2005 12:40 AM

Quote:

Originally Posted by zagi
change mysql_connect to mysql_pconnect -- this will enable persistent connections and allow you to bypass the setup/teardown for mysql connections increasing the speed of your script significantly.

You can easily use up all available connections if that's not set up properly.
And the speed increase is not significant at all; the bottleneck is the write operations for each hit.

You can still do a million hits a day with a beefy server though, so why fix it if it ain't broken. If you want real optimization for high load, you're gonna need a few more lines of code than that.

CamelNose 04-19-2005 05:07 PM

10,000 transactions per day should not be a problem for MySQL.

1) Make sure you put an index on the key search field. This will increase performance more than anything else.

2) Performance can be enhanced by tuning server parameters (see the MySQL docs).

3) The new release of MySQL 5.0 has a performance feature called "Stored Procedures". This lets you pre-compile your MySQL query so MySQL does not have to compile the same query over and over again.
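
For example, something along these lines could wrap the two queries from go.php (the procedure name is only an example; run it once from the MySQL 5.0 client):

Code:

DELIMITER //
CREATE PROCEDURE track_click(IN p_domain VARCHAR(255))
BEGIN
  -- bump the counter and return the URL in one call
  UPDATE Sponsors SET Clicks = Clicks + 1 WHERE Domain = p_domain;
  SELECT URL FROM Sponsors WHERE Domain = p_domain;
END //
DELIMITER ;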

As this is my first posting on GFY, please forgive any glaring errors.

Kind regards,
Bill

Repetitive Monkey 04-19-2005 05:26 PM

With the suggestions already added regarding persistent connections and the setting of an index on the "domain" column, I don't have anything to add except a note on the glaring lack of error handling in your script. It shouldn't be a problem for the diligent webmaster, but there is something about the missing safety procedures that irks me anyway.

I always use
if(@mysql_query($query)){ // or whatever call might fail
// continue
}else{
// handle as appropriate, whether by continued operation under alternative means or by throwing a custom error
}

naitirps 04-19-2005 05:29 PM

on one of my bigger boxes, I do about 556.23 sql queries per minute right now... no problems.

pstation 04-19-2005 05:30 PM

Quote:

Originally Posted by Repetitive Monkey
With the suggestions already added regarding persistent connections and the setting of an index on the "domain" column, I don't have anything to add except a note on the glaring lack of error handling in your script. It shouldn't be a problem for the diligent webmaster, but there is something about the missing safety procedures that irks me anyway.

I always use
if(@mysql_query($query)){ // or whatever call might fail
// continue
}else{
// handle as appropriate, whether by continued operation under alternative means or by throwing a custom error
}

It's probably better to use the set_error_handler function, because otherwise your code usually becomes cluttered with the error handling.
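
Something along these lines -- the handler name and fallback URL are only examples:

Code:

function click_error_handler($errno, $errstr, $errfile, $errline)
{
    // log the problem where the surfer can't see it
    error_log("go.php error [$errno] $errstr in $errfile on line $errline");
    // and send them somewhere sane instead of a broken redirect
    header("Location: http://www.example.com/");
    exit;
}
set_error_handler('click_error_handler');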

SilverTab 04-19-2005 05:33 PM

Quote:

Originally Posted by Lane
that's certainly not the 'best way'


ok...maybe not the "best" way....

but still works pretty well and you can process a lot of info with very few queries...and you don't lose any info at all if you do it right....

Repetitive Monkey 04-19-2005 05:39 PM

Quote:

Originally Posted by pstation
it's probably better to use the set_error_handler function because otherwise your code usually becomes cluttered with the error handling

Personal tastes. I like me a sexy script filled with tab-formatted perfection.

