Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.

Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed.

 
Old 03-02-2005, 12:49 PM   #1
Dusen
Confirmed User
 
Join Date: Aug 2002
Location: East Coast.
Posts: 2,251
MySQL vs. Plain text Logs - Help I am Ignorant.

I'm logging surfers to plain text files, and these files are running to megabytes per day. I log IP, referrer, and time using PHP's fopen/fwrite/fclose.

Other than the obvious benefits to searching the logs with queries instead of parsing text files, I have a few questions for you SQL gurus.

How would the same info logged to a MySQL database compare in size?

How would the speed compare?

What problems could I run into that I'm not considering?

Basically, is it worth my while overall to convert the logging to SQL? I'm totally ignorant about SQL databases when it comes to high-speed inserts/queries. All I have ever used them for is applications that aren't intensive in size or speed.

Thanks Guys
__________________
Penis Pill Sponsor List and Reviews:
http://www.pillsponsors.com
Old 03-02-2005, 01:10 PM   #2
colpanic
Confirmed User
 
Join Date: May 2004
Location: San Diego, CA
Posts: 1,007
What kind of logs, just http logs?

MySQL is good if you want to run any sort of reports off of things in the long term. You can do it with plain text files, but it gets slow... you can't really do good dynamic reports off of them.

Obviously, SQL has more overhead than writing to a plain text file.
__________________
I like ducks.
Old 03-02-2005, 01:12 PM   #3
VideoJ
Confirmed User
 
Join Date: Aug 2002
Location: S. Florida
Posts: 750
The data itself will be around the same size, assuming you have the same fields. The indexes, which are what make searches in a database much quicker, will add space to your database, making it larger than the text files. How much larger will depend on what columns you index.

The speed will depend on the searches and the indexes you set up. The worst case is that the searches take roughly the same amount of time. The best case is where you have an index on a column and you search on that column; then it should be much quicker.
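To make the index trade-off concrete, here's a hypothetical MySQL schema for this kind of hit log (table name, column names, and sizes are invented for illustration; MyISAM was the typical engine for this at the time):

```sql
-- Hypothetical hit-log table; every name here is illustrative.
CREATE TABLE hits (
    hit_time INT UNSIGNED NOT NULL,         -- unix timestamp of the hit
    ip       VARCHAR(15)  NOT NULL,         -- dotted-quad IP
    referrer VARCHAR(255) NOT NULL DEFAULT '',
    INDEX idx_hit_time (hit_time),          -- speeds up date-range reports
    INDEX idx_ip (ip)                       -- speeds up per-IP lookups
) ENGINE=MyISAM;
```

Each INDEX line buys faster lookups on that column at the cost of extra disk space and extra work on every insert, which is exactly the size/speed trade-off described above.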
__________________
Somebody stole my damn signature...
Old 03-02-2005, 01:24 PM   #4
Dusen
Confirmed User
 
Join Date: Aug 2002
Location: East Coast.
Posts: 2,251
Thanks for the replies guys, I'll add a few things I thought of after reading your replies:

My main concern is runtime execution - not report speed. The report generation speed is really only a bonus from my perspective - currently I just parse the daily text files (I have a cron job that renames them every night at midnight) and then use PHP's explode to pull specific data from them. But I don't mind waiting extra time to see a report if the surfers/stats are getting through faster.

As far as runtime - I'm currently running three PHP fopen/fclose pairs, with fwrites in between writing different stats to each file. And yes, it's basically just httpd log stuff with a few extras I have. The benefit I see there is only one mysql_connect statement with multiple table writes. When the surfer visits I would be doing NO SELECT statements, only writes.

In fact, I just realized that I actually am already opening the mysql connection for an unrelated thing each time the surfer visits anyway - so this may certainly speed things up.

But another thing I was thinking is that perhaps MySQL could handle several simultaneous writes better than writing to a file? Does MySQL spool requests? I don't think PHP file writes do. That seems sloppy to me.

Is there a limit to a MySQL database's size? I can easily see it reaching gigabytes down the road.

Thanks for entertaining these questions.
Old 03-02-2005, 01:32 PM   #5
NetRodent
Confirmed User
 
Join Date: Jan 2002
Location: In the walls of your house.
Posts: 3,985
Why are you using php to write the logs instead of using the raw webserver logs? If you're looking for write efficiency, not much will beat direct logging from the web server.

Direct logging to a database is probably not a good idea.
__________________
"Every normal man must be tempted, at times, to spit on his hands, hoist the black flag, and begin slitting throats."
--H.L. Mencken
Old 03-02-2005, 02:28 PM   #6
Dusen
Confirmed User
 
Join Date: Aug 2002
Location: East Coast.
Posts: 2,251
It's not just webserver logs - I have other things that I'd like to be able to tie to surfers - tracking variables, etc. Otherwise I'd just use my Apache logs for sure!
Old 03-02-2005, 03:05 PM   #7
borked
Totally Borked
 
Join Date: Feb 2005
Posts: 6,284
Log anything and everything you want directly to an appended flat text file, then when you want to process the logs for anything, let PHP parse them and log whatever you need into a MySQL database - never log access-type logs straight to MySQL. You'll regret it once your site starts getting decent traffic.
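A minimal sketch of that pipeline (file paths and field names are made up for illustration): the request handler appends one tab-separated line per hit, and a cron script later parses the rotated file into rows ready for a bulk INSERT.

```php
<?php
// Request time: append one tab-separated line per hit.
// No database work in the hot path -- just a locked append.
function log_hit($logfile, $ip, $referrer)
{
    $line = implode("\t", array(time(), $ip, $referrer)) . "\n";
    file_put_contents($logfile, $line, FILE_APPEND | LOCK_EX);
}

// Cron time: parse the rotated log into rows for one bulk INSERT into MySQL.
function parse_log($logfile)
{
    $rows = array();
    foreach (file($logfile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $rows[] = explode("\t", $line);   // [time, ip, referrer]
    }
    return $rows;
}
```

The point of the split is that the surfer only ever pays for one file append; all the expensive database work happens later, off the request path.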
__________________

For coding work - hit me up on andy // borkedcoder // com
(consider figuring out the email as test #1)



All models are wrong, but some are useful. George E.P. Box. p202
Old 03-02-2005, 03:32 PM   #8
MickeyG
Confirmed User
 
Join Date: May 2004
Location: South Florida
Posts: 4,134
Better make sure to read up on file locking, too.
Old 03-02-2005, 03:45 PM   #9
borked
Totally Borked
 
Join Date: Feb 2005
Posts: 6,284
$fp = @fopen($logfile, 'a');   // open the log in append mode
if ($fp)
{
    @flock ($fp, LOCK_EX);   // exclusive lock so simultaneous requests don't interleave
    @fwrite($fp, $output);
    @flock ($fp, LOCK_UN);
    @fclose($fp);
}

But that's for wimpy sites - if you've got serious traffic, the logs need caching to RAM and writing out every second or so by a separate process. No locking needed and far superior performance.
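One hedged way to set that up (paths, sizes, and the flush interval are all illustrative, not a tested recipe): mount a tmpfs so the per-hit appends land in RAM, and let a cron entry drain the buffer to the real on-disk log.

```shell
# Mount a small tmpfs so per-hit appends go to RAM instead of disk:
mount -t tmpfs -o size=64m tmpfs /var/cache/hitlogs

# Separate flusher (crontab entry): rotate the RAM file, then append it to
# the on-disk log. Rotating first (mv, then append) avoids losing writes
# that arrive while the flush is running.
# * * * * * mv /var/cache/hitlogs/hits.log /var/cache/hitlogs/hits.tmp && \
#           cat /var/cache/hitlogs/hits.tmp >> /var/log/hits.log && \
#           rm /var/cache/hitlogs/hits.tmp
```

The request path never touches the disk or takes a contended lock; only the flusher does sequential disk writes.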
Old 03-02-2005, 04:05 PM   #10
SicChild
Confirmed User
 
Join Date: Mar 2003
Posts: 365
If you know what you are doing, SQL would be a MUCH better solution.

You can archive past data and summarize the info so the file size is much smaller.
Old 03-02-2005, 04:14 PM   #11
borked
Totally Borked
 
Join Date: Feb 2005
Posts: 6,284
Quote:
Originally Posted by SicChild
If you know what you are doing, SQL would be a MUCH better solution.

You can archive past data and summarize the info so the file size is much smaller.
You clearly DON'T know what you're doing to offer that advice.
Old 03-02-2005, 04:15 PM   #12
Babaganoosh
♥♥♥ Likes Hugs ♥♥♥
 
Join Date: Nov 2001
Location: /home
Posts: 15,841
Quote:
Originally Posted by borked
Log anything and everything you want directly to an appended flat text file, then when you want to process the logs for anything, let PHP parse them and log whatever you need into a MySQL database - never log access-type logs straight to MySQL. You'll regret it once your site starts getting decent traffic.
This is exactly right. Maybe set up a script on a cron job that will import all of the data into mysql and then use mysql to search/generate stats. I definitely wouldn't try to insert data into mysql live. That would most likely cause you some serious problems.
__________________
I like pie.
Old 03-02-2005, 04:42 PM   #13
MickeyG
Confirmed User
 
Join Date: May 2004
Location: South Florida
Posts: 4,134
Quote:
Originally Posted by borked
if ($fp)
{
@flock ($fp, LOCK_EX);
@fwrite($fp,$output);
@flock ($fp, LOCK_UN);
@fclose($fp);
}

But that's for wimpy sites - if you've got serious traffic, the logs need caching to RAM and writing out every second or so by a separate process. No locking needed and far superior performance.

I love ramdisks, especially for my mail queues
Old 03-02-2005, 04:55 PM   #14
just a punk
So fuckin' bored
 
Join Date: Jun 2003
Posts: 32,384
Text logs are much faster than MySQL.
__________________
Obey the Cowgod
Old 03-02-2005, 05:03 PM   #15
Babaganoosh
♥♥♥ Likes Hugs ♥♥♥
 
Join Date: Nov 2001
Location: /home
Posts: 15,841
Quote:
Originally Posted by cyberxxx
Text logs are much faster than MySQL.
Not always true. Generate detailed stats from a 300 MB text file and then do the same with a well-indexed MySQL db. MySQL will win every time.
Old 03-02-2005, 05:41 PM   #16
Dusen
Confirmed User
 
Join Date: Aug 2002
Location: East Coast.
Posts: 2,251
That ramdisk/SQL cron idea is fantastic. Fantabulous. I wish I had thought of it myself.
Old 03-02-2005, 05:52 PM   #17
jwerd
Confirmed User
 
Join Date: Jun 2003
Location: Costa Rica
Posts: 1,953
I tend to use SQLite a bit more than MySQL when the choice is basically between a flat file and a database. SQLite has far less overhead (within reason) and allows a little better control if you're parsing a bit of data. Of course, as most people have already said, a flat file is most definitely the way to go... but before you even consider MySQL, consider SQLite first. Just my two cents.
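For anyone who hasn't used it, a minimal SQLite sketch via PHP's PDO SQLite driver (file name and schema are invented for illustration). The whole database lives in one local file - no server process, no network round trip - but you still get real SQL for reporting:

```php
<?php
// SQLite keeps the entire database in a single local file.
$db = new PDO('sqlite:' . sys_get_temp_dir() . '/hits.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS hits (hit_time INTEGER, ip TEXT, referrer TEXT)');

// Insert a hit with a prepared statement.
$stmt = $db->prepare('INSERT INTO hits (hit_time, ip, referrer) VALUES (?, ?, ?)');
$stmt->execute(array(time(), '1.2.3.4', 'http://example.com/'));

// Reporting uses the same SQL you would write against MySQL.
$count = $db->query('SELECT COUNT(*) FROM hits')->fetchColumn();
```

The trade-off versus MySQL is concurrency: SQLite locks the whole file on write, so it suits the batch-import reporting role better than live per-hit inserts.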
__________________
Yii Framework Guru - Seasoned PHP vet - Partner @ XXXCoupon.com
Old 03-02-2005, 06:13 PM   #18
NetRodent
Confirmed User
 
Join Date: Jan 2002
Location: In the walls of your house.
Posts: 3,985
Quote:
Originally Posted by Armed & Hammered
Not always true. Generate detailed stats from a 300 MB text file and then do the same with a well-indexed mysql db. Mysql will win every time.
If you only need a small portion of the data in the 300 MB, a well-indexed db is better. If you need to process all 300 MB, plain text is probably faster.
Old 03-02-2005, 06:44 PM   #19
skillfull
Confirmed User
 
Join Date: Apr 2003
Location: Quebec Calisse
Posts: 4,716
Quote:
Originally Posted by NetRodent
If you only need a small portion of the data in the 300 MB, a well-indexed db is better. If you need to process all 300 MB, plain text is probably faster.
what he said
__________________
mind at underdark dot cc
SEO Analyst
Thunder-Ball.net - Member
©2000-, AI Media Network Inc



Powered by vBulletin
Copyright © 2000- Jelsoft Enterprises Limited.