GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Webmaster Q & Fuckin' A (https://gfy.com/forumdisplay.php?f=27)
-   -   Tracking thumbnail clicks (https://gfy.com/showthread.php?t=1036316)

xmonkeys 08-31-2011 03:01 PM

Tracking thumbnail clicks
 
Hello guys...

I was wondering if anyone knows of software that can track where my outgoing clicks are going. When someone clicks on a thumbnail and is redirected from my site to the sponsor's site, how can I find out or track where it's going? I tried GA, but it doesn't seem to have anything for thumbs. If I have something like 300k thumbs, I want to be able to track where the outgoing clicks are being redirected to.

xmonkeys 08-31-2011 07:08 PM

What I mean is tracking your exit clicks from the thumbs.

Babaganoosh 09-01-2011 05:17 AM

Usually a trade script will manage traffic and clicks on thumbs but an ad script handles traffic to sponsors. I wouldn't bother sending traffic from thumbs straight to sponsors though.

ManPuppy 09-01-2011 12:56 PM

I've done this myself in PHP for managing my own banners. Simple code. You can do it for thumbs if you want.

If you're versed in PHP, I can describe how I did it. If not, I can probably custom code it pretty quickly for your site for a reasonable sum. E-mail me if you haven't found a solution. support ~at~ manpuppy.com

CodeR70 09-01-2011 06:26 PM

http://piwik.org/

Goals can be combined with your traffic stats.

xmonkeys 09-02-2011 01:22 PM

Quote:

Originally Posted by Babaganoosh (Post 18393846)
Usually a trade script will manage traffic and clicks on thumbs but an ad script handles traffic to sponsors. I wouldn't bother sending traffic from thumbs straight to sponsors though.

What I mean is, my thumbs are redirected traffic, and I want to know where most of my traffic is being redirected to, so I can compare it with the stats from the site that is paying me.
I just want to be able to make sure that their numbers correspond with mine.

xmonkeys 09-02-2011 01:24 PM

Quote:

Originally Posted by ManPuppy (Post 18397173)
I've done this myself in PHP for managing my own banners. Simple code. You can do it for thumbs if you want.

If you're versed in PHP, I can describe how I did it. If not, I can probably custom code it pretty quickly for your site for a reasonable sum. E-mail me if you haven't found a solution. support ~at~ manpuppy.com

Were you able to insert it in bulk, or did you have to code it one by one? I want something that I can do more in bulk than one by one.

Babaganoosh 09-02-2011 01:37 PM

Quote:

Originally Posted by xmonkeys (Post 18399961)
What I mean is, my thumbs are redirected traffic, and I want to know where most of my traffic is being redirected to, so I can compare it with the stats from the site that is paying me.
I just want to be able to make sure that their numbers correspond with mine.

Sounds like you're just looking for a trade script.

http://arrowscripts.com/

robber 09-02-2011 06:01 PM

Hmm, what controls where the clicks are going? If it's you, then all you need to do is have a file like an out.php that just logs what was clicked and then redirects the user on to the site (this is how my site deals with thumbnail clicks, except I don't track them at the moment).

It would be simple enough to do. Your link could be something like:
Code:

<a href="out.php?page=#your link#"><img src="#image source#" ... ></a>
Then your out.php would just need to be something like this (this isn't exactly how mine works, just to give you the idea):

Code:

<?php

## Connect to the DB ##
mysql_connect(#host#, #user#, #pass#);
mysql_select_db(#db name#);

## Log the click, then redirect ##

$link = mysql_real_escape_string($_GET['page']); # escape the value before putting it in the query
mysql_query("UPDATE linktrack SET count = count + 1 WHERE link = '".$link."'");

header("Location: ".$_GET['page']); # redirect using the raw value; the escaping is only for the SQL
exit();
?>

I'm sure there would be a cleaner, more efficient way of doing this; I'm just not sure what it is.

If you have any issues with this or need help customising it to fit your system, please feel free to either reply back here or drop me an e-mail at [email protected] (replace the 0's with o's)

Kind Regards

Rob

Babaganoosh 09-02-2011 06:08 PM

Yeah...don't do that. There's definitely no reason to execute a query for every click.

Zorgman 09-02-2011 08:13 PM

Check out TALSv2. www.bigdotmedia.com/tals.php

robber 09-03-2011 12:03 PM

Quote:

Originally Posted by Babaganoosh (Post 18400478)
Yeah...don't do that. There's definitely no reason to execute a query for every click.

How would you suggest approaching logging all the clicks on the links?

Babaganoosh 09-03-2011 05:57 PM

Quote:

Originally Posted by robber (Post 18401544)
How would you suggest approaching logging all the clicks on the links?

Log clicks to a flat file then execute a cron job every x minutes to import data to mysql. To open a connection, execute a query and then close the connection just to log a click would be a colossal waste of resources. Let's say you have a modest 500k clicks a day, that would be almost 6 connections per second. With a million clicks you'd be looking at just over 11 connections per second.
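
A minimal sketch of that flat file approach, building on the out.php idea earlier in the thread (the log path and line format here are just assumptions, not anything from the posts): the click handler appends one line per click and never touches the database, and a cron script picks the file up later.

Code:

<?php
// out.php -- minimal sketch of flat-file click logging (log path and line format are assumptions)
$link = $_GET['page'];

// one line per click: timestamp, visitor IP, destination
$line = time() . "\t" . $_SERVER['REMOTE_ADDR'] . "\t" . $link . "\n";

// append mode, no DB connection on the click path; a cron job imports and rotates this file
file_put_contents('/var/log/clicks/clicks.log', $line, FILE_APPEND);

header("Location: " . $link);
exit();
?>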

robber 09-04-2011 02:22 AM

Hmm, good point, I get what you mean now. This would be more realistic if the logging was built into existing queries running on the same connection, i.e. if you already needed the DB open for whatever else is happening on the site.

Thanks for your insight, it gives me something to think about.

xmonkeys 09-04-2011 01:02 PM

Babaganoosh... haven't seen you on ICQ in a while! Where have you been hiding? :-) heheh

Babaganoosh 09-04-2011 02:07 PM

I'm usually on invisible. ;)

My Pimp 09-05-2011 12:00 PM

I did not know that this was possible.

idogfy 09-21-2011 06:50 AM

Quote:

Originally Posted by Babaganoosh (Post 18402036)
Log clicks to a flat file then execute a cron job every x minutes to import data to mysql. To open a connection, execute a query and then close the connection just to log a click would be a colossal waste of resources. Let's say you have a modest 500k clicks a day, that would be almost 6 connections per second. With a million clicks you'd be looking at just over 11 connections per second.

Good point about the DB connection.

Nevertheless, I wanted to add:

If you are on low traffic, doing the query on every click won't hurt you, but it can add up quickly, as pointed out by Babaganoosh.

But if it does add up, hitting and writing to a flat file (and locking that file if you don't want to corrupt it) will probably (meaning for SURE) hurt your server performance even more than using a DB in the first place.

Neither solution is really scalable.

If you want to host it yourself / do it yourself / don't want to use a 3rd party:
=> you need memcache (or another NoSQL store) and either a cron job to save the memcache counters into the DB, or a 'garbage collector'-like mechanism, e.g. every 100 clicks save what is in memcache to the DB (see the sketch after this list)...

If you feel adventurous enough to trust an external party with your important data:
=> perhaps even better, use a URL shortener and get the stats from them.

If you feel adventurous, but not too much, and still trust big Google:
=> if you know how to do it, use Google Analytics events.
I can't post URLs, so run this search on Google and you should find how to do it:
"google analytics How do I manually track clicks on outbound links?"


Hope that may help, and that you don't mind me jumping into the thread to add my two cents when nobody asked. :2 cents::2 cents:

Babaganoosh 09-21-2011 07:39 AM

Quote:

Originally Posted by idogfy (Post 18441545)
But if it does add up, hitting and writing to a flat file (and locking that file if you don't want to corrupt it) will probably (meaning for SURE) hurt your server performance even more than using a DB in the first place.

You don't need to lock the file if you're just appending. :winkwink:

idogfy 09-21-2011 09:02 AM

Well, what would you append? I'm curious about your strategy.

But locking or not, if traffic is high enough that querying a DB would hurt the server, my experience tells me that having a single point of failure in one file will be an equivalent bottleneck (if not worse, in fact) even if you don't lock... you still need to actually write to the file, and the equivalent number of DB connections just translates into file writes (with all the concurrent threads and/or processes fighting each other).

memcached (for example) is very well suited for counters, but that's just my 2 cents,
and other client-side and/or 3rd-party solutions like Google Analytics or a URL shortener certainly solve the scalability problem by handing it to someone else. :thumbsup


I'm also interested to understand how you avoid locking. My understanding is that between the moment you open the file in append mode and the moment you write with the pointer at what you believe is the end of the file, you can very well have a race condition: someone else opens the file in between, their append lands first, and the second writer's pointer is no longer at the end of the file because of that write.
What am I missing here?

I'm interested to see your idea and way of doing it from a technical point of view, since it would give me new ideas, thanks. :helpme
What is your strategy, what do you append, and how do you update afterwards?

Babaganoosh 09-21-2011 01:12 PM

This isn't mission critical data. There's no reason to worry about the possibility of a minuscule number of clicks not being counted. I've done this with a 250k+ tgp with well over a mil clicks and didn't notice many issues.

This was just logging clicks on thumbnails so they could be ordered by productivity; data was collected every 5 minutes and batch inserted into MySQL. The numbers in my own click count and what the trade script counted were extremely close.

If I were that worried about not missing a single click ever then memcached would certainly be an option.

Babaganoosh 09-21-2011 01:32 PM

If you're using php's fwrite then according to the docs you don't need to flock when appending.

http://us3.php.net/manual/en/function.fwrite.php

Note:
If handle was fopen()ed in append mode, fwrite()s are atomic (unless the size of string exceeds the filesystem's block size, on some platforms, and as long as the file is on a local filesystem). That is, there is no need to flock() a resource before calling fwrite(); all of the data will be written without interruption.

idogfy 09-22-2011 08:01 AM

Thanks for your input.

My experience with using a file has been different from yours, so it's great to know you got good results in real life; I shall try it again next time then. (Locking aside, all the writes queued up by appending have been a killer for me in the past; file access is very often my biggest performance problem.)

So I have some questions:

- I did not get what you meant by "they could be ordered by productivity"?
- Are you not locking the file when you are reading it during your batch import?
- You say you don't mind losing some clicks, which I would agree with (if you have the volume in the first place), but even without locking, are you / could you be losing some clicks with your solution?
- How big is the file after 5 minutes, and how long does it take to import it?

Would you mind sharing some code showing your strategy: what you write, how you write it, and how you read and import it?

Thx again.

Babaganoosh 09-22-2011 09:36 AM

Quote:

Originally Posted by idogfy (Post 18443862)
Thanks for your input.

My experience with using a file has been different from yours, so it's great to know you got good results in real life; I shall try it again next time then. (Locking aside, all the writes queued up by appending have been a killer for me in the past; file access is very often my biggest performance problem.)

So I have some questions:

- I did not get what you meant by "they could be ordered by productivity"?
- Are you not locking the file when you are reading it during your batch import?
- You say you don't mind losing some clicks, which I would agree with (if you have the volume in the first place), but even without locking, are you / could you be losing some clicks with your solution?
- How big is the file after 5 minutes, and how long does it take to import it?

Would you mind sharing some code showing your strategy: what you write, how you write it, and how you read and import it?

Thx again.

What I meant by ordering by productivity was that I was displaying thumbnails according to which were clicked the most. The most clicked (or most productive) thumbnails always appeared higher in my listings, which boosted overall productivity on my site.

Before reading the files in, I would move the data files over to a temp directory for processing. I never tried to read and write at the same time. I should also mention that the thumbs were given a unique ID number, so the data files were unique to the thumb. E.g. 12345.jpg had all of its clicks logged to 12345.txt. I wasn't writing every click to the same file; I don't think that would have gone well at all.

But when I append a file in PHP I never explicitly flock the file. Even the php.net page on the function says it is unnecessary since fwrites are atomic if the handle was opened in append mode.

I suppose I could have been losing some click data but I never really cared enough to do any checking. My traffic script and my click logging script showed very similar numbers in respect to total clicks for the day so I just never worried about it. I do babysit my error logs just for fun and I never saw many errors related to the logging of clicks so I assume everything was working ok.

I don't really know how long it took to process the click data from each 5 minute period. I'm sure it was less than a minute or two even at the busiest times of the day but I can't tell you exactly how long it took.

As far as code goes, it was nothing fancy: fopen, fwrite, fclose. I just logged the IP that clicked the thumb. Same story with the importing of data: open a file, read it into an array, insert the data into a table. The only thing I did that caused any kind of overhead was removing duplicate IPs from each array. I logged both total raw and total unique clicks (so whoever submitted the thumbnail couldn't increase their position by clicking their thumb over and over - at least not more than once every 5 mins).
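
A rough sketch of that import step, going by the description above (the per-thumb file layout is from the post, but the paths, table and column names are assumptions):

Code:

<?php
// import_clicks.php -- run from cron every 5 minutes; rough sketch of the batch import
// described above (paths, table and column names are assumptions)

$live = '/var/log/clicks';            // where the click script appends 12345.txt style files
$work = '/var/log/clicks/processing'; // move files here so new clicks go to fresh files

foreach (glob($live . '/*.txt') as $file) {
    rename($file, $work . '/' . basename($file));
}

mysql_connect('localhost', 'user', 'pass');
mysql_select_db('stats');

foreach (glob($work . '/*.txt') as $file) {
    $thumb_id = (int) basename($file, '.txt');   // 12345.txt holds the clicks for thumb 12345

    // each line is one click (the IP that clicked), so raw = lines, unique = distinct IPs
    $ips    = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $raw    = count($ips);
    $unique = count(array_unique($ips));

    mysql_query("UPDATE thumbs SET raw_clicks = raw_clicks + $raw, " .
                "unique_clicks = unique_clicks + $unique WHERE id = $thumb_id");

    unlink($file);
}
?>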

Keep in mind that this was on a relatively expensive dedicated server too: multi-processor, RAID, massive RAM. I wasn't doing this on a virtual account on HostGator. The scripts were never a performance issue, except when I was trying to log each click to MySQL immediately. That sucked. Most of my server load came from Apache serving all of those thumbnails.

