What Site Stats Are Accurate? Google Analytics Or Server Logs?
Posted 28 January 2012 - 12:56 PM
Other places base decisions on Alexa stats, which are not really accurate data.
Some companies want just Google Analytics data. We collect GA data and it's showing at least a one-third difference from our server logs. Again, going by GA would give incorrect numbers for page views, unique visits, etc.
What do you rely on? GA data only or your server logs? Which is more accurate and fair?
Posted 28 January 2012 - 01:03 PM
I don't remember how Clicky compares to GA in this regard.
Posted 28 January 2012 - 02:14 PM
* many bots are impossible to detect by user agent string and need to be detected/recognised by (1) IP and/or (2) behaviour.
* a forum is rather unique in that some/many users visit multiple times in a time period and this needs to be accounted for.
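The two bot-detection signals above (IP and behaviour), plus the user-agent check they supplement, can be sketched roughly as follows. The UA fragments, the sample network range, and the rate threshold are illustrative assumptions, not a vetted bot list.

```python
# Sketch of three bot-detection signals: user-agent string, known IP
# range, and behaviour (request rate). Thresholds and lists are assumed.
import ipaddress
from collections import defaultdict

BOT_UA_FRAGMENTS = ("bot", "crawler", "spider")          # assumed UA patterns
BOT_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]  # e.g. a Googlebot range
MAX_HITS_PER_MINUTE = 30                                 # assumed behaviour threshold

def classify(requests):
    """requests: iterable of (ip, minute_bucket, user_agent) tuples.
    Returns the set of IPs judged to be bots."""
    bots = set()
    rate = defaultdict(int)
    for ip, minute, ua in requests:
        if any(frag in ua.lower() for frag in BOT_UA_FRAGMENTS):
            bots.add(ip)  # signal 1: user-agent string
        elif any(ipaddress.ip_address(ip) in net for net in BOT_NETWORKS):
            bots.add(ip)  # signal 2: known IP range
        rate[(ip, minute)] += 1
    for (ip, minute), hits in rate.items():
        if hits > MAX_HITS_PER_MINUTE:
            bots.add(ip)  # signal 3: behaviour (request rate)
    return bots
```

The behaviour check here is the crudest possible (raw request rate); real filtering would also look at things like robots.txt fetches, missing referrers, and session depth.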
Selling direct advertising is never easy and always painful
Posted 28 January 2012 - 03:24 PM
Posted 28 January 2012 - 05:42 PM
Posted 29 January 2012 - 01:13 PM
Like, I think page views are perfect for branding but unique visitors would be better for product ads.
Posted 29 January 2012 - 02:56 PM
Posted 29 January 2012 - 03:48 PM
There are several methods of gathering statistics that can be used for reporting site traffic and usage. The top three are analyzing web server access logs, third-party tracking requests, and third-party trackware. Each has pros and cons in the accuracy of the data gathered and the depth of detail.
Using the log files recorded directly by your web server.
Pros - The greatest depth of detail of nearly 100% of every page or file requested from your web server, no middleware required
Cons - Too much detail, requires tools like “Weblog Expert” or deep knowledge of web servers to parse out what you want from what you don’t
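As a minimal illustration of that parsing work, here is a sketch that pulls page views and unique visitor IPs out of a raw access log in the Apache/nginx "combined" format. The page-vs-asset heuristic is a deliberately crude assumption; real logs need far more scrubbing (bots, errors, redirects).

```python
# Minimal sketch: count page views and unique IPs from a "combined"
# format access log, instead of using a tool like Weblog Expert.
import re

LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def summarise(log_lines):
    views, uniques = 0, set()
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m or m.group("status") != "200":
            continue  # skip unparseable lines and non-200 responses
        # Crude heuristic: skip images/CSS/JS, keep pages and directories.
        last = m.group("path").rsplit("/", 1)[-1]
        if "." in last and not last.endswith(".html"):
            continue
        views += 1
        uniques.add(m.group("ip"))
    return views, len(uniques)
```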
Using a third party request to track pages that have tracking code embedded in the page
Pros - Third party tracks and reports on the details sent from the tracking code, third party typically maintains greater expertise in reporting and tools
Cons - More points of possible data loss, requires the page to have the proper tracking code embedded, requires the page to load the tracking request on the client (browser), requires the client to have no impairments to the third party tracking server (firewall or blocking software)
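To make those data-loss points concrete: the embedded snippet makes the browser request a beacon URL carrying page details, and the third party records whatever arrives. If the snippet is missing, the JavaScript never runs, or the tracking server is blocked, the hit is simply never recorded. The sketch below shows the receiving side; the parameter names (page, ref, sid) are invented for illustration, and real services like GA define their own.

```python
# Sketch of what a third-party tracker receives from an embedded beacon
# request. Parameter names here are hypothetical, not any real service's.
from urllib.parse import urlsplit, parse_qs

def record_beacon(beacon_url, client_ip, user_agent):
    qs = parse_qs(urlsplit(beacon_url).query)
    return {
        "page": qs.get("page", [""])[0],   # page that embedded the code
        "referrer": qs.get("ref", [""])[0],
        "session": qs.get("sid", [""])[0],
        "ip": client_ip,                    # seen server-side, not sent by JS
        "ua": user_agent,
    }
```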
Using third party client software to record and report client (browser) activity
Pros - Records a greater depth of activity outside of your site as well as more demographics about the user and user behavior
Cons - Requires the client software to be functional, smaller sample set of all your site's visitors, user base or demographic may not represent typical visitors to your site, the software could be viewed as malicious by antivirus or antispyware software
Each has its use and there is no reason to stick with only one method as you will get a greater understanding from different perspectives, keep in mind the source and always question the validity of any data.
Posted 29 January 2012 - 05:31 PM
Any members selling ad space on their sites? What stats do you offer potential advertisers?
I hesitate to answer because I operate rather differently, certainly at a different level than most sites. However....here goes
Some basic but oft overlooked points:
* each vertical - and sometimes even niches within a vertical - has different base advertising requirements/expectations.
* further, within that vertical there are local, regional, national requirements/expectations differences.
* not all categories or even pages within categories deliver the same value to a given advertiser's campaign goals.
* each ad size and format type has a different response rate so differing price points.
* each page has limited real estate available and not all spots are created equal.
* how the ad (link, text, image, multi-media, combo) connects the visitor directly/indirectly to the advertiser can significantly affect conversion rates.
* how an ad is considered 'read' (primarily CPM or CTR) affects pricing.
* where the site ranks in comparison to competitors in traffic volume generally.
* determining your audience market segmentation and how each tends to convert on the web and, if you have the data, on your site.
What this means is that setting ad pricing is not simple plug-and-play at some set rate - except as a bog-standard default.
To maximise ad revenue one needs to transform a lot of data into a solid sales presentation. And then not undervalue what you have to offer.
At the end of the day it comes down to (1) how well you sell the advertiser and (2) how well your visitors perform for that advertiser.
Note: it is always better to under promise and over perform.
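The CPM-vs-CTR pricing point above can be made concrete with a little arithmetic: the same slot can be quoted per thousand impressions (CPM) or per click (CPC), and the break-even between the two depends entirely on the slot's click-through rate. All numbers below are made up for illustration, not market data.

```python
# Revenue for the same ad slot under two pricing models.
def revenue_cpm(impressions, cpm):
    """Flat rate per thousand impressions."""
    return impressions / 1000 * cpm

def revenue_cpc(impressions, ctr, cpc):
    """Pay per click: impressions * click-through rate * price per click."""
    return impressions * ctr * cpc

# Example: 100,000 impressions at a $2 CPM earns the same as a
# 0.2% CTR slot sold at $1 per click ($200 either way), which is
# why response rate drives the price point per format and position.
```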
What I share with advertisers initially are some general stats:
* visitor market segmentation, with emphasis on advertiser's preferred target segment(s).
* site traffic gross and net (minus bots).
* category net traffic and impressions (page views).
* category average ad conversion rates by type, size, position.
* visitor segmentation conversion rates.
* exemplified site advertising campaigns.
* exemplified pricing matrix.
At the final customised presentation:
* specific offering by page and ad format, size, and position.
Note: I do not, but for those who do: if rotating various advertisers in a position, include individual ad frequency.
* include breakout of page match quality.
Note: I use a three-level scale, each with three sub-levels;
---A: specific match to ad content, demographic target.
---B: broad/general/associated matches
Note: that's another 9 pricing differences...
* include viewer number and/or conversion guarantees where appropriate.
* everything else that goes in a contract.
I've left a lot out - some because I'm selfish, and some because I missed it. A site marketing plan and revenue model are the guts of the business and therefore as simple or as complex as the target advertisers expect and the webdev can imagine.
Posted 30 January 2012 - 06:10 PM
However, server-side logs have to be scrubbed, or you have to actively identify and block rogue crawlers, or they'll inflate your page views. I have pretty much given up trying to use server-side logs for monthly traffic analysis because cloud services like Amazon have flooded the Web with crawlers from around the world.
I just tried to document the AWS IP address blocks again today and ended up with a couple hundred lines. Somehow, Google Analytics seems to filter out a lot of the crawlers. I just wish it would track and report all the legitimate traffic too but that is probably asking too much.
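Rather than hand-maintaining those hundreds of lines, one option is to check log IPs against the machine-readable range list AWS publishes at https://ip-ranges.amazonaws.com/ip-ranges.json. The sketch below hardcodes two sample prefixes just to show the file's shape; in practice you would download and cache the full list.

```python
# Sketch: flag log IPs that fall inside AWS's published address ranges.
# The two prefixes below are a hardcoded sample of ip-ranges.json's shape;
# the real file is downloaded from ip-ranges.amazonaws.com.
import ipaddress

AWS_RANGES = {
    "prefixes": [
        {"ip_prefix": "52.95.0.0/16", "service": "AMAZON"},
        {"ip_prefix": "54.240.0.0/18", "service": "AMAZON"},
    ]
}

NETWORKS = [ipaddress.ip_network(p["ip_prefix"]) for p in AWS_RANGES["prefixes"]]

def is_aws(ip):
    """True if the address falls in any listed AWS prefix."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in NETWORKS)
```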