Web Site Design, Usability, SEO & Marketing Discussion and Support


What Site Stats Are Accurate? Google Analytics Or Server Logs?


There are some interesting things I've learned lately since getting into advertising publishing. The forum was turned down by one of the better-known ad publishing companies because they claimed the forum didn't get enough traffic. They don't say where they got their numbers from, but at nearly 8 million page views in 2011 (excluding spiders), they are wrong.


Other places base decisions on Alexa stats, which are not really accurate data.


Some companies want just Google Analytics data. We collect GA data and it's showing at least a 1/3 difference from our server logs. Again, going by GA would be incorrect regarding page views, unique visits, etc.


What do you rely on? GA data only or your server logs? Which is more accurate and fair?


For typical numbers, I use server logs that track the usual visitor data. For analysis, I use web analytics (GA or Clicky).


I don't remember how Clicky compares to GA in this regard.


Many/most companies are incredibly ignorant. Not that one can actually tell them so. :)


Most analytics programs rely on JavaScript and have certain (and different) built-in default biases, so their stats are never really accurate. Server logs are by far the best source, but they do have to be 'cleansed', i.e. of bots, and that raises the question of (1) how you are sure that you got, if not all, at least most of them, and (2) how you read/associate the remainder.


For example:

* many bots are impossible to detect by user agent string and need to be detected/recognised by (1) IP and/or (2) behaviour.

* a forum is rather unique in that some/many users visit multiple times in a time period and this needs to be accounted for.
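A minimal sketch of that two-pronged cleansing. All of the log lines, the UA pattern, and the request-rate threshold below are made up for illustration, not a production-grade filter:

```python
import re
from collections import Counter

# Made-up sample of combined-format access log lines.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2011:13:55:36 +0000] "GET /forum HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '203.0.113.7 - - [10/Oct/2011:13:55:40 +0000] "GET /forum HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '203.0.113.7 - - [10/Oct/2011:14:10:02 +0000] "GET /topic/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '198.51.100.200 - - [10/Oct/2011:14:11:00 +0000] "GET /forum HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
] + [
    # One IP hammering 31 pages with a browser-like UA -- no bot string to match.
    f'198.51.100.9 - - [10/Oct/2011:13:56:{i % 60:02d} +0000] "GET /p{i} HTTP/1.1" 200 512 "-" "Mozilla/5.0"'
    for i in range(31)
]

LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"')
BOT_UA = re.compile(r'bot|crawl|spider|slurp', re.IGNORECASE)

def classify(lines, max_hits_per_ip=20):
    parsed = [m for line in lines if (m := LINE.match(line))]
    hits = Counter(m['ip'] for m in parsed)
    human, bots = [], []
    for m in parsed:
        # (1) user-agent string, then (2) behaviour (abnormal request rate).
        if BOT_UA.search(m['ua']) or hits[m['ip']] > max_hits_per_ip:
            bots.append(m)
        else:
            human.append(m)
    return human, bots

human, bots = classify(LOG_LINES)
visits = len(human)
uniques = len({m['ip'] for m in human})
print(f"{visits} human page views from {uniques} unique visitors; {len(bots)} bot hits")
```

The forum point shows up here too: the two hits from 203.0.113.7 collapse into one unique visitor, so visits and uniques have to be reported as separate numbers.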


Selling direct advertising is never easy and always painful :(:D


A friend who sells ads provides just unique visitors and page views per visitor. The art, it seems, is in the selling, of course.


Server, because it logs everything; analysing it is the problem.


Any members selling ad space on their sites? What stats do you offer potential advertisers? What are advertisers looking for?


Like, I think page views are perfect for branding but unique visitors would be better for product ads.


Teach me! :infinite-banana:


The only thing I've ever had advertisers ask for is # of ad impressions. So, that might equal page views, if the ad is shown once on every page. But if the ad shares space in a rotation, or is only on certain pages, then that wouldn't apply. In any case, the general question seems to be "how many times will my ad be shown each period?"
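That period question reduces to simple arithmetic. The function and numbers below are purely illustrative (the 666k/month figure is just the thread's ~8M/year divided by 12):

```python
def monthly_impressions(page_views, rotation_share=1.0, coverage=1.0):
    """Estimate ad impressions per period from page views.

    rotation_share: fraction of rotations the ad occupies in its slot.
    coverage: fraction of pages that carry the slot at all.
    """
    return round(page_views * coverage * rotation_share)

# ~8M page views/year is roughly 666k/month; an ad in a three-way
# rotation on 80% of pages sees far fewer impressions than that:
print(monthly_impressions(666_000, rotation_share=1/3, coverage=0.8))
```

With full coverage and no rotation, impressions equal page views; every shared slot or partial placement scales the number down.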


I just had a long conversation last week regarding tracking users and usage of the sites that my company hosts. Thought I’d just hit the highlights of that conversation considering its relevance to the topic.


There are several methods of gathering statistics that can be used for reporting site traffic and usage. The top three are analyzing web server access logs, third-party tracking requests, and third-party trackware. Each has pros and cons in the accuracy of data gathered and depth of detail.


Access logs:

Using the log files recorded directly by your web server.


Pros - The greatest depth of detail, covering nearly 100% of every page or file requested from your web server; no middleware required

Cons - Too much detail, requires tools like “Weblog Expert” or deep knowledge of web servers to parse out what you want from what you don’t


Tracking requests:

Using a third party request to track pages that have tracking code embedded in the page

Pros - Third party tracks and reports on the details sent from the tracking code, third party typically maintains greater expertise in reporting and tools

Cons - More points of possible data loss, requires the page to have the proper tracking code embedded, requires the page to load the tracking request on the client (browser), requires the client to have no impairments to the third party tracking server (firewall or blocking software)
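Those loss points compound multiplicatively, which is one plausible way a gap like the 1/3 difference mentioned earlier opens up between logs and a JavaScript tracker. The per-stage rates below are invented for illustration, not measurements:

```python
# Invented per-stage success rates for a client-side tracking request.
stages = {
    "page actually has the tracking code": 0.98,  # a template misses the snippet
    "client executes JavaScript":          0.97,  # JS disabled, ancient browsers
    "tracker script loads in time":        0.93,  # timeouts, users leaving early
    "request reaches the third party":     0.90,  # firewalls, blocking software
}

recorded = 1.0
for stage, rate in stages.items():
    recorded *= rate  # every stage must succeed for the hit to be counted

print(f"fraction of real traffic the third party ever sees: {recorded:.1%}")
```

Even with each stage succeeding 90%+ of the time, the third party ends up seeing only about four hits in five.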



Trackware:

Using third-party client software to record and report client (browser) activity

Pros - Records a greater depth of activity outside of your site as well as more demographics about the user and user behavior

Cons - Requires the client software to be functional, smaller sample set of all your site's visitors, user base or demographic may not represent typical visitors to your site, the software could be viewed as malicious by antivirus or antispyware software


Each has its use, and there is no reason to stick with only one method, as you will get a greater understanding from different perspectives. Keep in mind the source and always question the validity of any data.


Any members selling ad space on their sites? What stats do you offer potential advertisers?



I hesitate to answer because I operate rather differently, certainly at a different level than most sites. However....here goes :)


Some basic but oft overlooked points:

* each vertical - and sometimes even niches within a vertical - has different base advertising requirements/expectations.


* further, within that vertical there are local, regional, and national differences in requirements/expectations.


* not all categories or even pages within categories deliver the same value to a given advertiser's campaign goals.


* each ad size and format type has a different response rate, and so differing price points.


* each page has limited real estate available and not all spots are created equal.


* how the ad (link, text, image, multi-media, combo) connects the visitor directly/indirectly to the advertiser can significantly affect conversion rates.


* how an ad is considered 'read' (primarily CPM or CTR) affects pricing.


* where the site ranks in comparison to competitors in traffic volume generally.


* determining your audience market segmentation and how each tends to convert on the web and, if you have the data, on your site.
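To make the CPM-versus-CTR pricing point concrete, here is a toy comparison; the rates and the click-through rate are invented for illustration:

```python
def revenue_cpm(impressions, cpm):
    """Flat rate per thousand impressions."""
    return impressions / 1000 * cpm

def revenue_cpc(impressions, ctr, cpc):
    """Paid per click: impressions * click-through rate * cost per click."""
    return impressions * ctr * cpc

imps = 100_000
print(revenue_cpm(imps, cpm=2.00))            # flat $2 CPM
print(revenue_cpc(imps, ctr=0.004, cpc=0.50)) # $0.50/click at 0.4% CTR
```

At these invented numbers the two models break even; any spot whose real CTR beats cpm / (1000 * cpc) earns more when sold per click, which is why format and position change the sensible price point.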


What this means is that setting ad pricing is not a simple plug and play at some set rate - except as bog standard default.


To maximise ad revenue one needs to transform a lot of data into a solid sales presentation. And then not undervalue what you have to offer.


At the end of the day it comes down to (1) how well you sell the advertiser and (2) how well your visitors perform for that advertiser.

Note: it is always better to under promise and over perform.


What I share with advertisers initially are some general stats:

* visitor market segmentation, with emphasis on advertiser's preferred target segment(s).

* site traffic gross and net (minus bots).

* category net traffic and impressions (page views).

* category average ad conversion rates by type, size, position.

* visitor segmentation conversion rates.

* example site advertising campaigns.

* example pricing matrix.


At the final customised presentation:

* specific offering by page and ad format, size, and position.

Note: I do not, but for those who do: if rotating various advertisers in a position, include individual ad frequency.

* include breakout of page match quality.

Note: I use three levels, each with three sub-levels;

---A: specific match to ad content, demographic target.

---B: broad/general/associated matches

---C: remainder.

Note: that's another 9 pricing differences...

* include viewer number and/or conversion guarantees where appropriate.

* everything else that goes in a contract.
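Those three levels with three sub-levels amount to a multiplier grid over a base rate. The base CPM and multipliers below are invented placeholders, not the actual figures:

```python
base_cpm = 2.00
level_mult = {"A": 1.5, "B": 1.0, "C": 0.6}   # match to ad content / demographic target
sub_mult = {"1": 1.25, "2": 1.0, "3": 0.8}    # quality within the level

# Cross the two multiplier sets to get the nine price points.
matrix = {level + sub: round(base_cpm * lm * sm, 2)
          for level, lm in level_mult.items()
          for sub, sm in sub_mult.items()}
print(matrix)  # A1 (best match) down to C3 (remnant)
```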


I've left a lot out - some because I'm selfish, and some because I missed it. :) A site marketing plan and revenue model are the guts of the business, and therefore as simple or as complex as the target advertisers expect and the webdev can imagine.


Google Analytics is the lowest common denominator. It's not very accurate on a high-volume website.


However, server-side logs have to be scrubbed, or you have to actively identify and block rogue crawlers, or they'll inflate your page views. I have pretty much given up trying to use server-side logs for monthly traffic analysis because cloud services like Amazon have flooded the Web with crawlers from around the world.


I just tried to document the AWS IP address blocks again today and ended up with a couple hundred lines. Somehow, Google Analytics seems to filter out a lot of the crawlers. I just wish it would track and report all the legitimate traffic too but that is probably asking too much.
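For the AWS case specifically, Amazon publishes its address ranges as JSON at ip-ranges.amazonaws.com/ip-ranges.json. A sketch of checking hits against it with the standard library; the three prefixes here are a hardcoded illustration, where a real filter would download and refresh the full list:

```python
import ipaddress

# Hardcoded sample prefixes for illustration only -- the live ip-ranges.json
# runs to hundreds of entries and changes over time.
AWS_PREFIXES = [ipaddress.ip_network(p) for p in (
    "54.240.0.0/18",
    "72.21.192.0/19",
    "205.251.240.0/22",
)]

def is_aws(ip: str) -> bool:
    """True if the address falls in any of the listed AWS prefixes."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in AWS_PREFIXES)

print(is_aws("54.240.1.10"))  # inside 54.240.0.0/18
print(is_aws("203.0.113.7"))  # documentation range, not AWS
```

Matching on published CIDR blocks like this catches cloud-hosted crawlers that spoof browser user agents, which UA filtering alone misses.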

