But the blogger's world can be lonelier than the forum owner's. In part this has to do with stats.
'Site stats' purport to show how many visitors your site is getting. In theory, each time a hit is registered, it means a denizen of the WWW has clicked on your site to read it. Unfortunately, that's not the whole story. It's a bit complicated, but there are two main methods of obtaining site stats: logfile analysis and page tagging. Both have advantages and disadvantages, but the site owner wishing to sell advertising on the back of his stats is not going to concern himself with those. He should, because they matter a great deal, not least because of what is known, rather ironically, as the 'hotel problem'.
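To see why the raw numbers flatter, here is a minimal sketch of the logfile-analysis approach, assuming a standard combined-format web server access log. The filename and the crude rule of 'one IP address equals one visitor' are illustrative assumptions, not how any particular analytics package actually works.

```python
# A minimal sketch of logfile analysis, assuming a standard
# Apache/Nginx 'combined' format access log. The filename and the
# crude rule of 'one IP address = one visitor' are illustrative
# assumptions, not how any real analytics package works.
hits = 0
unique_ips = set()

with open("access.log") as log:      # hypothetical log file
    for line in log:
        parts = line.split()
        if len(parts) < 7:
            continue                 # skip malformed lines
        ip = parts[0]
        hits += 1                    # every request is a 'hit'
        unique_ips.add(ip)           # crudely, one IP = one 'visitor'

print(f"hits: {hits}")
print(f"apparent unique visitors: {len(unique_ips)}")
```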
Now, the hotel problem. The way to picture the situation is by imagining a hotel with two rooms (Room A and Room B), and keeping track of who sleeps in them over three days:

            Room A     Room B
   Day 1    Guest 1    Guest 2
   Day 2    Guest 1    Guest 3
   Day 3    Guest 2    Guest 3
As the table shows, the hotel has two unique users each day over the three days, so the sum of the daily totals is six.
Over the period as a whole, each room has had two unique users, so the sum of the room totals is four.
Yet only three visitors have actually been in the hotel over this period. The problem is that a person who stays in a room for two nights is counted twice if you count them once on each day, but only once if you are looking at the total for the whole period. Any decent web analytics software will calculate unique visitors correctly for whichever period you ask about, and that is precisely why the figures refuse to add up when someone tries to compare the daily totals with the total for the period.
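If you prefer to see the arithmetic done mechanically, here is a short Python sketch using the illustrative guests and rooms from the table above; it simply counts the same stays three different ways.

```python
# Illustration of the 'hotel problem': summing daily unique counts,
# or per-room unique counts, overstates the true number of unique
# visitors for the whole period.
occupancy = {
    "Day 1": {"Room A": "Guest 1", "Room B": "Guest 2"},
    "Day 2": {"Room A": "Guest 1", "Room B": "Guest 3"},
    "Day 3": {"Room A": "Guest 2", "Room B": "Guest 3"},
}

# Unique guests per day, then summed: 2 + 2 + 2 = 6
sum_of_daily_uniques = sum(len(set(day.values())) for day in occupancy.values())

# Unique guests per room over the period, then summed: 2 + 2 = 4
sum_of_room_uniques = sum(
    len({day[room] for day in occupancy.values()}) for room in ("Room A", "Room B")
)

# Unique guests over the whole period: 3
true_uniques = len({guest for day in occupancy.values() for guest in day.values()})

print(sum_of_daily_uniques, sum_of_room_uniques, true_uniques)  # 6 4 3
```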
This is only one problem among many for users of web stats, and forums are particularly prone to order-of-magnitude errors when they try to use stats to sell advertising. A forum with only twenty members can produce stats showing in excess of 20,000 hits per month when, in reality, the stat system is counting every page served, every arrival and every lookup as though each were a unique visitor.
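The arithmetic is easy to reproduce. The figures below for visits, pages and requests per page are assumptions chosen purely for illustration, not measurements from any real forum, but they show how quickly twenty people become twenty thousand 'hits'.

```python
# Rough, illustrative arithmetic only: the visit frequency and the
# number of requests per page view are assumed figures, not data
# from any real forum.
members = 20
visits_per_member_per_day = 3    # checking in a few times a day
pages_per_visit = 4              # index, a thread or two, reply form
requests_per_page = 3            # the page itself plus stylesheet and images
days = 30

hits_per_month = (members * visits_per_member_per_day
                  * pages_per_visit * requests_per_page * days)
print(hits_per_month)            # 21,600 'hits' from just twenty people
```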
Then there are 'bots.
Many search engines and many companies routinely use 'netbots': programs that trawl the internet. Internet bots, also known as web robots, WWW robots or simply bots, are software applications that run automated tasks over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone. The largest use of bots is in web spidering, in which an automated script fetches, analyses and files information from web servers at many times the speed of a human. Unfortunately, many stats counters don't differentiate between an electronic visitor and a human one.
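Serious log analysers do try to screen this traffic out, typically by looking at the user-agent string each request announces. The sketch below shows the idea; the marker substrings and sample user agents are a small illustrative selection, not an exhaustive or authoritative list, and plenty of bots announce themselves as ordinary browsers anyway.

```python
# A crude sketch of separating bot requests from human ones by
# user-agent string. The markers below are a small illustrative
# sample; real bots are far more varied, and some masquerade as
# ordinary browsers.
BOT_MARKERS = ("bot", "crawler", "spider", "slurp")

def looks_like_a_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

requests = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/115.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (compatible; bingbot/2.0)",
]

human = [ua for ua in requests if not looks_like_a_bot(ua)]
print(f"{len(requests)} requests, {len(human)} apparently human")
```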
This is why making money from an internet forum is usually a forlorn hope, unless you're in the same league as Google or the Japanese. Site statistics showing apparent 'hits' can be all but meaningless.