There is growing concern among industry watchdogs that the numbers reported by various traffic monitors are not completely accurate. These groups claim that the various methods of measuring traffic not only conflict with one another, but can be flat-out wrong.
The high-end measurements of traffic patterns and usage are taken the same way as television ratings: through samples. And those samples just aren't doing a very good job. When you watch television, there are only so many programs and channels you can watch. On the internet, the range of sites and activities is effectively endless, so a sample of users captures far less of the whole.
Traffic monitors such as Nielsen//NetRatings and comScore rely on their own versions of sample data to determine traffic patterns and rankings. The monitored websites, often high-profile sites, frequently notice that the reported traffic, which determines their rankings, does not match their own server logs. In many cases the numbers differ substantially.
For most webmasters, this sort of thing may be a nonevent. Most websites are not affected by Nielsen's ratings, and their own traffic patterns are easily measurable through StatCounter or Google Analytics. But the differing measurement methods and traffic statistics may affect the everyday webmaster more than they realize.
The large measurement companies follow a large sample, or panel, of users. Unfortunately, not all of the sites those panelists visit are counted in the statistics. Nonmedia sites and small niche websites are often left off the list completely, or the reported traffic to these sites is badly skewed.
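The math behind that skew is easy to see in a toy model. The sketch below is a hypothetical simulation (the user counts, panel size, and reach figures are all assumptions, not any company's actual methodology): a 1% panel estimates a site's audience by scaling up the panelists who visited it. For a big site that scaling works tolerably; for a niche site, one or two panelists either way swings the estimate wildly.

```python
import random

def panel_estimate(true_reach, population=100_000, panel=1_000, seed=42):
    """Toy model: estimate each site's visitor count from a random panel.

    true_reach maps site name -> probability that a given user visits it.
    Returns {site: (true_visitor_count, panel_based_estimate)}.
    """
    rng = random.Random(seed)
    panel_ids = set(rng.sample(range(population), panel))
    results = {}
    for site, p in true_reach.items():
        # Simulate which users actually visited the site.
        visitors = {u for u in range(population) if rng.random() < p}
        # The monitor only sees visits from its panelists, then scales up.
        hits = len(visitors & panel_ids)
        results[site] = (len(visitors), hits * population // panel)
    return results

# Assumed reach: 30% for a big media site, 0.2% for a niche site.
estimates = panel_estimate({"big-media-site": 0.30, "small-niche-site": 0.002})
for site, (true_count, est) in estimates.items():
    err = abs(est - true_count) / true_count * 100
    print(f"{site}: true={true_count} estimated={est} error={err:.0f}%")
```

Run it a few times with different seeds and the big site's estimate stays within a few percent, while the niche site's estimate bounces around or drops to zero; that variance is the skew the article describes.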
But the problem that might really affect webmasters is not what the big companies are doing, but what visitors are doing. Many server-side traffic monitoring programs rely on cookies. These cookies remain on a computer for a while, but if a user clears his history and cookies, he is counted as a new visitor each time he returns to the site. This can produce dramatic swings in the numbers.
This cookie deleting has misled many webmasters. In some cases, as evidenced by two large studies, visitors deleting cookies have made traffic appear 150% larger than it really is. Census-based, or server-side, programs are also thrown off by malware, by multiple people sharing one computer, and by one person using multiple computers. In short, traffic reports aren't accurate.
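Here is a minimal sketch of how that inflation happens, using an invented `ToyAnalytics` counter (illustrative only, not the logic of any real analytics product): the server identifies a returning visitor by the cookie the browser sends back, so a visitor who arrives without one is issued a fresh ID and counted as brand new.

```python
import itertools

class ToyAnalytics:
    """Toy cookie-based unique-visitor counter."""

    def __init__(self):
        self._next_id = itertools.count(1)
        self.seen = set()  # distinct cookie IDs observed

    def record_visit(self, cookie=None):
        # No cookie sent -> treated as a brand-new visitor: issue a fresh ID.
        if cookie is None:
            cookie = f"uid-{next(self._next_id)}"
        self.seen.add(cookie)
        return cookie  # the browser would store this for the next visit

    @property
    def unique_visitors(self):
        return len(self.seen)

# One person, five visits, cookie kept between visits: counted once.
loyal = ToyAnalytics()
cookie = None
for _ in range(5):
    cookie = loyal.record_visit(cookie)

# Same five visits, but cookies cleared before each one: counted five times.
clearer = ToyAnalytics()
for _ in range(5):
    clearer.record_visit(None)

print(loyal.unique_visitors)    # 1
print(clearer.unique_visitors)  # 5
```

Same person, same five visits, a 5x difference in "unique visitors" — which is how aggressive cookie clearing across a user base can inflate apparent traffic by the margins the studies report.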
The various methods of computing traffic numbers affect websites in different ways. Large websites draw large advertising accounts, and those accounts request traffic reports from the big monitoring companies. As previously discussed, these reports can be blatantly wrong, but the website has little choice: the reports are treated as the legitimate third-party source. Even if a potential advertiser pulls from multiple sources, all of the sources can conflict with one another, and all can still be wrong.
Since advertisers pay based on traffic, this can have a very real impact on the bottom line. Presumably the effect trickles down and influences advertising rates at all levels. After all, whose traffic report can you really believe?