Noise or Music? - The Insights Blog

Hosted v Software v Hybrid tools

October 7, 2007 / Categories: Google Analytics specific, Metrics understanding, Urchin software specific / Comments: 7


My colleague Avinash recently presented his thoughts at SES San Jose on the current vendor space, including Visual Sciences, Omniture, IndexTools, ClickTracks, WebTrends and Google Analytics. As always, his talks are very engaging and thought-provoking. For me though, one slide really stood out – the idea that a HYBRID web analytics tool "can't hunt" – you need to view his presentation to follow that, but essentially the analogy is that HYBRIDs are not good as a web analytics tool. As Avinash knows, I disagree with this point of view, so I wanted to explain why here.

By HYBRID tool, what is generally meant is the combination of the page tagging technique with logfile data to produce cookie-fortified logfiles. This was discussed in a white paper before I joined Google – Web Analytics Data Sources. There are significant advantages to doing this, as shown in the diagram below. Essentially, a hybrid allows you to combine the benefits of both techniques to give you the most complete picture of visitor activity on your web site.

Diagram: Hosted v Software v Hybrid tools
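To make the cookie-fortified idea concrete, here is a minimal sketch in Python. It assumes a log layout where the web server appends the visitor's __utma cookie to each combined-format log line (the exact format, field layout and the helper name visitor_id are illustrative assumptions – real configurations vary); the cookie's second numeric field carries the page tag's unique visitor id, which is the extra data that "fortifies" the raw logfile:

```python
import re

# A hypothetical cookie-fortified log line: standard combined format with
# the Urchin/GA "__utma" cookie appended by the web server (the exact log
# layout here is an assumption -- real configurations vary).
LOG_LINE = (
    '203.0.113.7 - - [07/Oct/2007:10:15:32 +0100] '
    '"GET /whitepaper.pdf HTTP/1.1" 200 48213 '
    '"http://www.example.com/" "Mozilla/5.0" '
    '"__utma=123456789.987654321.1191749732.1191749732.1191749732.1"'
)

def visitor_id(line):
    """Pull the unique visitor id from an appended __utma cookie.

    The cookie is dot-delimited; the first field is a domain hash and
    the second is the visitor id set by the page tag.
    """
    m = re.search(r'__utma=(\d+)\.(\d+)\.', line)
    return m.group(2) if m else None

print(visitor_id(LOG_LINE))  # -> 987654321
```

With the visitor id recoverable from every log line, hits from the same browser can be stitched together across sessions – something a plain logfile (IP + user-agent) does far less reliably.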

Key HYBRID benefits over and above a page tag only system include:

  • You own the collected data in the most direct sense of the word and can therefore reprocess it at will
  • Being able to track search engine robot activity
  • All downloaded files are tracked automatically without any modification of page html content
  • Partial file downloads can be tracked e.g. partial views of PDF files
  • Error pages can be tracked automatically without any modification of page html content
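As an illustration of the robot-tracking benefit above: search engine robots do not execute JavaScript, so a page tag never sees them, but they do appear in the logfile. A minimal sketch, using an assumed (non-exhaustive) list of user-agent signatures:

```python
# Assumed user-agent substrings for illustration -- a production list
# would be longer and kept up to date.
BOT_SIGNATURES = ("Googlebot", "Slurp", "msnbot")

def is_bot(user_agent):
    """Flag a logfile hit as search engine robot activity."""
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in BOT_SIGNATURES)

hits = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (Windows; U; Windows NT 5.1)",
]
bot_hits = sum(1 for ua in hits if is_bot(ua))
print(bot_hits)  # -> 1
```

Segmenting logfile hits this way lets you report human and robot traffic separately – and monitor how thoroughly the crawlers are indexing your site.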

So a HYBRID technique offers real benefits. However, "with great power comes great responsibility" (Spider-Man!), which for a HYBRID web analytics tool means you take responsibility for:

  • Applying HYBRID software updates
  • Archiving and compressing your logfiles (which get very large very quickly)
  • Protecting end-user privacy – you have a legal responsibility to protect the privacy of your visitors and store logfile data securely.

HYBRIDs require a significant IT investment to run smoothly, which many organisations struggle to justify – hence the proliferation of page tag adoption. Nonetheless, a HYBRID method remains an effective technique for improving the accuracy of either a page tag or logfile solution.

Are you using (or have you used) a HYBRID method, or perhaps some other technique, to improve accuracy? Share your thoughts with a comment.


Comments

  1. Brian Clifton said it. Urchin 5 is a hybrid analytics tool. Its page tagging technology is referenced into the log files and the thing churns out statistics pretty well. You can have both Google Analytics and Urchin 5 running, so you effectively keep your data on your side and benefit from GA’s technologies.

    Google released a beta for Urchin 5’s upgrade (long overdue) and I’m looking for information about it.

    Granted, I have them implemented on very small websites. I do know of one client that had such a tremendous amount of traffic that he couldn’t run logfile analysis at all.

    It was like turning on a kitchen faucet and having it explode with the force of a fire hydrant. Making it work would have required serious hardware investments, so they used their own page/event tagging methods to focus on just the areas they needed.

    It would be great to see benchmark tests done on these things – if anyone knows of any attempts, do tell.

    Ernesto

  2. Hi Michael

    I too am surprised that Nielsen//Netratings and Hitwise have not made more out of the tools they acquired – Red Sheriff and Hit Dynamics respectively. Closing the loop with on-site and off-site metrics is something marketers have been crying out for such a long time.

    Can you shed some more light on your experience? Was it a lack of interest, vision or expertise that led to the lack of development?

    Brian

  3. Michael Feiner says:

    Hi Brian,

    I agree and would go further – web analytics cannot achieve everything. I often find that survey data or usability testing could provide as good a starting point as web analytics data.

    I’ve had the vision of aggregator tools ever since joining Nielsen//NetRatings in 2004. I thought (and probably still do to a certain degree) that as the proprietor of so many different online research tools, Nielsen would be in pole position to develop an aggregator tool.

    Sadly, the company failed to make the most out of the opportunity and has suffered both in the web analytics and competitive analysis markets (at least they are recouping on the latter).

    I like Omniture’s strategy of developing/acquiring analytics, survey, BT, and multivariate testing platforms.

    But with online marketing evolving at such speed, I’m not sure any company would be willing to invest in developing an aggregator tool that might, in part, become obsolete very quickly.

    Actually, I can think of one company that might. That little start up you work for… What was their name again?
    ;-)

    Thanks,
    Michael
    AEP Convert

  4. Hi Michael

    You make an excellent point. However, I don’t think any one web analytics tool can achieve everything. The metrics that impact the performance of a commercial web site include not only visitor data, but also your web server performance (uptime, download speed), as well as off-site factors such as search rankings (paid and non-paid), online reputation (buzz), off-line marketing campaigns etc. Apart from these metrics for your own web site, you also need to consider competitor activity.

    So from a marketer’s perspective, web analytics is currently just one piece of the jigsaw. Ultimately, what I think will happen is that aggregator tools will start to appear that bring all the disparate data into one place. That way, marketers will be able to overlay a print ad campaign with web visitor activity, for example.

    I am looking forward to that day…

  5. Michael Feiner says:

    Hi Brian,

    A little late into the discussion but here, nonetheless. :-)

    Interesting post – not your common Web Analytics 2.0 topic.

    Can the hybrid approach really justify itself? Is it not possible for tagging solutions to overcome some of the tracking issues?

    For example, Nedstat already measures search bot activity and reports on it separately.
    Nielsen//NetRatings tracks bandwidth, albeit they should update their categorisation.

    There are other tools that can measure bandwidth (unless, of course, you’d like to segment visitor traffic based on bandwidth – an interesting idea).

    Not ruling it out but, as you mentioned, I think the cost and resource requirements make it prohibitive at this point in time.

    Thanks,

    Michael Feiner
    AEP Convert

  6. Avinash: don’t forget that Urchin is a HYBRID tool. Although it can be configured to run in many different ways, the HYBRID approach is the only one I recommend.

    In fact, Urchin can be run in a TRIBRID way, that is streaming the same visitor data to Google Analytics and Urchin at the same time.

    [updated 18-Oct-2007: See related post on configuring GA to also collect data locally for Urchin processing – http://www.advanced-web-metrics.com/blog/2007/10/17/backup-your-ga-data-locally/ ]

  7. Brian: Hopefully I did not say “not good”.

    My belief is that they are really hard to pull off successfully. I have yet to see one that has been successfully implemented and provides actionable insights (though this could simply be my lack of exposure).

    Like packet sniffing, I think there are a number of benefits hybrid models bring to the table. At some point the market will value those benefits (which you outline nicely) enough for vendors to provide easy-to-implement, easy-to-use solutions.

    The only production-model hybrid web analytics solution has an entry price point of a quarter of a million US dollars (with base features). A Fortune 100 client of mine has been implementing it for 18 months now and they are still not done. This problem needs to be eliminated (both the implementation part and the cost).

    My suspicion is that as the Web matures we’ll see more demand (due to the benefits) and more vendors will step up to the plate.

    -Avinash.


© Brian Clifton 2015