Tag: Google Analytics

As the end of the year draws close, thoughts inevitably turn to 2011 and the challenges that may lie ahead. From a product point of view we are starting to get a feel for how the year will shape up, and so we can start to look at how our team will negotiate the (still forming) landscape.

It already looks likely that our aim (alongside keeping pace with product development) will be to get to a point where we can properly focus our efforts on structure and on ad-hoc document requests. We have access to outcome codes from the Support team, several of which specifically flag product documentation as wrong, missing, or not read at all, and we will soon have a full set of analytics from our online product documentation. Together these should put us in a much better position to prioritise those additional work streams, rather than falling into the “whoever shouts loudest” model we are currently prey to.

The analytics are powered by Google Analytics and track visits to each topic of the documentation set. The numbers should help point us to areas of the documentation that, for one reason or another, need some attention. This works both ways, of course: a high number of views indicates a lot of people using the information, but where are they going afterwards? If they head to the Support area of the website, can we presume the information isn’t correct? And those topics with little to no views, are they not used because they can’t be found?

I’m a little wary of spending too much time analysing the statistics, so initially they will be used purely to direct us to the outliers: those topics that, for one reason or another, are causing anomalies in the reported numbers. Once we smooth those out it will require a lot more deep-dive style root cause analysis which, as with everything else, will bring a fresh set of challenges and hopefully some new routes of communication with our customers.

Work

Finding the right solution for a problem isn’t always easy but sometimes, if you are very lucky, the solution will fall straight into your lap. Such was the case with our switch to Author-it even though we didn’t fully realise it at the time.

I’ve covered our reasons for switching from FrameMaker to Author-it elsewhere, and once we had converted our content we started to look at how we could get the most from the other output formats available. We already had ideas on how we could use the provided HTML-based publishing formats to provide a better solution to the problem of finding information, and we were planning to generate HTML versions of the entire documentation set to be hosted, and searchable, on our community website.

It was right about then that Author-it announced their new ‘Webhelp’ format, which would include a (very) quick search in a nice, modern-looking format. Given that one issue we were addressing was how hard it is to search across multiple PDFs (which presumes the poor reader knows which PDF they should start with), it looked like an excellent solution.

And it is.

We now host a specific build of all of our content within our developer community (which is password protected, I’m afraid, so you’ll just have to trust me), which allows developers, partners and customers to search across everything we have. However, we have had to customise the output a little to meet our needs, and this is where the hacking starts.

Let me tell you a story. In it our hero (me) fights valiantly against two JavaScript dragons called Webhelp and Google Analytics. It’s a bloody battle and at the end, when all the fighting is done, well … you’ll have to read on and find out.

Some background first.

We have a developer community website which hosts downloads of our software and all the documentation in PDF format. To make it easier for people to find information in the product documentation, we also host a Webhelp version of each and every document in one master Webhelp system so you can search across the entire thing. It works really well.

To track how the other areas of the website are used, we have a Google Analytics account and the necessary code has been added. For the Webhelp, the code is in both the index.htm and topic.htm files.
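
For the curious, that code is just the standard asynchronous Google Analytics tracking snippet of the day, dropped into the head of each page; something along these lines, with UA-XXXXX-X standing in for the real account ID:

    <script type="text/javascript">
      // Standard asynchronous Google Analytics (ga.js) tracking snippet.
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder account ID
      _gaq.push(['_trackPageview']);            // records the initial page load

      (function() {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>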

But, and this is where the story begins, it doesn’t work properly.

Google Analytics will happily track every visit to the Webhelp system, but it stops there. Any click made within the system is recorded as a click, but there is no detail on WHAT topic was viewed. We had hoped to get stats on this to allow us to better focus on the areas of the product people were enquiring about, but we are, essentially, blind.

It’s very annoying.

Why is this so? Well, I think it’s to do with the way Webhelp is created. It uses a JavaScript library called Ext JS which, amongst other things, means that every time you open a topic in the Webhelp it’s loaded through a JavaScript call, so Google Analytics never ‘sees’ a new HTML page (a new topic) being loaded and has no idea which topic you are viewing.

I think. I’m not 100% sure to be honest.
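
If that theory is right, the textbook workaround for JavaScript-driven pages like this is the ‘virtual pageview’: wherever a topic gets loaded, push an extra _trackPageview call with the topic’s path so that each topic shows up as its own page in the reports. A rough sketch of the idea, assuming the standard asynchronous ga.js snippet is already on the page (the onTopicLoaded hook is purely hypothetical; finding the right place to wire it into the generated Webhelp is exactly the bit I haven’t cracked):

    // onTopicLoaded is a hypothetical hook, called with the relative URL of the
    // topic the Ext JS viewer has just fetched and displayed.
    function onTopicLoaded(topicUrl) {
      // Record a 'virtual pageview' so the topic appears as a distinct page in Google Analytics.
      _gaq.push(['_trackPageview', '/webhelp/' + topicUrl]);
    }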

I’ve logged a somewhat vague Support call with Author-it, and have enlisted the help of our own webmaster. Next step will be to beg and plead with some of the developers for some of their brain power (most of them have a fair bit to spare).

It’s hugely annoying, being so close to what we want but not able to fix it myself, but sometimes you just have to admit defeat.

Of the battle, that is. I WILL win the war!

Work

That better documentation lowers support calls is a widely held assumption, and one I’m hoping to prove in the coming months. With our new knowledge centre in place, and Google Analytics tracking how many people are visiting it, I’ll soon have stats for my side of the fence.

Early numbers (from the past two weeks) show that more people are looking at the Documentation area of our website than at the Support area, but the knowledge centre (part of the Documentation area) is new, so that’s only to be expected, and I’m really not expecting to get a true picture of how things are going until late January next year.

Fingers crossed.

With thanks to Rachel Potts for her post on what web analytics can do for technical communications.

Work

Access

Yesterday we launched a new version of our developer community website. It doesn’t have many ‘community’ features as yet but that’s all to come. One thing it does now have is an HTML version of all of our product documentation, in an easily searchable format.

It’s no coincidence that it looks very much like the Author-it Knowledge Center as it too was built using Author-it (alas I can’t show you our website as it requires a login).

This new format of the product documentation is largely there to move us away from PDF-only documentation. At present we still have a set of PDFs, but they aren’t particularly usable.

We ran a few quiet trials of the knowledge centre format and everyone who saw it liked it, particularly the fact you can search across every piece of information offered.

So I was definitely pleased when, after sending out a company-wide announcement about the new version of the website, highlighting some of the new features, one of the first pieces of feedback I received was about the knowledge centre and how good it was. For the, as the kids say, win!

The knowledge centre will be updated on a regular basis, so my next challenge is to figure out a way to embed RSS notifications for new/updated topics. But so far so good, and with Google Analytics in place in the knowledge centre, we can continue to make improvements to both the information itself and in making sure it is accessible.

It’ll be interesting to see how the knowledge centre is used, particularly if we manage to track it against the number of incoming support calls, as the main reason we are adopting this format of information is that, many times, the answers were there, they just weren’t that easy to find.

Tech

A boring post about website statistics follows. Feel free to scroll on down to the next post which may, or may not be more entertaining. What? You want a LINK to the next post? You lazy bugger…

Last month (or was it the month before?) I asked you for recommendations for a new stats package for this site. I had used Extreme Tracker and SiteMeter for a while and had always got inconsistent results and timeouts, and was generally unimpressed. Now, whilst I’ll happily admit to being a bit of a stats whore, long gone are the days when I care how many hits I get; it’s much more fun seeing where you all come from. That’s not to say the numbers aren’t useful… and they are most certainly welcomed.

As an aside, the casualties of my constant hunt for a decent stats package, not to mention my all too frequent changing of hosts (from … umm… something to LineOne, to Telewest, to 1and1, and now currently with 34sp), coupled with my non-existent archiving notions, mean that I have no earthly idea how many people, in total, have visited my website since it made its thunderous arrival on the interweb… ok, it was a tiny squeak, barely a ripple, a trend which continues today.

Taking some of your suggestions, I have been running with StatCounter, Google Analytics, MeasureMap and MyBlogLog for a while now. Add in my host’s own stats and I’ve got far too much information to access and process, so let the cull begin! But first a little comparison to see which one best suits my needs.

Accuracy
Roll back to the first week in March (actually from the 27th February to the 4th of March) and here are the numbers each stats package gave me:

Package             Unique Hits   Returning visitors
StatCounter         1665          549
Google Analytics    1278          627
MyBlogLog           1144          N/A
MeasureMap          989           ~445

I’m not including the stats offered by my host as they are HUGELY different, the terminology is a bit cack, and I can’t quite figure out how to analyse them… not that important anyway, as I don’t need to do anything to collect those stats, they’re just there.

As you can see, the numbers vary quite a bit, and whilst Google Analytics and MyBlogLog are close enough on the Unique Hits, the difference between them and MeasureMap is pronounced, doubly so when you look at what StatCounter thinks.

Opinion
It’s one thing being smart enough to collate data, but it seems it’s quite another thing altogether to be smart enough to display that data in a meaningful, easy to read way. I could spend hours deconstructing each package but I just don’t have the time, nor the energy. Suffice to say that:

  • StatCounter – not too bad, uses real English, and only suffers because it isn’t completely free (my, what a world we live in).
  • Google Analytics – awful. Slow to update, a complete bear to use and far too complicated for the likes of me. To be fair though, it’s not AIMED at the likes of me (although that’s no excuse for shoddy UI and meaningless terminology).
  • MyBlogLog – strictly speaking this isn’t really a stats package per se. Its primary aim is to let you see which links people have used to leave your site. And it does a bang up job of doing just that. Recommended.
  • MeasureMap – clean, colourful and simple. Too simple really, as it lacks weekly and monthly views, crucial if you want to see trends. But that should be balanced against the fact it is soft on the eye and easy to use.
  • 34sp – the stats are achingly complete. Alas they suffer from meaningless terminology syndrome, making all that data practically useless. Unless, of course, I am reading it correctly, and I DID receive over 20,000 visitors in the first week of March.

I should mention that MeasureMap was recently bought by Google, meaning that either Google Analytics will benefit from having Mr. Veen on board, or MeasureMap will benefit from having the backing of Google. Or both, as the two products are aimed at different markets.

Anyway, based on the above, I’ve dropped Google Analytics. StatCounter has become my daily stat check location, and I know that my 34sp Stats are churning away in the background if I want a really detailed look at things (I’ve not looked at them since before Xmas mind you). MeasureMap I’ll stick with for a while and see what influence Google brings, and MyBlogLog keeps on doing exactly what it says on the tin.

Sorted.

Until I spotted Performancing Metrics (which does look pretty good). Back to the drawing board?

Tech Work
