Thoughts on the HATT Survey

Tom Johnson has taken a look at the help authoring survey recently published on the HATT matrix website and, by pulling in the results of some other surveys in the same area, has drawn some good conclusions from them.

He rightly points out that surveys need to be taken with a pinch of salt (he goes into detail about why this is so), and that whilst the sample sizes seem high enough, the questions themselves will likely need further consideration in future.

That said, there are two things I took from his post.

1. Know the problem before picking the tool
You may not be in a position to switch authoring tools, but if you are, and you have investigated the market, then please make sure you are buying a tool that addresses the problems you actually have.

The presumption here is that if you have a legacy tool (like we currently do, FrameMaker 7.1) and it still works and meets your requirements, then there is no good reason to upgrade. I’ve fallen victim to the ‘keeping up’ frenzy that software manufacturers like to generate, but once a product is reasonably mature it is likely to have most of the features you need already.

I’d offer Microsoft Word as an example here: I could probably still use Word 2.0 for the few documents I maintain in that format, as the newer versions add functionality I don’t need (and which has ended up intruding on my workflow at times!).

The X-Pubs conference a couple of years ago had a common, if unpublicised, theme. Almost all of the presentations included the advice to figure out what problems you have before deciding IF single sourcing (using XML as the base format) will help, and that’s even before you consider the tools themselves.

2. DITA is still a theory
Whilst it is true that the number of people leveraging DITA for their content is rising, the numbers remain low.

Partly that will be due to the fact that few organisations/teams/people are in a position to switch quickly just because a new technology has come along but, and I’ve said this before (in fact I’ve said that I’ve said this before!), rolling out DITA remains harder than rolling out a bespoke authoring tool.

When costing the implementation of a new tool there are various factors, and it’s easy to see that you can get MadCap Flare up and running quickly, whereas a DITA-based solution takes investment in developing the environment. This is beginning to change but, as yet, the phrase ‘DITA support’ really only means that you can output to a DITA-formatted XML file. The tools aren’t constructed around the DITA concepts, so you immediately lose a lot of the benefits that DITA can bring.

Until there is a tool that fully leverages DITA, building it into the workflow of using the tool and helping the concepts become part of your daily working practice, it will continue to be a marginal player.

Which, in a way, is how it should be. DITA is not a tool, it is a technology and methodology. It is there to support the toolset and the writer. It’s just a shame that tool vendors continue to believe that THEIR format is best, refusing to budge from that position and shoe-horning ‘DITA-esque’ features into their software.

Anyway, the rest of the survey write up is interesting and worth a read but, as Tom says:

“I do love these surveys, though; if for no other reason than they give us something to talk about”

Back to DITA?

I’ve mentioned DITA a few times on this blog, and my ‘DITA is not the answer’ post is still attracting attention. As I’ve said, I think the DITA standard is an excellent one for software documentation, and the DITA movement is slowly catching up to the hype. I’ve never given up on DITA and had always planned to use it as the basis for the next stage of our content development, and as it happens the switch to a full DITA/CMS based solution may be closer than I had anticipated.

We have been considering how best to publish up-to-date information in keeping with patches and minor releases, and whether we can tidy up and publish useful information from our internal Wikis and support system. The nature of the product we work with means there are a lot of different usage patterns, not all of which we would document as they fall outwith typical (common) usage.

So, how to publish formal product documentation, in line with three versions of the product, in PDF for ‘printed’ manuals, in JavaHelp to be added to our product, and in HTML to be published to a live website alongside other technical content (ideally maintained in the same system as the product documentation)? Storing the content as XML chunks also allows us to further re-use the content programmatically (which can be tied into our product in a smarter, dynamic fashion).

The obvious answer is single source using DITA to structure the content, storing the content as XML to give us the greatest potential avenues for re-use. Nothing particularly startling there I know, but it’s a switch from the direction we had been considering. So I’ve been catching up on what’s new in DITA-land and have to admit I’m a little disappointed.
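For anyone unfamiliar with what those “XML chunks” look like, the heart of a DITA setup is just topics plus a map that assembles them for a given deliverable. A simplified sketch (file names and titles are hypothetical, and I’ve omitted the DOCTYPE declarations a real toolchain would expect):

```xml
<!-- userguide.ditamap: the map pulls topics together for one deliverable.
     Different maps can re-use the same topics in different combinations. -->
<map>
  <title>Product User Guide</title>
  <topicref href="installing.dita"/>
  <topicref href="configuring.dita"/>
</map>

<!-- installing.dita: a task topic, one self-contained chunk of content -->
<task id="installing">
  <title>Installing the product</title>
  <taskbody>
    <steps>
      <step><cmd>Run the installer.</cmd></step>
      <step><cmd>Follow the on-screen prompts.</cmd></step>
    </steps>
  </taskbody>
</task>
```

The same map can then be fed through a publishing engine such as the DITA Open Toolkit to produce PDF, HTML, or JavaHelp output, which is exactly the multi-format story described above; the topics themselves never change per deliverable.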

We already have FrameMaker and WebWorks in-house, although both are a couple of versions old, and, thinking we might keep using those applications, I’ve been hunting about to see if I can find a solution that offers a coherent, end-to-end story. There are several CMS solutions which require an editor, editing solutions which require a CMS, and a few products that straddle both CMS and editing but then require publishing engines.

I understand that it would take collaboration between vendors to be able to offer a simple, seamless solution.

In addition to that, there does seem to be a tendency for any DITA-focused solution to remain the remit of the overly technical. Don’t get me wrong, I’m quite happy delving into XML code, hacking elements, or running command-line scripts to get things done. But surely I shouldn’t have to resort to such things? Now, I’m sure there are many vendors who will tell me that I don’t need to worry, but I’ve seen several demos and all of them miss a part of the FULL story.

Come on then vendors, stick your necks out. If you are a CMS provider, then recommend an editor. If you sell editing software then talk nice to a CMS vendor and start promoting each other (yeah Adobe, I’m looking at you!).

And yes, I’ll happily admit that maybe I’m just not looking closely enough. If only there was some sort of technical community website that I could join, perhaps with a group or two on DITA? That’d be great.

Ohhh wait. There is! (not the most subtle plug in the world, was it? I think the new Content Wrangler communities could be a big hit, do check them out).

Have I got the wrong end of the stick? Are there really gaps in the market in this area at present, or is it just my imagination? I guess I’ll be running a fair few evaluations over the coming weeks and, of course, I’ll post my thoughts and findings here.

Recently Read

Another week (and a bit) has passed. Time is tight for me at the moment, and I’m not posting here as often as I’d like so, for now, here’s a quick roundup of everything that’s zipped past my inbox in the past week:

Resources on presentation design

… advocates an assertion-evidence design, which serves presentations that have the purpose of informing and persuading audiences about technical content

Needless to say, with my first ever conference presentation looming, I’m fairly focussed on both topic relevant stuff and anything that will help make my presentation better.

An XML CMS is as simple as 1-2-3

Creating an XML-based Content Management System to single-source technical publications is as simple as 1 – 2 – 3. Rather than focusing on any single tool or solution (and thereby forcing users to change to match the tool), this article describes one possible three-step process for using XML to single source your content deliverables.

A rather simplistic view of things, but if you are a bit flummoxed by the raft of information available in this area, and you aren’t really sure where to start, have a quick look at this. Short, simple and easy as A-B-C.

Make your writing easy to scan

… the acid test is looking at your information as your reader/user would see it, and asking yourself “can I find what’s important without reading the whole page”?

You can format your text in a variety of ways, but it pays to take a step back and view the format of your content from the point of view of your readers.

Getting to the web-first mentality
Interesting to read that other professions are struggling to embrace the internet. Ultimately I think it’s getting to the point where we just need to take the plunge.

Start putting the web content management system into the workflow at the front end. This could be as simple as using Google Docs as a word processor instead of the bloatware that we know as MS Word.

Collaborate or fail
Titled “Building a Collaborative Team Environment” the opening couple of paragraphs kept me reading:

Technical communicators work hard to meet deadlines and value the standards inherent in the profession. At the same time, they value their personal creativity and the responsibility for developing a complete publication on their own. They tend to enjoy doing everything from writing, editing, and page layout to graphics, technical content, and more.

Working as part of a team to create a single set of deliverables, handing over responsibilities to fellow team members, and trusting the work produced by others does not come naturally.

It’s an excellent article, looking at a variety of ways in which we, as technical communicators, can adapt how we work. It will no doubt prompt some posts here as I digest it further.

And on that, somewhat culinary note, I’ll thank you once again for stopping by.

Explosions: keeping ahead of the blast

Is it just me, or are we seeing a notable growth in the tools and voices linked to our profession? Are we, the technical communicators (writers, authors, designers, whatever…) finally clued in to the internet and making the best use of the global space? Are the tools we use starting to touch other areas of our organisations, thus raising our profile, which raises the bar for the tools, which expands the reach, which raises the profile…

It’s just me, isn’t it?

I’ll happily admit that, a couple of years ago, I was growing apathetic about this industry. I dreamt of working in a zoo, tending to cute fluffy animals and having nary a worry in the world (and likely not enough money to pay the bills). Since starting a new job in January this year I’ve rediscovered my vigour and enthusiasm, and that seems to have been matched by the tool vendors. I would also try to lay claim to the growth in technical communications focussed blogs and websites, but that’s a little generous of me, I fear.

FrameMaker has launched a new version and a new suite, AuthorIT has launched a new version, MadCap blazed onto the scene (geddit?) and ruffled some feathers, and the XML-focussed single source arena seems far more active than it was. Now, I’m happy to admit that it may just be that I happen to be more aware of what is going on, but the coincidences are a little too many to ignore.*

Of course what this really means is that, at some point in the near(ish) future, people are going to start to select a tool. The XML guys are reasonably future-proofed in that respect for, as they all share a common file format/standard, the choice of tool isn’t the locked-in choice it once was. In a way, AuthorIT is in the same boat, as it can roundtrip through XML even though it stores its information natively in a database.

But our dear old FrameMaker, despite the new version and a seemingly re-invigorated development team, now sits as the odd one out. When I heard that Adobe had launched a Technical Communications Suite I presumed, instantly, that it would mean “instant single sourcing”. Possibly a simple CMS backend, from which you could pluck topics and edit them in FrameMaker or RoboHelp. At the very least a proper roundtrip between those two tools and, as we now know, we don’t get any of that. In fact Adobe have introduced even tighter coupling between their two applications and I’m still trying to figure out if that is a genius move, or a final throw of the dice.

Regardless of which tool you choose, or which blogs you read, this profession is growing. Links are being established with other groups, and as software continues to increase in complexity, the understanding of the need for good documentation continues to rise. I’m certainly spending less time explaining both what I do and why it is needed, and that can’t be anything but a good thing.

The ability to self-publish has created millions of “writers”, and an astonishing change to the way people view the written word, in a very short time. Some of those people write about technical issues, indulging themselves by sharing their hobbyist knowledge and, as such, they are both the subject matter experts and the technical writer of their niche.

As a profession, our ability to collate, filter, sort, and organise information, tailoring it for the right audience, providing that information at the right time, in the right place, will be the key differentiator. The playing field is levelling out, but we have some tricks up our sleeve yet.

* I’m deliberately ignoring the HATT arena, if you have any insights there I’d love to hear them.

X-Pubs Conference

Just about finished at this year’s conference and, as ever, I feel fired up to get back to the office and get things moving. Overall the main theme of the conference was preparation, preparation, preparation, mainly focussed on gathering requirements before kicking off a project. Nothing special there, but if you are considering moving towards a single source environment, there is a LOT of preparatory work you’ll need to consider.

I’ll amend this post tomorrow with some notes and thoughts from some of the sessions, but overall I’d highly recommend you visit X-Pubs next year. What follows is largely compiled from scribbled notes and random thoughts, but hopefully may be of interest. I’m not sure if copies of all the slides will be available on the X-Pubs website at any point, I certainly hope so.

Musings on X-Pubs

Next week I’m heading down to Reading to attend the X-Pubs conference, where I’m hoping to learn more about both how a solid XML publishing solution is implemented and, ultimately, why I need to bother. OK, I know why I need to bother so I guess I should re-phrase that to “how I justify all the effort involved”.

I’ve done my fair share of research in the past, and have previously implemented an AuthorIT based solution. That worked reasonably well for our needs back then, but I’m with a different company now and the needs have, naturally, changed. I know all about the benefits of single sourcing content (re-use), and why XML is the best choice for storing that content (re-factorability, if that’s a word!) but as yet I’ve still not seen the killer product/solution that makes things:

  1. easy to implement – I’ll accept some pain but most of the solutions I’ve seen have a fairly large knowledge mountain to climb
  2. easy to adapt – I have, currently, very specific needs, dictated both by the company and our product, and by the needs of my team.

Of course no such solution exists, or we’d all be using it, right?

And that’s why I’m attending the X-Pubs conference.

I do believe that single sourcing is THE way to go about things in technical communications, almost (implementation costs aside) without reservation for any size of team and almost without regard for what they are producing. Like here in the UK, which is trapped between imperial and metric measurements, the technical communications industry seems to be a mix of differing output requirements. Some audiences demand printed manuals, some want websites full of searchable content, whilst others are happy with PDFs or online help. And that’s before we start worrying about localisation (localization).

I’m not entirely sure what I’ll discover at the conference, and I’m doing my best to keep an open mind, but I truly believe that the power of single sourcing will remain the refuge of the few until someone comes up with a workable, affordable solution that everyone can use.

I’ll be posting from the conference (possibly even “live blogging”), so come back on the 4th of June to find out what’s going on.

Do you single source? Or have you considered it in the past but never pursued it, and if so, why not?