DITA is not the answer

Single sourcing is good, I’m sure most of us can agree on that, but I’ve recently been wondering whether DITA is quite good enough.

The thing is, I’ve been looking at DITA as a solution for our single sourcing needs for a while now. I’ve attended conferences, read whitepapers, listened to vendors and everything else that goes with it, and I’ve got a pretty good handle on things. If applied correctly, the benefits you can gain are very large (although the same can be said of any other single source project), yet what seems to be consistently missing from all of these wonderfully theoretical discussions is the cost and impact of getting such a solution “applied correctly”.

A key part of planning to move to single source, of which DITA is only a part, is understanding the business needs and technological requirements of all of the content producers in your organisation. Traditionally that means Technical Communications, Training, Pre-Sales and Marketing, with perhaps other flavours to be considered depending on how your company is structured.

However, if those parts of your organisation aren’t yet ready to move, then the business case changes. At present this is the situation I’m in, so I find myself looking for a short-term (2-3 year) solution that won’t lock us in to a proprietary format and that can give us benefits as soon as possible.

Re-use is our main reason for moving to single source. We don’t (yet) localise, and there is only one other team that has any interest in re-using our content (and even then, they are likely to use it as a source of verification, not a source of content). With that in mind, and with the proviso that I’ve used it previously, we are looking at AuthorIT.

Yes, it does mean we forgo a lot of the power of DITA, but as it will allow us to tag topics accordingly (in keeping with the DITA model) and it has an XML DITA output option, it shouldn’t lock us in. I’m willing to put up with a little pain further down the road to get the benefits now.

I’m still not entirely sure what else we are missing. We publish PDFs, HTML and JavaHelp, all of which AuthorIT handles, and as yet we don’t have a need to dynamically publish information based on metadata. If that changes in the near future then we’ll handle it appropriately, but it isn’t on anyone’s radar.

I am concerned about the versioning capabilities of AuthorIT as we maintain the last 3 versions of all our publications, but I know there are ways to achieve this in AuthorIT. I doubt it will work as well as our current system (FrameMaker files in an SVN repository) but, as is always the case, I do expect we may need to make some compromises to get ourselves moving towards single sourcing our publications. This is our main pain point and so becomes the focus for any possible solution.

DITA remains the long-term goal but, and I’ve said this before, until there is an all-in-one solution that is easy to roll out it remains marginalised as a viable option. Most of us need to produce some form of business case to justify such purchases and, at present, DITA is still too costly an option. I’m always happy to learn new things, and whilst I would love to be able to invest time and resource into creating and maintaining a DITA based solution, I just can’t justify it.

All of my research suggests that, rather than being a simple installation and conversion process, creating a DITA solution requires a lot of technical know-how and a not insubstantial amount of time and resource. We can handle the first; the latter is (I believe) not yet at a level which makes it cost-effective.

Ultimately, for the moment, DITA costs too much.

Do you agree? Can you prove me wrong? I’d love to hear your thoughts on this, particularly if you have implemented DITA already. I’m keen to hear just how much more productive a DITA solution can be if you aren’t involved in localisation. Have you recouped your costs yet?

Perhaps DITA is only really applicable for those with large budgets and the chance to invest heavily upfront. Alas I’m not in such a position. For the moment.

Recently Read

With the TICAD conference last week, a couple of days in my sick bed, and the imminent product release I’m working towards, I’ve not had a lot of time to post here. However, the RSS feeds keep trickling in, so here are a few items that caught my eye over the past couple of weeks.

What Beautiful HTML Code Looks Like
I’m a terrible coder. Which is just as well, because I’m not a developer, but as I do dabble in HTML and CSS quite frequently (hey, and PHP too), this is a good reminder for me to develop my own best practices.

Code is Tabbed into Sections: If each section of code is tabbed in once, the structure of the code is much more understandable. Code that is all left-justified is horrific to read and understand.

Includes a neat infographic, downloadable as PDF, which is now pinned beside my desk.

Procedures: The Sacred Cow Blocking the Road
An update on a (yipes) 10 year old article. I don’t think I read it when it was first published, but I have read it since. Well worth another visit though.

“It takes a surprisingly short amount of time for a user to feel unstuck. When I was a usability consultant, I used to advise clients to put the critical information in the first three words of a sentence.”

IA Deliverables
From content surveys to wireframes, personas and use cases, a brief overview of each is followed by a sample template. Not only a useful resource but a good overview of the typical process an Information Architect will undertake, a lot of which can be adapted to more traditional product documentation.

Collaboration is not a dirty word
Collaboration on content (not documents, even if that is where the content ends up) was a key part of my presentation. It’s good to see the switch from document-centric to info-centric taking place.

I love things being this easy. I love getting (almost) zero emails with attachments. I love not having a hard drive full of Word documents.

DITA Troubleshooting specialization

The Troubleshooting specialization creates a new topic type that is well-suited for problem-solution information.

7 Ways to keep the post-conference buzz
Not long back from a conference myself, I have already done a few of these things (item 3 in particular) but some good ideas here.

Wikis for Documentation?
Steve Manning isn’t sure about using Wikis for Documentation but does think they could be a big hit in another, related area:

Most writers have to guess about their users. Few writers get the opportunity to speak directly with users. Few get any sort of feedback at all. They are left to do their best. How useful would it be to be able to post your document on a Wiki and have users be able to comment topic-by-topic? To see the questions they ask?

I totally agree. All I need to do is figure out how this works within a single source environment, and tackle a few issues around governance and change management and it could be an excellent working model.

And finally…
I’ll be updating the TechComms RSS feeds download soon, so if you think you should be on the list (or even if you aren’t sure whether you are or not) then let me know. It includes all kinds of stuff which is loosely related to Technical Communications, and I’m always on the lookout for more sources of inspiration. Leave a comment if you think of anything.

Thoughts from TICAD 2007

Pre-Conference Dinner
The first day of a conference is always a little awkward; introducing yourself to complete strangers is fraught with danger. So a ‘mingle’ activity is a nice idea, particularly as the TICAD conference makes some play of the networking opportunities, and an informal dinner beforehand certainly made the following day a little easier. As it turns out I bumped into someone who worked at a sister company of my previous employers (it’s a whole complicated one company, then two companies, then one company thing), and we traded some names and stories.

Dinner was good, with many laughs and thought-provoking conversation, the kind of thing you get when you dine with a bunch of smart, friendly people. One topic which cropped up, and naturally I can’t recall why (did I mention the wine?), was Chess Boxing. Yes, that’s right, Chess Boxing. It was new to me as well.

The conference itself is aimed at TechPubs managers and was celebrating its 10th year. Organised by ITR, it has a focus on translation issues but is largely a TechComms focussed day. I’d heard of TICAD before but this was my first time both attending, and speaking at, the conference. The day was split into four sessions, the third comprising two breakout sessions, the others more standard presentations. I took notes for all of the presentations but skipped the breakout sessions to go over my own presentation one more time.

Convergence in technical communications
The opening session was kicked off by Mark Wheeler of Adobe and, despite being pretty much a product pitch, it did outline Adobe’s thoughts concerning the convergence of various areas, with internal documentation, public documentation, Help systems, Knowledge Bases, training material and demos all becoming more and more closely linked. All share similar traits (they all rely on high quality content, for example), and organisations are beginning to realise the benefits of sharing information across these areas.

Part of the presentation did flummox me somewhat, and whilst it may have been a cool demo feature I do question the reality of its usage. The idea presented was that by using embedded content within a document or help system, you could launch a video or “better still” initiate a text chat session or VOIP call to a support operative to help you with your current issue. Now, my belief is that, in that scenario, people want to get OUT of the help a.s.a.p. Why on earth would I want to sit through a video, or talk to someone and have to explain my issue, when all I want to do is get on with my work?

Naturally the focus was on the new Technical Communication Suite and overall it does look like it adds some value and will be of huge benefit to many technical communications teams. But then demos ALWAYS look good, don’t they…

Adapting structured documentation and DITA
When I saw this presentation listed in the agenda I marked it as one to attend. We are currently heading down the DITA path ourselves and Thomas promised to share some of the issues and pitfalls he and his team had come across. His presentation was excellent and hugely informative. A quietly spoken American, who was at our table for dinner, he covered everything I had hoped for and more.

He covered the guidelines they had to put in place to help the writers cope with the move to structured authoring, including their 5 Glorious Principles (and yes, I will be ‘borrowing’ this idea), namely that when writing topics:

  1. Standalone chunking – create discrete chunks that contain only information about the topic/type.
  2. Labelling – titles are explicit and describe the topic (this also stops conceptual phrases like “this section contains” and so on).
  3. Relevance – the content matches the topic.
  4. Consistency – topics are written in the same way.
  5. Reuse – topics are written once and can be used many times.
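To make principles 1 and 2 a little more concrete, a topic written to those guidelines might look something like this (a hypothetical DITA task topic, with the id, title and steps invented purely for illustration):

```xml
<!-- A standalone chunk: one discrete task, with an explicit title
     that describes the topic rather than "this section contains..." -->
<task id="installing_the_printer_driver">
  <title>Installing the printer driver</title>
  <taskbody>
    <steps>
      <step><cmd>Download the driver package from the support site.</cmd></step>
      <step><cmd>Run the installer and follow the on-screen prompts.</cmd></step>
    </steps>
  </taskbody>
</task>
```

Because the topic assumes nothing about what surrounds it, it can be dropped into any manual, help system or training deck that needs it, which is principle 5 in action.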

Working in a large organisation, they found they had to hire a dedicated Documentation Product Manager to coordinate and liaise with Technical Publications, Training, Marketing and all other information creators. They also hired a dedicated architect to manage their DTD.

Outlining the drivers for their change, with localisation being the biggest (numerical) business reason, he talked through the planning stages, and admitted that they decided to stick to topic-level reuse rather than conref-level reuse (in theory you can reuse any single element, so a paragraph or list can be used in multiple topics), although that is something they are currently addressing. As a path to ease the pain of migration it is likely we will do the same, so it’s good to hear others taking the same route.
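For anyone unfamiliar with the terminology, conref-level reuse works roughly like this (filenames, ids and wording all invented for illustration):

```xml
<!-- warnings.dita: a topic holding the shared content -->
<topic id="shared_warnings">
  <title>Shared warnings</title>
  <body>
    <p id="power_warning">Disconnect the power supply before opening the case.</p>
  </body>
</topic>

<!-- Elsewhere, any topic can pull in that single paragraph by reference,
     using the file#topicid/elementid addressing scheme -->
<p conref="warnings.dita#shared_warnings/power_warning"/>
```

Edit the paragraph once in `warnings.dita` and every topic that conrefs it picks up the change, which is exactly why it is tempting and exactly why it needs careful governance.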

Technical English made simple
I wasn’t too sure what to expect from this presentation, but was pleasantly surprised. Admittedly, as it was focussed more on maintenance-style procedures for hardware, the suggestions didn’t always apply to a more software oriented team writing (or moving towards writing) in a task based style, but there were still many valid points to take home.

Maria expanded on the commonly held truisms, such as writing with an active rather than passive voice, with several examples, and the basic premise that most technical documentation is easier to read, less ambiguous, and easier to translate if you simply consider each sentence and make sure you are assigning the task to the reader.

At present we don’t translate our documentation, but I am more than aware that someday, soon, we will be asked to do so. Some of the suggestions made by Maria will form part of new guidelines as we adapt our writing style to be more translation friendly.

In-country translations
Helen Eckersley, of ITR, gave a presentation which I didn’t think I’d take that much from. Focussing on getting the most from the people who review translated material, it largely followed general practice for making the most of any kind of review, technical, linguistic or otherwise.

However, as is the way of things, it’s always good to get a reminder, and as with Maria’s presentation I did glean some information that, if put in place, should make translation of our documentation a whole lot easier.

Helen touched on linguistic assets, containing glossaries of approved terms (cross-language), translation memories, style rules for acronyms, product names and so on. All things we can consider now and start to build ourselves.

Using Wikis for Collaborative Authoring
Some Scottish bloke stood up and waffled on about Wikis, elicited the odd smile and largely left everyone bemused.

Vision of the future
The final speaker was Bernard Aschwanden, whom I saw present at X-Pubs earlier this year. He is an animated, charismatic and passionate speaker and was given somewhat of a free rein to pull together his Vision of the Future.

He opened with a video, one which I think I’ve linked to before and which still bears repeat viewing.

Frankly the video is enough to get the synapses firing but, building on that, Bernard took us back through the history of publishing, from the first clay tablets, past the Gutenberg Bible, all the way to Playboy. He tracked back through the advances in the past 100 years of technology, and then headed into the future.

Breaking things down into two sections, the first of which dealt with the coming 5-10 years, Bernard offered his take on where the traditional publishing processes would take us. The basic premise is the same, regardless of the timescale, but the way in which information is handled and managed will change. For example, at present we spend a lot of time fudging with DTP packages to get information into a form that is legible for our readers; in the next 5-10 years that will no longer be an issue (it’s already not an issue for some people publishing from a CMS system, where the template is applied and any layout errors are automatically dealt with by the software).

He then tackled 25-100 years and whilst at first some of his premises seemed laughable – pulling the uploading of information from the movie The Matrix for example – he quickly reminded us of the change in technology in the last 100 years.

However, one thing remains true and becomes crucial in the future. All of these sources of knowledge rely on people to check and validate the information on which they are built. Those people are the technical authors of today and in 10, 25 or 100 years from now, we will be in a far more powerful position than we are today. Bear that in mind the next time you ask for a raise!

All in all a fascinating presentation which I’m not doing justice. If you ever get the chance to see Bernard speak, do so. You can always tell when people are passionate about something, and he also has the knowledge to back that up.

My final thoughts
Sitting, as I am, on the train on the way home, it’s easy to pontificate about the things I’ve learned. Everyone returns to work after such an event with a little extra enthusiasm and grand plans for change. However, this time I genuinely feel that there are things I will take from this conference that I WILL put into action; some of them require little extra work but can have huge benefits, others will need more contemplation but are equally valid.

The conference was very slick and well organised, credit to Tanya, Sally and all the other guys and gals from ITR, they certainly made it a very relaxing experience for me, very much appreciated as it was my first time as a speaker.

If you are a team lead, a manager, or have any sort of big picture thinking about Technical Communications then I highly recommend you head along to TICAD next year, you’ll find something of interest without doubt.

Hopefully I’ll see you there.

Recently Read

I’m not sure if this is lazy blogging, or a useful feature but here, again, are a few of the articles and blog posts that caught my eye over the past week.

  • Audrey Carr on Design Briefs ~ “After a series of small projects involving undefined requirements, fuzzy objectives, and an apparent lack of truly insightful customer insights, I’ve spent a good chunk of this week thinking about the strategist’s favourite tool for communicating with creative and account teams: the brief.”
  • Inline Linking is bad for usability – whilst it’s aimed at those people concerned with optimising their content for search engines, the examples are interesting from a writing point of view as well, swap out “usability” for “readability” perhaps.
  • Linking DITA Topics using Relationship Tables – if you are investigating DITA, have a skim through this; yet another way to increase the power of DITA.
  • A mile wide and 30 seconds deep ~ Mike Hughes on how to focus your help content, “…focus the Help on what it does well, that is, provide a wide variety of 30-second informational solutions to get most users back on track and in task”
  • Help Authoring Tool matrix – has been running for a few years now and is a bang up-to-date, excellent resource.
  • The Rockley Blog – Ann Rockley and Steve Manning, of the Rockley Group and that book (Managing Enterprise Content: A Unified Content Strategy), have a blog. I had dinner with Steve, and others, after the X-Pubs conference this year and this is definitely a blog to watch.
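On the relationship tables item above: the rough idea is that related links live in the map rather than being hard-coded in each topic, so a row in the table generates reciprocal links between the topics in its cells at publish time. A minimal sketch, with the filenames invented for illustration:

```xml
<map>
  <topicref href="installing.dita"/>
  <topicref href="troubleshooting.dita"/>
  <!-- Topics in the same row are cross-linked when the map is published,
       so the topics themselves stay free of hard-coded links -->
  <reltable>
    <relrow>
      <relcell><topicref href="installing.dita"/></relcell>
      <relcell><topicref href="troubleshooting.dita"/></relcell>
    </relrow>
  </reltable>
</map>
```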

OK, that’ll do for now.

Hopefully, by this time next week, I’ll have finished the new design for this site and will start posting a little more regularly. There are a few things I would like to explore, and hopefully I can get some feedback from you, dearest reader. Make sure you subscribe to the RSS feed to catch the next post.


Still in Reading, travelling home tomorrow. Apologies to those who offered to meet up; I’ve barely left the hotel, but have had two cracking nights with some very smart people, some of whom have told me some of the dirtiest jokes I’ve heard for a while.

And no, of course I can’t remember them… I’m not a tee-totaller you know…

Anyway, I’ll be writing my thoughts up on my other blog if you are interested.

Ohh and I’ll hopefully get my gordonmclean.co.uk domain sorted out… and a myriad of other things that haven’t budged an inch… mind you I’ve got a few hours on the train tomorrow so I might at the very least get my notes typed up.

I can tell you are all just FASCINATED, let’s move on, shall we?

To everyone who responded to my request and left a comment, many thanks. It’s been food for thought, as well as offering up some new blogs, always a good thing. More on that later of course, you know how I love to over-analyse…

Ohhh, the late nights, booze and 5-star food is kicking in, time for bed methinks. Come back soon and I’ll regale you with tales of a hugely uneventful train journey, and the joys of conference freebies.

Or maybe not.

* Bit of an in-joke this one, a mix of DITA and the ‘chicken chicken’ presentation.

X-Pubs Conference

Just about finished at this year’s conference and, as ever, I feel fired up to get back to the office and get things moving. Overall the main theme of the conference was preparation, preparation, preparation, mainly focussed around gathering requirements before kicking off a project. Nothing special there, but if you are considering moving towards a single source environment, there is a LOT of preparatory work you’ll need to consider.

I’ll amend this post tomorrow with some notes and thoughts from some of the sessions, but overall I’d highly recommend you visit X-Pubs next year. What follows is largely compiled from scribbled notes and random thoughts, but hopefully may be of interest. I’m not sure if copies of all the slides will be available on the X-Pubs website at any point, I certainly hope so.
Read More

And so, it begins

Please excuse the dust, and mind your feet, I’m still tidying up. I was always told you should finish a website before launching it but, in the days of instant gratification that advice seems somewhat stilted and old-fashioned. So here it is, yes, it’s another blog.

I’ve been blogging for many years now but this is my first attempt at writing a professional blog. To make it a little bit easier on myself I’ve chosen an area in which I’m fairly well-versed – Technical Communications. I have been a Technical Writer/Author/Communicator (I’ll cover that issue another day) for over 10 years and have worked in a variety of different environments, for a variety of different companies, with different cultures and different technologies. I’ve got various articles and whitepapers written up, but largely un-published, and these days if you don’t have a blog… then you are probably out doing something more interesting!

I’ll be covering every facet of Technical Communications that I’ve stumbled across, although I’ll be steering away from discussions on grammar, spelling and English usage (others already cover that in far better style than I could).

So what will I be covering? Everything from planning and designing documents, user analysis, manipulating graphics, DITA, working on the web, document design, AuthorIT, content mapping, agile development, review processes, using the documentation, web design, writing, modular documentation, FrameMaker, editing, CSS, indexing, task analysis, single source, content management, minimalism and much more that I’ve not thought of yet.

I’m not entirely sure where this blog will take me, where it’ll end up, but it’s the first step towards a bigger picture and the fruition of many years of trying to have a “professional” place on the web. Comments and discussions are encouraged, and I most certainly do not promise to always be correct. Like most people I’m still learning and trying to keep up as the scope of my profession expands and contracts, and like most I’m sure the internet will continue to play a large part in that process.

Now, if you’ll excuse me, I’ve just got another lick of paint to apply…

Getting there

Limping towards the finish line that will be the ‘members only’ launch of the Scottish Blogs website. Just need to get ONE DAMN FORM working, import some data and ‘stick a fork in me’ I’m done. I’ll let the current members play with it for a bit (which means I’ll be blocking it off temporarily with a password… maybe.. if I can be bothered) and allow them to edit their details and then the site goes live.

There are still quite a few features I want to add but I can add them later without a hit on the current users and I really need to get this off my ‘list’.

In other news (ohh hang on, can’t mention that) errr… ohh yes.


To be honest I’ve not really been happy with this site for a while, and I think the last re-design was just an effort to try and divert my attention from that fact. The next one is likely to be more radical. No timelines. No collaboration. It’ll just happen one day.

In real life: Ehhhh well… after the weekend I’m actually enjoying doing bugger all, to be honest. With my head full of DTDs, EDDs, DITA, content audits, information maps and the like, it’s nice to kill a few braincells by surfing aimlessly for a while.