X-Pubs Conference

Just about finished at this year's conference and, as ever, I feel fired up to get back to the office and get things moving. The main theme of the conference was preparation, preparation, preparation, focused mainly on gathering requirements before kicking off a project. Nothing special there, but if you are considering moving towards a single source environment, there is a LOT of preparatory work you'll need to consider.

I'll amend this post tomorrow with some notes and thoughts from some of the sessions, but overall I'd highly recommend you visit X-Pubs next year. What follows is largely compiled from scribbled notes and random thoughts, but hopefully it will be of interest. I'm not sure if copies of all the slides will be available on the X-Pubs website at any point; I certainly hope so.

4/6/7 – Monday

KEYNOTE: DITA in Context
Michael Priestley
Discussed DITA as it currently stands, outlining the core principles behind the design of the standard: usability (for the people creating content to the standard), reusability, specialization and collaboration. Talking through various concepts, ditamaps and so on, he covered the use of DITA in the real world and looked forward to how it is going to evolve in the future. It was an interesting look at how a standards committee comes about, and I'd suggest it was a good example of 'standards committee best practice'.
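For anyone new to DITA, the relationship between topics and ditamaps is the heart of the standard: self-contained topics are written once, and a ditamap assembles them into a deliverable. A minimal sketch, with file names and titles invented purely for illustration:

```xml
<!-- guide.ditamap: assembles stand-alone topics into one deliverable.
     All file names and the title are hypothetical. -->
<map>
  <title>Widget User Guide</title>
  <topicref href="introduction.dita"/>
  <topicref href="installing.dita">
    <!-- Nested topicrefs define the hierarchy of the output -->
    <topicref href="installing-linux.dita"/>
    <topicref href="installing-windows.dita"/>
  </topicref>
</map>
```

The same topics can be referenced from any number of maps, which is where the 'reusability' principle starts to pay off.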

Adobe Sneak Peeks
Karl Matthews
A look at the Adobe product suite, including some new features and a fully 'Adobe' integrated system. Nothing really new… Flex looks interesting, as does Adobe 3D, but for the most part it wasn't anything you can't already glean from the Adobe website.

XML and Content Management: Simple Ideas and Practical Solutions
Harvey Greenberg
An interesting presentation up to a point, but ultimately it repeated some of the points already heard elsewhere. If you were familiar with the basic arguments behind single source solutions then this was interesting but not hugely informative. The requirement gathering stage was stressed, and some of the example solutions presented gave a small insight into the decision making process. After all, DITA will not work for everyone.

Successful System Integration: How XML Affected Bombardier
John Straw
A simple, straightforward presentation of how one company tackled different internal and external issues by implementing a custom XML based single source solution. Some good advice was presented, particularly focused on the planning stages, where the basic expectations need to be set. Many technical publications teams will not be able to have a dedicated team working on a custom solution, so part of the business plan has to include an agreement to honour current commitments. Seems obvious, I know, but best to get it in early and plan for it. Ultimately, good requirements gathering and communication with all interested parties were the key lessons to be taken from their experience.

Procedural Change = Cultural Change
Emma Hamer
A non-technical presentation, but one that neatly encapsulated a lot of the 'soft' issues facing a change in working methodologies across an organisation, and some tips on countering them. Straight talking advice.
"Quantum Communications" – dynamically building content: docs don't exist at all until you choose to build them. Example: personalised content on a website, with different views for different people based on a profile.

Workshop: The Road to CM & XML – Part 1
Steve Manning
Outlined the entire scope of what is required when building the business case for, and the initial stages of, a single source solution. Then again, as co-author of "Managing Enterprise Solutions" he should know his stuff. Covered the basic process they use when engaging customers, and some of the thinking points and common problem areas they encounter. It was a fairly high-level introduction; tomorrow we'll be covering the process in more detail.

5/6/7 – Tuesday

Michael Priestley sneaked in and kicked off with a short "5 mins" on the interoperability initiative at OASIS.
The basic premise is to investigate re-use across differing standards, so that regardless of where a topic was created, it can be reused in systems using other XML standards. For example, I could re-use a DITA topic in a DocBook chapter.
The scope of the XML Document Standards Interoperability Committee (I THINK that's what it was called, but it's still to be agreed) is to possibly drive changes back to the standards, and to think about the semantics that will need to be in place at a standards level, as well as the possibility of some constraints to maintain interoperability.

KEYNOTE: Lessons learned on the cutting edge: a large scale CMS and its ROI
Steve Manning
Drove home the planning and requirements stage. The need to understand the processes and the need for change before deciding on ANY aspect of a unified content strategy is key. DITA may not be the standard for us to use, nor may DocBook, and perhaps what we already have is almost good enough. Decide what is fit for purpose and make sure you have gathered requirements from all areas that may be affected, remembering that additional opportunities for re-use will likely surface as you start talking about requirements with other people.

DITA and structured authoring
Bernard Aschwanden
Considerations
– The solution includes CMS (database), tools (FM), and processes.
– Make sure the CMS has a way out
– Are the information models re-usable? Is there any requirement for interoperability between standards? What is there in the training arena that we need to consider?

Tooling
– How much value in keeping FM as the tool? Will the tool allow content to be shared out to, and back from InDesign and Word?
– XMetaL? Which CMS? SVN? IBM use SourceSafe and their ROI shows that… but FINDING info will be the issue
– CMS – keep it generic, not focused – you can't use something if you can't find it (metadata handling)

X-Pubs Question Time
Submitted question: Do the panel have any experience with Agile development companies and do they agree that DITA offers an easy way to map authoring to software development?
Basically, yes. The way DITA topics are structured means that chunks are small, and pubs teams can be very reactive to change. The structure of DITA makes it fairly easy to ‘guide’ authors through the creation process alongside development.

Other questions focused on S1000D and aerospace specific issues. A bit of a hit or miss session, but useful. No fights though, boring! (I later suggested that someone should've been given the task of being devil's advocate; as long as this is pointed out at the end of the session it can work quite well, but for that to work you need someone 'new' that isn't that well known.)

Brave New World: how one writing team moved to DITA-based authoring
Helen O’Shea
Part of the IBM team that were the original DITA guinea pigs.
There are many considerations, most non-technical, when moving to a topic based authoring structure. The fact that authors will now need to think in terms of self-contained topics, rather than entire chapters or books, is a mind shift which needs to be managed (training will help).
Helen pointed out that one easy way to help authors overcome the loss of ownership was to provide scheduled builds, to allow them to see where their content is going.
When writing topics, the quality of the content needs to be much higher than that of books. As content can be re-used, guidelines and writing styles are key.
IBM assigned a dedicated Information Architect to the project, she handled the mapping and planning of the content structure, and had the ‘big picture’ of the product and the documentation.
Weekly ‘surgeries’ were held to make sure any issues or problems were ironed out – these would just be part of our weekly meeting.
Used SourceSafe for basic version control but are moving to a CMS because it offers searching for content.

Product Documentation – Revealing the secrets of single source
Agneta Weisberg
Talked about her company's diverse authoring teams and, at a high level, outlined the decisions and justifications for moving to single source. The final ROI figures were so startlingly good that she actually adjusted them for fear that no-one would believe them: "In no other industry would you get the chance to do SO much more with the same cost".

On the way out Noz (a Mekon consultant) noted that creating a UCS is a unique issue that no other department usually faces. Typically the reaction to 'XML Publishing' or 'single sourcing documentation' makes everyone immediately think 'tech pubs, not us'. However the real advantages to a business, across the enterprise, need to be considered and 'sold'. No other department has had anything like this to contend with, and the paradigm shift it entails is one reason it scares some businesses off, or causes them to departmentalise it (thus losing a key part of the ROI!)

Workshop: The Road to CM & XML – Part 2
Steve Manning
Content analysis and planning stages were covered. The key aim is to make sure the right content is given to the right user, at the right time, in the right context. Analysing the content first at a structural level (the TOC, for example) allows you to understand the structure and the common items. In-depth analysis then looks at the content to see if it is actually re-usable, and looks at the style as well (it may be possible to rewrite in a different style to aid re-use). The example being: the TOC for two manuals says they both have a Preface, so there is a high-level, structural, re-use opportunity. In-depth analysis may then show that yes, both Prefaces are essentially the same (at least they are not different for a valid reason, just due to differing authors), or that no, both Prefaces are different for valid reasons (one may include legal notices because that product line is being sold into a chemical company).

Remember that conditional processing IS a form of re-use. Chunks may be the level we aim for but some words may need to be ‘re-used’ based on certain conditions.
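As an illustration of how conditions can live inside a chunk, DITA lets you mark individual phrases with attributes such as audience or product, then include or exclude them at build time. A hedged sketch; the attribute values and wording are invented:

```xml
<!-- In a topic: the audience attribute marks a phrase for conditional
     output. The value "administrator" is a hypothetical example. -->
<p>Press the power button to start the device.
   <ph audience="administrator">Note that resetting the device requires
   elevated privileges.</ph></p>
```

At publish time a filter file decides whether `audience="administrator"` content is included or excluded, so a single source paragraph yields different words for different readers.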

Yes, content analysis is best conducted on a big table with printouts of the documents.

Levels of granularity need to be considered; things like system data (dates etc.) can be completely granular because then the 'machine' can worry about them. Anything that can be calculated should be handled by the machines.
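One way this machine-level granularity shows up in DITA is the conref mechanism: a value such as a release date lives in exactly one 'warehouse' topic, and every other topic pulls it in by reference, so updating it is a single edit. A sketch, with invented ids, file name and date:

```xml
<!-- shared.dita: a hypothetical warehouse topic holding values
     that are maintained in one place only. -->
<topic id="shared">
  <title>Shared values</title>
  <body>
    <p><ph id="release-date">5 June 2007</ph></p>
  </body>
</topic>
```

Any other topic can then pull the value in with `<ph conref="shared.dita#shared/release-date"/>`; the build resolves the reference, and authors never re-type the date.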

Two things to consider when considering structured authoring:
1. It SHOULD help control the input, helping the authoring process
2. It SHOULD help you manipulate content for output, use the machines to do the grunt work stuff.

Thoughts

To summarise, the main focus reflected across most of the speakers was the necessity of properly gathering requirements before thinking about how to implement an XML based single source solution. As the solution encompasses not only tools, but processes, standards and people, there is much to consider. I'm hoping to write up a whitepaper or article on my experience as I move through this process, but if you have any specific questions, let me know.
