Summary: I'm looking for a way for reviewers to comment collaboratively as close to "inline" as possible on a large HTML project.
The problem in detail
I work on a team that documents a large product. The HTML documentation set has hundreds of individual pages (with a hierarchical table of contents in the sidebar, as you would expect); a PDF of the entire doc set runs to 5000+ pages.
When we document a new feature or make broad improvements like reorganizations, we publish an HTML build for review by developers, QA, support, the product manager, and other writers. We publish the entire doc set in this build, not just the changed pages, because sometimes context matters. To keep reviewers from having to hunt through the full set, we provide links to the specific topics that changed. We actually add these links to the doc plan, a specification we produced earlier that describes the intended changes -- this way people can, if they want, see the background of why we made a particular change. The doc plan is a page on the internal wiki.
Right now, when we send out a review request, we point people to that wiki page, and ask people to post their feedback as comments. This allows everybody to see each other's feedback, which means (a) less repetition compared to individual responses and (b) earlier discovery of disagreements among reviewers. But long comment chains can be hard to navigate too, even with threading. And people still have to do some extra work to write those comments, because they have to tell us what they're reacting to. Typical comments begin with something like "in 'Installing Plugins', the description in the third paragraph isn't quite right because...".
This approach works better for us than either email responses or individually commenting on PDFs of just the selected topics. (We've done both of those.) Is there a way to make it even easier by allowing people to attach comments right there in the HTML, kind of like commenting on Google Docs, but without having to import our large doc set into some other tool just for this purpose? Or is the current approach the best we can do without a lot of extra work?
We want to make it easy for people to comment and to see others' comments. The bar to beat is comments on a wiki page. We aren't interested in importing a large HTML doc set into some other tool (that people would have to learn). I'm wondering if there's, say, some JavaScript package out there already that we can inject into these builds, or some other way to achieve this.
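To make the idea concrete, here is a minimal sketch of the kind of injected commenting layer I'm imagining. It assumes a hypothetical /comments endpoint on our internal server that stores and returns comments as JSON keyed by page and paragraph; the endpoint, the double-click gesture, and the paragraph selector are all placeholders, not anything we have built.

```javascript
// Hypothetical inline-commenting sketch, injected into each built page.
// Assumes a /comments endpoint on the internal server that stores and
// returns JSON comments keyed by page path and paragraph index.
(function () {
  const page = window.location.pathname;

  // Load existing comments for this page and show a count next to each paragraph.
  fetch('/comments?page=' + encodeURIComponent(page))
    .then((res) => res.json())
    .then((comments) => {
      document.querySelectorAll('p').forEach((p, index) => {
        const count = comments.filter((c) => c.paragraph === index).length;
        if (count > 0) {
          const badge = document.createElement('span');
          badge.textContent = ' [' + count + ' comment(s)]';
          p.appendChild(badge);
        }
      });
    });

  // Double-clicking a paragraph prompts for a comment and posts it to the server.
  document.querySelectorAll('p').forEach((p, index) => {
    p.addEventListener('dblclick', () => {
      const text = window.prompt('Comment on this paragraph:');
      if (text) {
        fetch('/comments', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ page: page, paragraph: index, text: text }),
        });
      }
    });
  });
})();
```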
Tools in use
We use source control (git), with feature work being done on branches. The review builds are produced from those branches and are persistent. (Most of our reviewers are not comfortable reviewing the HTML source, or I would sidestep all of this by having them review the raw source on the branch.)
We use MadCap Flare to create and build the docs. Flare's source format is an extended HTML: the HTML itself is valid, plus some tool-specific tags that are resolved at build time. The output is conventional HTML.
We use Jenkins to manage the build process. Jenkins currently calls a script that does some housekeeping and invokes madbuild.exe (Flare's build engine). That script publishes the HTML on an internal server. In principle, therefore, we could modify the build script to inject something extra into the output just for these branch builds. We own the server, so we can add things to it if needed (like a way to store comments).
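To sketch what that injection step might look like -- assuming the Flare output lands in a single directory passed on the command line, and that a commenting script is served from an illustrative /review/comments.js URL -- a post-build step along these lines could add the script tag to every page of a branch build:

```javascript
// Post-build injection sketch (hypothetical): run by the Jenkins build script
// after madbuild.exe finishes, only for branch (review) builds.
// Inserts a <script> tag for the commenting widget before </body> in every page.
const fs = require('fs');
const path = require('path');

const OUTPUT_DIR = process.argv[2] || 'Output';                    // Flare output folder (illustrative)
const WIDGET_TAG = '<script src="/review/comments.js"></script>';  // illustrative URL

function injectIntoDir(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      injectIntoDir(full);
    } else if (entry.name.endsWith('.htm') || entry.name.endsWith('.html')) {
      const html = fs.readFileSync(full, 'utf8');
      // Skip pages that already have the tag so the step is safe to re-run.
      if (html.includes('</body>') && !html.includes(WIDGET_TAG)) {
        fs.writeFileSync(full, html.replace('</body>', WIDGET_TAG + '\n</body>'));
      }
    }
  }
}

injectIntoDir(OUTPUT_DIR);
```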
Answer
On one project I worked on, we did reviews via a work-in-progress server, which was an HTML version of the current state of the docs. We created a modified build script for this server that included the following:
- A status indicator for each topic (ready to review, draft, final, etc.)
- An ID for each topic.
- Paragraph numbers in each topic.
- An instruction to raise any issues found, in review or otherwise, in the issue tracking system using the topic ID and paragraph number.
This was relatively low tech. Commenting did not happen in the docs interface itself, but it seemed to work well. Reviewers had an easy way to indicate what their review comments applied to. I think they tended to review with a text editor window open, noting comments by paragraph number, and then pasted the result into the issue tracker. These were all operations they were well used to doing, so there was no learning curve and no unfamiliar tools to use.
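For illustration only (this is not the script we actually used), visible paragraph numbers could be added either at build time or, as sketched here, by a small script injected into each topic page:

```javascript
// Paragraph-numbering sketch (illustrative, not the original implementation):
// prefixes every paragraph in the topic body with a visible number, so reviewers
// can cite "topic ID, paragraph N" in the issue tracker.
(function () {
  // Assumes the topic content lives in a <main> element; falls back to <body>.
  const container = document.querySelector('main') || document.body;
  container.querySelectorAll('p').forEach((p, index) => {
    const label = document.createElement('span');
    label.textContent = '[' + (index + 1) + '] ';
    label.style.color = '#888';
    p.insertBefore(label, p.firstChild);
  });
})();
```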
The work-in-progress server was live the whole time the docs were being developed, with appropriate status notifications on each topic. A number of people in the organization found it useful to have this information available during development, and we occasionally got feedback outside the formal review process.