Measuring and maintaining documentation quality

How do you know your documentation is good? Is it answering the most common questions your users have? Is it easy to navigate and search? Is the language accessible and understandable to the target audience? And how can we ensure the quality stays high?

What is quality?

Let’s start by saying that answering the question “what is good documentation” is very, very hard. You cannot use a single metric to determine the quality of your docs. Just take a look at Tom Johnson’s Quality checklist for API Documentation and see the number of things you should keep in mind when designing your developer portal.

Knowing whether your documentation is any good requires technical writing expertise. And more importantly, an understanding of your users and what they are trying to achieve. A big part of a documentation writer’s job is figuring out where users are getting stuck, and then writing documentation that addresses those specific needs.

Is there anything we can measure?

It sounds like the answer to the questions posed above is leaning towards a “no”. Is there nothing we can do programmatically to measure documentation quality?

If we cannot perfectly answer these very human-centric questions about quality, are there indicators that can at least hint at problems with the documentation and warn us before readers run into them?

I would argue yes, there are. And this is something I’m building into Doctave, although you can apply these techniques to any documentation project. Let’s look at some examples.

Walls of text

If I had more time, I would have written a shorter letter.

Nothing scares an engineer away from your documentation site faster than a 4,000-word wall of text. Readers appreciate structure and clear visuals, and won’t wade through overly long prose. This is something we can measure and warn about.

```
[warning]
This paragraph is over 150 words long. You should consider rewriting it.
```

You may want to customise the limits for your particular project. Regardless, this is a simple example of a measurement we can make about documentation that tells us something about its quality: namely, its readability.
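Here’s a minimal sketch of what such a check could look like in Python. Everything about it is an assumption to adapt to your own setup: the 150-word limit, Markdown source files, and treating blank-line-separated chunks as paragraphs.

```python
import sys
from pathlib import Path

WORD_LIMIT = 150  # assumed default; tune this for your project

def long_paragraphs(markdown: str):
    """Yield (paragraph_number, word_count) for paragraphs over the limit."""
    # Naive: treat blank-line-separated chunks as paragraphs. A real
    # implementation would skip code blocks, tables, and headings.
    chunks = [c for c in markdown.split("\n\n") if c.strip()]
    for number, chunk in enumerate(chunks, start=1):
        words = len(chunk.split())
        if words > WORD_LIMIT:
            yield number, words

if __name__ == "__main__":
    for path in Path(sys.argv[1]).rglob("*.md"):
        for number, words in long_paragraphs(path.read_text()):
            print(f"[warning] {path}: paragraph {number} is {words} words "
                  f"long. You should consider rewriting it.")
```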

Broken links

Imagine you are a user browsing documentation, and you finally find a link that seems to point to the solution you’ve been looking for. You click it, only to be greeted by a supposedly funny 404 page. This is a terrible experience for a reader.

Broken link checking is something many tools already do for you, saving you from mistakes that will frustrate your users. It also lets you refactor your content with confidence, knowing that all links will resolve correctly by the time you deploy your changes.

Checking for broken links is something humans should never spend valuable time doing. We can write programs to do this for us perfectly.
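If your docs live in a repository as Markdown, even a small script can catch broken internal links before they ship. A deliberately naive sketch: it only verifies that relative link targets exist on disk, and external URLs would need real HTTP checks.

```python
import re
import sys
from pathlib import Path

# Matches Markdown links like [text](target), capturing the target
# up to a closing parenthesis or anchor fragment.
LINK = re.compile(r"\[[^\]]*\]\(([^)#]+)")

def broken_links(docs_root: Path):
    """Yield (file, target) for relative links that don't resolve to a file."""
    for path in docs_root.rglob("*.md"):
        for match in LINK.finditer(path.read_text()):
            target = match.group(1).strip()
            if target.startswith(("http://", "https://", "mailto:")):
                continue  # external links need an HTTP check instead
            if not (path.parent / target).exists():
                yield path, target

if __name__ == "__main__":
    ok = True
    for path, target in broken_links(Path(sys.argv[1])):
        ok = False
        print(f"[error] {path}: broken link to '{target}'")
    sys.exit(0 if ok else 1)
```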

Style guide enforcement

Many teams will adopt official style guides for their documentation to ensure it reads with one common “voice”. There are some well-established style guides, like the Chicago Manual of Style, and teams will often use them as a basis to create an internal style guide. For example, here is DigitalOcean’s style guide.

Common rules you may see are:

  • Avoid words like “just” or “simply”
  • Capitalise names of your product correctly
  • Use the second person to refer to the reader

There are also open-source tools like Vale that will enforce these rules for you! You can make Vale part of your CI/CD pipeline and verify that your style guide is being adhered to. This is especially useful if you are getting contributions from people whose core competence is not writing.
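A Vale rule is a small YAML file in your styles directory. Here is a sketch of a rule flagging “just” and “simply”, using Vale’s standard existence check; the file and style names are made up for the example.

```yaml
# styles/MyStyle/Condescending.yml — path and names are examples
extends: existence
message: "Avoid '%s'. It can make readers who are struggling feel worse."
level: warning
ignorecase: true
tokens:
  - just
  - simply
```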

[Screenshot: the Vale CLI reporting prose linting errors]

Analytics

At some point, you want to look at actual hard numbers to see how your readers are interacting with your documentation. Are they dropping off after viewing only one page? Do they search for something, only to immediately close the browser, possibly in frustration?

Or maybe you discover that there’s an important piece of documentation that almost never gets read. Perhaps this is a sign to display it more prominently?

Analytics can shed some light on whether your users are finding what they are looking for in your documentation. If you don’t have this set up on your developer portal, we recommend using a privacy-focused solution like Plausible.
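Plausible, for instance, only needs a one-line script tag on your pages. This is the embed their documentation describes at the time of writing; swap in your own domain.

```html
<script defer data-domain="docs.example.com" src="https://plausible.io/js/script.js"></script>
```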

User feedback

Sometimes, you can just ask your users whether they found what they wanted. Featuring a “was this helpful?” form with “yes” and “no” options in your articles can act as a high-level indicator of quality.

That being said, response rates for these types of widgets can be low. Also, the people who do respond to them can skew towards negative feedback. You’ll have to take whatever feedback you get from these forms with a grain of salt and put it into a larger context.
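The widget itself can be tiny. A minimal sketch, assuming a hypothetical /feedback endpoint on your side that records the votes:

```html
<!-- Minimal "was this helpful?" widget. The /feedback endpoint is
     hypothetical: you need something on your side to store the votes. -->
<form method="POST" action="/feedback">
  <input type="hidden" name="page" value="/docs/getting-started">
  <p>Was this page helpful?</p>
  <button name="helpful" value="yes">Yes</button>
  <button name="helpful" value="no">No</button>
</form>
```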

Commit frequency

As a bonus, here is a trick you can use for any documentation you keep in source control. Look at how many lines of code have changed in your application in the last six months, and compare that to how many lines of its documentation have changed.

If the number of lines changed in your docs is near zero, while your product keeps moving onward, you may have a problem.
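With git, you can get a rough version of this comparison by summing --numstat output for your source and docs directories. A sketch, assuming your code lives in src/ and your docs in docs/, with an arbitrary 1% threshold:

```python
import subprocess

def lines_changed(path: str, since: str = "6 months ago") -> int:
    """Sum lines added and removed under `path` since `since`, using git."""
    out = subprocess.run(
        ["git", "log", f"--since={since}", "--numstat", "--format=", "--", path],
        capture_output=True, text=True, check=True,
    ).stdout
    total = 0
    for line in out.splitlines():
        parts = line.split("\t")
        # numstat lines look like "added<TAB>removed<TAB>file";
        # binary files report "-" and are skipped.
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            total += int(parts[0]) + int(parts[1])
    return total

if __name__ == "__main__":
    code = lines_changed("src")   # assumed code directory
    docs = lines_changed("docs")  # assumed docs directory
    print(f"Code lines changed: {code}, docs lines changed: {docs}")
    if code > 0 and docs / code < 0.01:  # arbitrary threshold
        print("[warning] Your docs may be falling behind your code.")
```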

Docs or it didn’t happen.

Maintaining quality

Software engineers add automated checks to their code to catch formatting issues, failing tests, security problems, and much more. This way, anyone working on the code base has to follow the same rules and clear a minimum quality bar. We can take a similar approach to documentation.

The best time to do these checks is always. On each change, you want to be running these tests to verify that your style guides are enforced and no issues have crept in.
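In a GitHub Actions setup, that could look something like the workflow below. The script paths are the hypothetical checkers sketched earlier in this post; substitute whatever checks you actually use.

```yaml
# .github/workflows/docs-quality.yml — paths and script names are
# assumptions based on the sketches above.
name: Docs quality
on: pull_request

jobs:
  checks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python scripts/check_paragraph_length.py docs/
      - run: python scripts/check_links.py docs/
      - run: vale docs/  # assumes Vale is installed on the runner
```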

The added benefit is that this makes it easier for non-experts to contribute. If a software engineer is adding documentation, they can rely on the tooling to catch mistakes and stylistic errors. It also removes some of the review burden from technical writers, who can focus on higher-level aspects of the content.

Nothing beats expertise

No tool can ever replace the expertise of an experienced technical writer who understands their audience. But we can use tools to assist us and warn us about possible issues. The goal is to save time by removing tedium, such as manually checking every link on a developer portal.

If this sounds exciting to you, or you have other ideas for measuring and maintaining documentation quality, let me know! You can reach me on nik@doctave.com. I’m building some of the features discussed here into Doctave itself.

Also make sure to sign up to the early access list to be among the first to get to use Doctave.
