3/26/2020

This article is the second in a three-part series about using analytics to drive documentation efforts and achieve business goals.  To read the first part, click here.

In the first part of this series, we covered some of the main reasons for gathering data about how your documentation is performing, who's consuming it, and how it's being used. We explained how data can provide benefits not only to writers and their managers, but also to your audience and your entire organization.

What makes data meaningful?

To achieve the above benefits, it’s not enough just to gather the data. There are dozens of metrics that can be gleaned from your documentation, and they can be valuable in many different ways. Their usefulness depends on your perspective, and on how you plan to use those measurements to achieve your goals.

We’ll soon look at some examples of useful metrics, but first, let’s explore what makes data meaningful in the first place. Having this knowledge will help you take important first steps in forming your analytics strategy.

Representative Data

To be effective, data must be representative of all your documentation. If your content is distributed across multiple sites or access points, and you only measure data on some of those points, your data will be incomplete.

There’s nothing wrong with incomplete data as such, as long as it is representative of the complete data set. But in the case of product content, if you collect data from only one access point, you run the risk of drawing inaccurate conclusions.

For example, if you’re only collecting data from your dev portal, you’re only seeing the behavior of programmers reading your SDK and API content. Those users may behave very differently from IT professionals reading sysadmin content, or from end users finding their way around your UI.

It would therefore be ill-advised to take that narrow data set about developers and generalize it to your entire documentation audience.

Ideally, of course, the most representative sample is the complete data set. It would be excellent if you could gather all the search terms from all your audience segments across all of your documentation’s access points.

Having all your measurements in one cohesive framework goes a long way toward ensuring that your analysis is comprehensive.

Goals Add Direction to Data

Data is about helping you achieve and measure success (and avoid failure). So to turn it into an effective tool, you’ll need to figure out what your goals are. Goals add direction to your data, and even a layer of judgment: they turn a sea of numbers into a clear sense of forward versus backward, letting you know whether you’re on the right track.

Here are some examples of content KPIs you could set to give your data collection guidance and meaning (a sketch of how a couple of them might be computed follows the list):

  • Provide complete documentation for the 10 most popular topics
  • Provide complete documentation for the 10 most popular searches
  • Ensure that no more than 2% of searches lead to “no results”
  • Consistently beat industry benchmarks for search effectiveness and search relevancy
  • Either update or discontinue all content that is 3+ years old
  • Maintain an average article feedback rating of 4.0+
  • Update or discontinue any article with a feedback rating below 3.0
  • Translate the top 10% of articles into all relevant languages
  • Create synonyms for the 20 least successful search terms
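
To make these KPIs concrete, here’s a minimal Python sketch of how a couple of them might be computed. The search_log and feedback structures are hypothetical stand-ins for whatever your analytics platform or help center actually exports, and the thresholds simply mirror the targets above.

```python
# Minimal sketch: computing two of the KPIs above from hypothetical exports.

# Hypothetical search log: one entry per search, with the number of results returned.
search_log = [
    {"term": "install agent", "results": 14},
    {"term": "reset api key", "results": 0},
    {"term": "webhook setup", "results": 0},
    {"term": "rate limits", "results": 22},
]

# Hypothetical per-article feedback ratings (1-5 stars).
feedback = {
    "Installing the agent": [5, 4, 4],
    "Rotating API keys": [2, 3, 2],
    "Webhook setup": [4, 5],
}

# KPI: no more than 2% of searches lead to "no results"
no_result_rate = sum(1 for s in search_log if s["results"] == 0) / len(search_log)
print(f"No-result searches: {no_result_rate:.1%} (target: 2% or less)")

# KPI: maintain an average article rating of 4.0+; update or discontinue below 3.0
for article, ratings in feedback.items():
    avg = sum(ratings) / len(ratings)
    action = "OK" if avg >= 4.0 else ("review" if avg >= 3.0 else "update or discontinue")
    print(f"{article}: {avg:.1f} -> {action}")
```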

How to use data effectively

Your data may be meaningful, but for you to use it effectively, it should also be actionable. If you’re not able to use data to improve your performance, you’ll have difficulty justifying the cost and effort of obtaining it – and likely find little reason to keep doing so.

So now that we’ve laid down the requirements for collecting meaningful data, let’s distinguish between three types of data and see how you can use them effectively.

Useless Data

The name of this category is a bit misleading; there’s no such thing as useless data, but data will be useless to you if it isn’t tied to your goals. In other words, the usefulness of any data depends on what you want to accomplish and how you plan to use it.

Take, for example, this chart that shows daily traffic volume for a documentation site.

Documentation traffic

It shows that people visit the site more during the day, less at night and on weekends, and significantly less over Christmas and New Year. Unless you’re making the argument that people don’t work hard enough on weekends and holidays, this metric might seem pretty useless.

But what if your IT department is planning a system upgrade and wants to know the best time to carry it out to minimize user impact? This data suddenly becomes quite valuable (even if the answer to the question is a bit obvious).
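
If you did want to answer that question from the raw numbers, a sketch like the one below could do it. The page_views list is a hypothetical export of visit timestamps; most analytics tools provide this breakdown out of the box.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical export of documentation page-view timestamps.
page_views = [
    "2020-01-06 09:15", "2020-01-06 10:42", "2020-01-06 23:05",
    "2020-01-07 03:30", "2020-01-07 11:12", "2020-01-07 14:55",
]

# Count visits per hour of day to find the quietest maintenance window.
visits_by_hour = defaultdict(int)
for ts in page_views:
    visits_by_hour[datetime.strptime(ts, "%Y-%m-%d %H:%M").hour] += 1

quietest_hour = min(range(24), key=lambda h: visits_by_hour[h])
print(f"Quietest hour (least user impact): {quietest_hour:02d}:00")
```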

Scorecard Data

This type of data indicates where you stand in your documentation efforts. Although scorecard data doesn’t suggest a specific course of action, it can reveal where your content is succeeding or lagging in comparison to your industry peers.

Take, for example, search effectiveness: the percentage of searches that led to a click within your documentation site, plotted below against an industry benchmark (the blue line):

Documentation search effectiveness

By comparing this data to the industry benchmark, you can see how well you are doing – a kind of scorecard showing you’re above the industry average – but there’s no obvious action here that would help you make that score even better. It’s simply a general reflection that you are doing well.
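
Computing the metric itself is straightforward; what turns it into a scorecard is the comparison. Here’s a rough sketch, where the search events and the benchmark value are placeholders rather than real industry figures:

```python
# Hypothetical search events: did the search lead to a click on a result?
searches = [
    {"term": "install agent", "clicked": True},
    {"term": "reset api key", "clicked": False},
    {"term": "rate limits", "clicked": True},
    {"term": "webhook setup", "clicked": True},
]

INDUSTRY_BENCHMARK = 0.55  # placeholder; use your vendor's or industry's published figure

effectiveness = sum(s["clicked"] for s in searches) / len(searches)
standing = "above" if effectiveness > INDUSTRY_BENCHMARK else "at or below"
print(f"Search effectiveness: {effectiveness:.0%} ({standing} the {INDUSTRY_BENCHMARK:.0%} benchmark)")
```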

While scorecard data can be useful to show to managers, it is not effective in helping you discern a course of action that might move the needle to a better score.

Actionable Data

If scorecard data shows where the needle is pointing, actionable data can show you how to make changes that would move the needle. In many ways, this is the holy grail of data collection.

It is a call to action – a clear set of tasks to improve your performance. The clearest and simplest example of actionable data is a list of search terms that returned no results.

This metric provides very clear insights, and you’re only one step away from acting on them. If people are searching for terms that aren’t found in your content, that is a clear call to action: either add that content, or tune the search engine (with synonyms, for example) so those search terms lead users to the right articles.

Documentation searches with no results
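
As a rough illustration, the report behind a chart like this boils down to a handful of lines. The search_log pairs below are hypothetical; the point is that the output is already a to-do list:

```python
from collections import Counter

# Hypothetical search log: (term, number of results returned).
search_log = [
    ("sso configuration", 0),
    ("sso configuration", 0),
    ("delete workspace", 0),
    ("rate limits", 18),
    ("install agent", 12),
]

# The actionable output: terms that return nothing, and how often users try them.
no_result_terms = Counter(term for term, results in search_log if results == 0)
for term, count in no_result_terms.most_common():
    print(f"{count:>3}  {term}")  # write the missing topic, or add a synonym for the term
```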

Another example of actionable data is content age – the number of topics that haven’t been updated in a given amount of time. On its own, that’s scorecard data. But if you add a goal that no online content should be more than six months old, it becomes a call to action to refresh content that has grown stale.

Content aging of documentation
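
A sketch of that call to action might look like the following; the article inventory, the six-month threshold, and the cutoff date are all assumptions for illustration:

```python
from datetime import date, timedelta

MAX_AGE = timedelta(days=183)   # the "no more than six months old" goal
today = date(2020, 3, 26)       # fixed for reproducibility; use date.today() in practice

# Hypothetical inventory of articles and their last-updated dates.
articles = {
    "Installing the agent": date(2020, 2, 10),
    "Legacy importer guide": date(2017, 6, 1),
    "Webhook setup": date(2019, 8, 20),
}

# Turn the scorecard (content age) into a task list (what to refresh or retire).
stale = {title: (today - updated) for title, updated in articles.items()
         if (today - updated) > MAX_AGE}
for title, age in sorted(stale.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{title}: last updated {age.days} days ago -> update or discontinue")
```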

Actionable data enables you to make conscious decisions that directly impact the effectiveness of your content – putting you in charge, keeping your efforts targeted at moving the needle, and measurably improving the situation. In the third and final part of this series, we'll look at concrete examples of how to turn your data into content strategy.

This article is an excerpt from "Becoming a Data-Driven Documentation Team" which was published in the December 2019 issue of Best Practices, a publication of CIDM.
