Concise Metrics

I’m nearing the end of my series of blog posts exploring the acronym METRICS, identifying what metrics could, or probably should, be to ensure that you are getting maximum value from them. I have previously taken a look at:-

In this post, I look at ‘Concise Metrics’.

Less can definitely mean more when it comes to metrics. Trying to include every possible piece of data on a single graph will often confuse the situation and raise more questions than answers. Similarly, creating chart after chart after chart is not the answer.

I went to a meeting a few weeks ago that involved two groups who were both looking at metrics. One team gave a demonstration of the metrics they were looking to produce as part of an Agile transformation. They walked through 27 metrics across four themes and, while describing these, mentioned that they were looking to add more. I questioned why. More metrics do not necessarily increase understanding of where you have been, where you are now or where you are going. In fact, creating metric after metric could diminish their value and put people off using them.

Metrics should be produced for a reason, and/or to answer a specific question. ‘Are we on track to deliver the Sprint forecast?’ could be answered by the use of a Sprint Burndown chart. ‘When are we likely to deliver the next MVP (minimum viable product)?’ could be gleaned from a Burn-up chart.
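To make the ‘answering a specific question’ idea concrete, here is a minimal sketch (using entirely hypothetical numbers) of the data behind a Sprint Burndown chart, comparing actual remaining work against an ideal straight-line burn to answer ‘are we on track to deliver the Sprint forecast?’:

```python
sprint_days = 10
committed = 40  # story points forecast for the Sprint (hypothetical)

# Remaining points recorded at the end of each elapsed day (hypothetical)
actual_remaining = [40, 38, 33, 30, 28, 24]

def ideal_remaining(day: int) -> float:
    """Straight-line burn from the committed total down to zero."""
    return committed * (1 - day / sprint_days)

day = len(actual_remaining) - 1  # days elapsed (day 0 = Sprint start)
on_track = actual_remaining[-1] <= ideal_remaining(day)
print(f"Day {day}: {actual_remaining[-1]} points remain "
      f"(ideal {ideal_remaining(day):.0f}) -> on track: {on_track}")
```

The chart itself is just these two lines plotted over the Sprint; the single comparison above is the question it exists to answer.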

There has to be value behind the production of a metric. If there is no value, why bother creating it in the first place? Even if you do create it, you may then get questions like ‘so what?’, or, ‘what does this tell me?’. When producing and sharing a metric, you should be ready to explain the narrative behind it.

For me, the primary purpose of metrics is for the delivery team (as noted in Repeatable Metrics). If we were to write this as a user story, it may go something like this:-

As a Delivery Team

I want to see graphs/charts relating to our delivery

So that we can make informed decisions based upon empirical evidence

Ideally, metrics should stand alone and tell a story. However, this is not always possible. During the analysis of a metric, it may become apparent that further investigation is required and that there is a need to delve into the data and look at a specific item more closely. This is OK. It is this further investigation that will provide a greater understanding and is more likely to result in an appropriate decision being made. Decisions made without understanding root causes may end up making things worse.

In order to make informed decisions, metrics should be concise, simple and easy to understand. Although a metric is likely to be understood by its author, it may be necessary to create example charts or user guides that can be displayed as an aid to interpretation. I took this approach in a previous role and created a guide to Burn-up charts in order to help others understand what the chart showed and how to interpret the information displayed.

I produced a 1-page A3 document that explained what a Burn-up chart is, using multiple versions of the same chart to evolve it from the simple to the complex, while highlighting the additional insights that could be gleaned at each step. The first chart contained just two lines – progress and scope. This was built upon in the next chart by adding a line for average velocity, which increased the value of the chart as this line could be followed to provide a forecasted delivery date. Taking this a step further, I added lines to forecast delivery dates if we went faster or slower, which provided me with a ‘forecasted delivery window’. The final stage was to add multiple releases to the chart.
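The forecasting step described above can be sketched in a few lines. This is a simplified illustration with hypothetical numbers, not the actual document: project the remaining scope forward at the average velocity, then at faster and slower rates, to get the ‘forecasted delivery window’.

```python
import math

completed_per_sprint = [8, 12, 10, 11, 9]  # points done each sprint (hypothetical)
total_scope = 120                          # current total scope in points (hypothetical)

done = sum(completed_per_sprint)
avg_velocity = done / len(completed_per_sprint)

def sprints_to_finish(velocity: float) -> int:
    """Whole sprints needed to burn the remaining scope at a given velocity."""
    remaining = total_scope - done
    return math.ceil(remaining / velocity)

likely = sprints_to_finish(avg_velocity)
optimistic = sprints_to_finish(avg_velocity * 1.2)   # if we go 20% faster
pessimistic = sprints_to_finish(avg_velocity * 0.8)  # if we go 20% slower

print(f"Forecasted delivery window: {optimistic} to {pessimistic} more sprints "
      f"(likely {likely})")
```

On the chart, these three projections appear as straight lines extending from the progress line until each crosses the scope line; the gap between the fastest and slowest crossing points is the delivery window.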

Whatever metrics are produced, value is key, and that value is likely to come from only a few key metrics. The metrics you use may change or evolve over time depending on the stage of your delivery. Again, this is something that I would expect. As teams and processes mature, or as a delivery progresses, you may start to look at different areas for continuous improvement, which may mean using new or different metrics.

Through different mediums (talks, conferences, books etc.) I have heard people talk about a Scrum Master ‘toolbox’. One of the most common is a ‘retrospective toolbox’, which contains various Sprint Retrospective formats. Taking this analogy a step further, Scrum Masters may need more of a tool chest than a toolbox. This could include a drawer for Sprint Retrospectives, a drawer for facilitation techniques, a drawer for coaching techniques and a drawer for metrics (to name only a few).

Remember, just because you have a number of metrics at your disposal does not mean that they all have to be produced, displayed and/or shared. When producing a metric, try to keep it simple, but most importantly, ensure that there is value in its production.

My challenge to you is to review your metrics and identify whether they can be made more concise. Identify the metrics that are providing value, and, for any that are not, question whether you need to continue producing them.