I can’t help noticing how widely software teams are leaning on DevOps to deliver features and fixes faster. One core tenet of DevOps is to make data-driven decisions to maintain high quality under the pressure of frequent production deployments. This calls for no-nonsense metrics and a DevOps dashboard that gives actionable insight, every day, into what you should fix to improve quality or speed. However, that is easier said than done. Most teams I have met collect data to measure something, but far fewer use the collected data to make decisions regularly.

Finding the no-nonsense DevOps metrics

The most typical measurement pitfall I have noted is focusing on vanity metrics, i.e. metrics that might look nice but don’t help you improve anything. Another common issue that makes teams forget their dashboards is focusing only on lagging indicators, the outcomes. Those metrics may measure relevant things, but they don’t tell you which factors caused the results.

You can avoid vanity metrics by linking the metrics explicitly to your goals. In my own everyday work, the intent to deliver fast with quality boils down to the following four questions:

  1. Release quality: Is the current release candidate ready for deployment and if not, what should I fix?
  2. Production quality: What do I need to improve to avoid service outages and maintain good user experience?
  3. Customer satisfaction: How can I improve the service or product in use and customer satisfaction?
  4. Velocity: How can I accelerate the speed of delivering value?

A practical DevOps dashboard needs to help in answering these operational questions.

The Value Creation Model reveals the causes of your results

We can find the leading indicators, that is, the factors affecting the goals, by taking a systemic view of the DevOps process. I have found the Value Creation Model to serve this purpose well.

The following picture presents a Value Creation Model for DevOps. Each node is a measurable factor. The arrows indicate the assumed causalities among the nodes, and the resulting causality chains reveal the factors that have an impact on speed or quality. A blue arrow denotes a positive causality between two factors: for instance, when Production deployment frequency increases, so does the Pace of value deliveries. A red arrow means that the variables move in opposite directions: the higher the Technical debt, the lower the Sprint predictability. The colors of the circles show the overall status of each measured factor.

DevOps Value Creation Model
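To make the structure concrete, here is a minimal sketch of such a model as a signed directed graph in Python. The factor names are taken from the description above, but the edge set and the code are illustrative assumptions only, not how Qentinel Pace represents the model internally.

```python
# A minimal sketch (not Qentinel's implementation) of a Value Creation Model
# as a signed directed graph: nodes are measurable factors, edges are assumed
# causalities (+1 = "blue" positive link, -1 = "red" negative link).

from collections import defaultdict

edges = {
    ("Code quality", "Production release quality"): +1,
    ("Production release quality", "Production deployment success"): +1,
    ("Production deployment frequency", "Pace of value deliveries"): +1,
    ("Technical debt", "Sprint predictability"): -1,
}

graph = defaultdict(list)
for (cause, effect), sign in edges.items():
    graph[cause].append((effect, sign))

def causality_chains(factor, path=None):
    """Enumerate causality chains starting from a factor, depth-first."""
    path = (path or []) + [factor]
    if not graph[factor]:
        yield path
        return
    for effect, _sign in graph[factor]:
        yield from causality_chains(effect, path)

# The chain starting from Code quality corresponds to the Path of Release Quality
# discussed below.
for chain in causality_chains("Code quality"):
    print(" -> ".join(chain))
```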

Measure and improve Value Paths

Value Paths are the causality chains in the model that lead to an important goal. There are four paths, one for each question listed above. The Path of Release Quality is highlighted on the Qentinel Pace™ dashboard below. The chain of metrics starting from Code quality consists of leading indicators for Production release quality, which in turn correlates with Production deployment success.

DevOps dashboard

The DevOps dashboard should be organized according to the four DevOps Value Paths so that there is a metric tree measuring each. This ensures that we focus on what matters most.

There is a normalized index score (target = 100) for each metric tree. This approach allows us to present metrics with different scales and measurement units on a common scale. Index values are easy to judge as ‘good’ (> 100) or ‘bad’ (< 100), so you don’t need to be a subject matter expert to interpret each result. As the example shows, there can be metric trees under metric trees (with an index for each), so it is possible to calculate a DevOps Value Creation index for ‘everything’ and observe the trend.
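As an illustration of the idea, the sketch below normalizes a few raw metrics to index scores and rolls them up into a tree index. The normalization rule, the example metrics, and the weights are assumptions made for this sketch, not the scoring used by Qentinel Pace.

```python
# A minimal sketch, assuming one common normalization scheme: each raw metric
# is scaled so that hitting its target yields 100, and a metric tree's index
# is the weighted mean of its children (which may themselves be sub-trees).

def index_score(value, target, higher_is_better=True):
    """Normalize a raw metric to an index where 100 means 'on target'."""
    if higher_is_better:
        return 100.0 * value / target
    return 100.0 * target / value  # e.g. response time: lower is better

def tree_index(children):
    """Aggregate (score, weight) pairs into a weighted mean index."""
    total_weight = sum(weight for _, weight in children)
    return sum(score * weight for score, weight in children) / total_weight

# Hypothetical release candidate metrics (names and values are illustrative).
release_candidate_quality = tree_index([
    (index_score(0.92, 0.95), 1.0),                        # test pass rate
    (index_score(420, 300, higher_is_better=False), 1.0),  # p95 response time, ms
    (index_score(2, 1, higher_is_better=False), 2.0),      # open security findings
])
print(round(release_candidate_quality, 2))  # below 100, so flagged as 'bad'
```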

There are also calculated index scores for the current production release quality and the release candidate quality to facilitate data-driven release decisions. From the picture we can see that the release candidate is much worse than the current production release (quality index 62.17 vs. 86.23). Now that the metric trees have been organized according to the assumed causalities, it is easy to find the right levers to pull. A quick look at the red circles in the dashboard shows that we need to fix a security showstopper and some system performance bugs to get CPU loading and service response times within the desired control band.

Modeling the causalities among metrics also opens exciting opportunities for leveraging Machine Learning. Machine Learning algorithms can detect trend changes and patterns more effectively than the human eye, and they can make predictions as well as recommendations. The Value Creation Model gives a good starting point for building Machine Learning capabilities for analytics, and we can also validate our causality assumptions with data.
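As a simple illustration of what "detecting a trend change" can mean, the sketch below compares the recent mean of an index time series against its baseline. This is only an assumed, deliberately simple heuristic, not Qentinel's analytics; a real implementation would use proper change-point detection or forecasting models.

```python
# A minimal sketch: flag a trend change in an index time series when the mean
# of the most recent window drifts from the preceding baseline window by more
# than a chosen threshold. The series and threshold below are illustrative.

from statistics import mean

def trend_shift(history, window=5, threshold=5.0):
    """Return the recent-vs-baseline drift if it exceeds the threshold, else None."""
    if len(history) < 2 * window:
        return None
    baseline = mean(history[-2 * window:-window])
    recent = mean(history[-window:])
    drift = recent - baseline
    return drift if abs(drift) >= threshold else None

code_quality_index = [101, 99, 100, 102, 98, 97, 93, 90, 88, 86]
shift = trend_shift(code_quality_index)
if shift is not None and shift < 0:
    print(f"Code quality index dropping ({shift:+.1f}); inspect its value path downstream")
```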

DevOps dashboard provides actionable insights

To conclude, a no-nonsense DevOps dashboard provides actionable insights. The metrics must be linked to the goals of DevOps: speed and quality in its broad meaning, including customer experience and satisfaction. The DevOps dashboard needs to answer everyday operative questions on quality and speed, and for that we need to know which factors affect the team’s results, for better or worse.
I have found the Value Creation Model useful as a tool for identifying practical metrics and their leading indicators. The presented approach of organizing the DevOps dashboard according to the DevOps Value Paths, with a metric tree under each, gives practical indications of the actions you need to take to achieve your quality and speed goals.

If you want to see the value paths addressing the other three issues and learn more about Quality Intelligence® for DevOps, have a look at my full white paper How DevOps creates value and how to measure it.


Juha-Markus Aalto is Director, Product Development at Qentinel, where he leads the DevOps team that builds and operates the Qentinel Pace SaaS product for robotic software testing. He is a SAFe® Contributor and has acted as a SAFe Advisor in large-scale Lean-Agile transformation programs supported by Qentinel.

Quality Intelligence® is a registered trademark of Qentinel.


