In This Article

Overview

Understanding Chart Orientation and Ranking

Reading Sensitivity Analysis Charts

Reading Attribution Analysis Charts

Reading Variance Analysis Charts

Reminder: Understanding Colours and Direction

Reading Tables and Numeric Outputs

Common Interpretation Pitfalls

What Happens Next

Next Steps


Overview

Value Hound analysis results are presented visually using charts and ranked tables. These outputs are designed to help you understand relative impact, contribution, and outcome change across drivers.

This article explains how to read and interpret the results produced by Sensitivity, Attribution, and Variance analysis, and how to act on them.


Understanding Chart Orientation and Ranking

Across all analysis types, charts are ordered by impact on the outcome.

  • Drivers at the top of the chart have the greatest impact
  • Drivers lower down have progressively smaller effects
  • The horizontal axis represents impact magnitude, not performance quality

The ranking shows importance, not whether a KPI is good or bad.


Reading Sensitivity Analysis Charts

Sensitivity Analysis results are typically displayed as a tornado-style bar chart, with:

  • One side of each bar showing the impact of a negative variance
  • The other side showing the impact of a positive variance

Key points when interpreting sensitivity charts:

  • Longer bars indicate higher sensitivity of the outcome to that KPI
  • Bars are centred on zero to show relative change
  • Direction shows how the outcome responds when the KPI changes

Sensitivity charts answer the question: "Which drivers matter most if they change?"
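The one-at-a-time flexing behind a tornado chart can be sketched in a few lines. This is an illustrative example only: the tree (Profit = units × (price − unit cost) − fixed costs), the KPI names, and the numbers are hypothetical, not Value Hound's actual model.

```python
# Minimal sketch of a one-at-a-time sensitivity test on a hypothetical
# Value Driver Tree: Profit = units * (price - unit_cost) - fixed_costs.
# All names and values are illustrative assumptions.

def profit(kpis):
    return kpis["units"] * (kpis["price"] - kpis["unit_cost"]) - kpis["fixed_costs"]

base = {"units": 1_000, "price": 50.0, "unit_cost": 30.0, "fixed_costs": 10_000.0}

def sensitivity(kpis, variance=0.10):
    """Impact on the outcome of flexing each KPI by +/- variance, one at a time."""
    base_value = profit(kpis)
    results = {}
    for name in kpis:
        up, down = dict(kpis), dict(kpis)
        up[name] *= 1 + variance
        down[name] *= 1 - variance
        results[name] = (profit(down) - base_value, profit(up) - base_value)
    return results

# Rank drivers tornado-style: largest absolute impact first.
for name, (neg, pos) in sorted(sensitivity(base).items(),
                               key=lambda kv: -max(abs(v) for v in kv[1])):
    print(f"{name:12s} {neg:+,.0f} / {pos:+,.0f}")
```

Note that the two sides of a bar need not be symmetric, and that a cost driver's "negative variance" side can increase the outcome; this is why direction must be read alongside the calculation logic.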


What to Do with Sensitivity Analysis Results

Primary use: Prioritisation

Use Sensitivity results to:

  • Focus attention on the top-ranked drivers
  • Decide where effort or investigation will have the biggest impact
  • Identify KPIs that warrant closer monitoring
  • Screen out low-impact drivers from further analysis

Typical actions:

  • Select the top 3–5 drivers for deeper review and target initiatives at improving those drivers
  • Use these drivers as inputs to Attribution or Variance analysis

What not to do:

Do not treat sensitivity rankings as targets, or as evidence of poor performance.


Reading Attribution Analysis Charts

Attribution Analysis results are displayed as contribution bars, showing how individual KPIs contribute to the difference between two selected fields (for example, Actual vs Target) at the Value Driver Tree output node.

The total difference refers to the difference between:

  • the outcome value calculated using the Control Field (for example, Target), and
  • the outcome value calculated using the Analysis Field (for example, Actual)

Each bar represents the impact of moving a single KPI from its control value to its analysis value, while all other KPIs are held constant. Contributions are measured at the output node (for example, Revenue or Profit), not at the KPI level.

Key points when interpreting attribution charts:

  • Each bar represents a KPI's contribution to the outcome difference, not a share of the KPI's own change
  • Positive bars move the outcome towards the analysis field value
  • Negative bars move the outcome away from the analysis field value
  • The net contribution across all KPIs explains 100% of the outcome difference

In complex or multiplicative Value Driver Trees, multiple KPIs may each explain a large proportion of the same outcome difference. For example, one KPI may show a 100% contribution while other KPIs show contributions of 50%, 40%, or 30%. This occurs because attribution measures overlapping explanatory impact, not mutually exclusive slices of a fixed total.

A KPI showing 100% contribution means that, on its own, the change in that KPI is sufficient to explain the entire difference at the outcome node, given the current model structure. It does not mean that other KPIs are irrelevant, nor does it imply that contributions should be added together.


Attribution Analysis answers the question: "Which KPIs explain the difference between the two outcome values?"
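The overlapping-contribution behaviour described above can be sketched with a toy multiplicative tree. The numbers and KPI names below are hypothetical, chosen only to show how individual contribution shares can sum to more than 100% of the outcome difference.

```python
# Minimal sketch (hypothetical numbers) of one-at-a-time attribution on a
# multiplicative Value Driver Tree: Revenue = visits * conversion * avg_order.

def revenue(kpis):
    return kpis["visits"] * kpis["conversion"] * kpis["avg_order"]

control = {"visits": 10_000, "conversion": 0.05, "avg_order": 80.0}   # e.g. Target
analysis = {"visits": 9_000, "conversion": 0.04, "avg_order": 75.0}   # e.g. Actual

total_diff = revenue(analysis) - revenue(control)   # outcome difference: -13,000

contributions = {}
for name in control:
    moved = dict(control)
    moved[name] = analysis[name]      # move ONE KPI; all others stay at control
    contributions[name] = revenue(moved) - revenue(control)

for name, impact in contributions.items():
    print(f"{name:12s} {impact:+,.0f} ({impact / total_diff:.0%} of the difference)")
```

Here the three shares sum to roughly 112% of the difference, because the drivers interact multiplicatively; each contribution measures overlapping explanatory impact, not a mutually exclusive slice.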


What to Do with Attribution Analysis Results

Primary use: Explanation and accountability discussions

Attribution Analysis explains what drove the difference between two states (for example, Actual vs Target).

Use Attribution results to:

  • Explain performance outcomes to stakeholders
  • Structure performance review discussions
  • Identify which drivers contributed most to gaps or overperformance
  • Separate signal from noise in complex results

Typical actions:

  • Use the top contributors to frame "why" conversations
  • Assign follow-up analysis to understand root causes
  • Validate whether the model structure reflects reality

What not to do:

Do not assume attribution explains root cause on its own.


Reading Variance Analysis Charts

Variance Analysis results are displayed as impact bars, showing how applying specific changes to one or more KPIs affects the Value Driver Tree output node (for example, Revenue or Profit).

The reference point for Variance Analysis is the outcome value calculated using the selected base field (for example, Actual). All impacts are measured relative to this baseline outcome.

Each bar represents the impact on the output node after applying the defined variance to a KPI, while all other KPI values remain unchanged. When multiple KPIs are adjusted, the results reflect the combined effect of those changes as they flow through the Value Driver Tree calculations.

Key points when interpreting variance charts:

  • Each bar represents the impact of an applied KPI change on the outcome value
  • Larger bars indicate greater impact on the outcome
  • Direction shows how the outcome changes relative to the baseline
  • Impacts are measured at the output node, not at the KPI level

In Value Driver Trees with multiplicative or interacting drivers, variance impacts are not additive. Multiple KPIs may influence the same part of the calculation, so changing two KPIs together may produce a different result than adding the effects of changing each KPI independently. 
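The non-additivity point can be made concrete with a small sketch. The tree (Revenue = visits × conversion × average order value) and all numbers are hypothetical, used only to show that a combined scenario differs from the sum of its parts.

```python
# Minimal sketch (hypothetical numbers) of why variance impacts are not
# additive in a multiplicative Value Driver Tree:
# Revenue = visits * conversion * avg_order.

def revenue(kpis):
    return kpis["visits"] * kpis["conversion"] * kpis["avg_order"]

baseline = {"visits": 10_000, "conversion": 0.05, "avg_order": 80.0}  # e.g. Actual
base_value = revenue(baseline)            # 40,000

def impact(changes):
    """Outcome impact of applying percentage changes to selected KPIs."""
    scenario = {k: v * (1 + changes.get(k, 0.0)) for k, v in baseline.items()}
    return revenue(scenario) - base_value

only_visits = impact({"visits": 0.10})                     # +4,000
only_conversion = impact({"conversion": 0.10})             # +4,000
combined = impact({"visits": 0.10, "conversion": 0.10})    # +8,400, not +8,000

print(only_visits, only_conversion, combined)
```

The combined scenario gains an extra interaction term (the two 10% uplifts multiply), so summing individually charted impacts would understate it here, and could overstate it in other structures.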

Variance Analysis answers the question: "What would the outcome be if these specific changes were applied?"


What to Do with Variance Analysis Results

Primary use: Scenario testing and option comparison

Variance Analysis is used to explore hypothetical changes and quantify their potential impact on outcomes.

Use Variance results to:

  • Test combinations of potential improvements
  • Compare alternative scenarios or options
  • Quantify trade-offs between different drivers
  • Support planning and decision-making discussions

Typical actions:

  • Model realistic improvement scenarios based on high-impact drivers
  • Compare multiple scenarios to understand relative impact
  • Use results to inform target setting or initiative sizing

What not to do:

  • Do not treat variance results as forecasts
  • Do not assume impacts scale linearly
  • Do not assume scenarios are achievable without further validation

Reminder: Understanding Colours and Direction

Colour is used consistently across Value Hound charts to indicate direction of impact, not performance status.

Important considerations:

  • Green and red indicate directional change, not success or failure
  • Direction must be interpreted alongside KPI expected trend and calculation logic
  • Colour does not indicate whether a KPI is on or off target
  • Always interpret colour in the context of the calculation

Reading Tables and Numeric Outputs

Charts are accompanied by tables showing:

  • Absolute impact values
  • Percentage changes
  • Applied variances or deltas

Tables provide precision and should be used to:

  • Validate chart interpretation
  • Support reporting and export
  • Compare exact magnitudes across drivers

Use tables when accuracy matters more than visual ranking.


Common Interpretation Pitfalls

  • Assuming top-ranked drivers are underperforming
  • Treating sensitivity results as targets
  • Using attribution results alone to assign accountability
  • Treating variance scenarios as forecasts

Value Hound outputs show impact, not cause or intent.


What Happens Next

You can now:

  • Run analysis using Value Hound
  • Interpret charts and rankings confidently
  • Use outputs to support decision-making and discussion

The next article covers how to share, export, and govern analysis outputs across teams and stakeholders.


Next Steps

To learn about sharing and collaborating in WiredUP, see: