In This Article
Start Here: Identify the Symptom
- 1. No Analysis Results Are Displayed
- 2. Analysis Results Look Empty or Zero
- 3. Results Look Extreme or Unexpected
- 4. Attribution Shows 100% or Overlapping Contributions
- 5. Variance Impacts Do Not Add Up
- 6. Results Differ Between Users
- 7. Analysis Results Change Unexpectedly
Overview
This article helps you diagnose and resolve common issues encountered when running analysis in Value Hound. It focuses on problems related to analysis setup, data availability, and interpretation of outputs.
Structural and KPI configuration issues are covered in Troubleshooting Value Driver Trees.
Start Here: Identify the Symptom
Before troubleshooting, identify what you are seeing:
- No chart or table appears after running analysis
- Analysis runs but shows empty or zero results
- Results look extreme or counterintuitive
- Attribution shows 100% or overlapping contributions
- Variance impacts do not add up as expected
- Results differ between users
- Results change without re-running the analysis
Once you identify the symptom, use the relevant section below.
1. No Analysis Results Are Displayed
Common symptoms
- Chart area is blank
- No values appear after selecting Run
- Results do not update when inputs change
What to check
- KPI values exist: Confirm that all KPI nodes in the tree have values for the selected field and period.
- Field selection: Ensure the selected base, control, or analysis fields (Actual, Target, Baseline) contain data.
- Tree calculation resolves: Confirm that the Value Driver Tree outcome node displays a value before running analysis.
- Analysis has been run: Ensure Run (or Auto Run) has been triggered after changing settings.
Typical fixes
- Switch to a view with populated fields
- Add missing KPI values
- Re-run the analysis
2. Analysis Results Look Empty or Zero
Common symptoms
- All bars are zero
- Outcome difference is zero
- Sensitivity shows no variation
What to check
- Control and analysis fields are different: In Attribution Analysis, if Control and Analysis fields are the same, the result will be zero.
- Variance inputs applied: In Variance Analysis, confirm that non-zero variances have been entered.
- Uniform KPI values: If all KPIs have identical values across fields, there may be no difference to analyse.
Typical fixes
- Select different control and analysis fields
- Enter meaningful variances
- Verify KPI values differ between fields
3. Results Look Extreme or Unexpected
Common symptoms
- One KPI dominates Sensitivity or Attribution
- Very large positive or negative impacts
- Small changes produce large swings
What to check
- Multiplicative logic: Identify calculations that multiply values (for example, rate × time × price).
- Upstream drivers: KPIs high in the tree can dominate analysis results.
- Zero or near-zero values: Targets or baselines set to zero can cause extreme impacts.
Typical fixes
- Validate KPI target and baseline values
- Review calculation structure
- Use Variance Analysis to sanity-check individual impacts
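Multiplicative amplification is easy to reproduce outside the tool. The sketch below is illustrative only, not Value Hound's internal calculation; the `outcome` function and its rate × time × price factors are hypothetical stand-ins:

```python
# Illustrative only: a toy multiplicative calculation, not Value Hound's
# engine. It shows why multiplicative logic produces large swings and
# why zero or near-zero baselines cause extreme relative impacts.

def outcome(rate, time, price):
    return rate * time * price

baseline = outcome(rate=10, time=8, price=5.0)    # 400.0

# A 10% change in a single factor shifts the outcome by 10%:
print(outcome(11, 8, 5.0) - baseline)             # 40.0

# But a target or baseline set near zero makes impacts look extreme:
tiny = outcome(rate=10, time=8, price=0.01)       # ~0.8
print(baseline / tiny)                            # ~500x difference
```

The second case is why validating targets and baselines is the first fix: a near-zero comparison value turns an ordinary difference into an apparently extreme impact.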
4. Attribution Shows 100% or Overlapping Contributions
Common symptoms
- One KPI shows 100% contribution
- Multiple KPIs show contributions exceeding 100% in total
Why this happens
- Attribution assigns overlapping explanatory impact to KPIs at the outcome node. In non-linear or multiplicative trees, multiple KPIs can each explain a large portion of the same outcome difference.
- This is expected behaviour, not an error.
What to check
- Confirm the outcome difference being analysed
- Review which KPIs changed between fields
- Validate model structure and leverage points
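The overlap can be reproduced with a toy two-KPI multiplicative model. This sketch is illustrative only and does not reflect Value Hound's actual attribution algorithm; the KPI names and field values are invented:

```python
# Illustrative only: a toy multiplicative tree, not Value Hound's
# attribution algorithm. It shows how contributions can overlap.

control = {"volume": 100, "price": 10.0}
analysis = {"volume": 150, "price": 15.0}

def outcome(kpis):
    return kpis["volume"] * kpis["price"]

total_diff = outcome(analysis) - outcome(control)  # 2250 - 1000 = 1250

# Attribute each KPI by asking: how much of the outcome difference
# disappears if this KPI alone is reverted to its control value?
for kpi in control:
    reverted = dict(analysis, **{kpi: control[kpi]})
    contribution = outcome(analysis) - outcome(reverted)
    print(kpi, contribution / total_diff)   # volume 0.6, price 0.6

# Contributions sum to 120% because the volume-times-price interaction
# is counted in both. This is expected behaviour, not an error.
```

Because each KPI's contribution includes the shared interaction effect, the percentages legitimately exceed 100% in total.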
5. Variance Impacts Do Not Add Up
Common symptoms
- Combined variance results are smaller than expected
- Individual variance impacts do not sum linearly
Why this happens
- Variance impacts are not additive when KPIs interact through shared calculations or constraints.
What to check
- Whether multiple KPIs affect the same part of the tree
- Whether upstream KPIs limit downstream effects
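The non-additivity has a simple numeric demonstration. The sketch below is illustrative only, not Value Hound's variance engine; the KPI names and variance values are invented:

```python
# Illustrative only: a toy model, not Value Hound's variance engine.
# Shows why individual variance impacts need not sum to the combined
# impact when KPIs interact through a shared multiplication.

base = {"rate": 10.0, "hours": 20.0}
variance = {"rate": 2.0, "hours": 3.0}

def outcome(kpis):
    return kpis["rate"] * kpis["hours"]

# Impact of each variance applied on its own:
individual = {
    k: outcome({**base, k: base[k] + variance[k]}) - outcome(base)
    for k in base
}
# Impact of all variances applied together:
combined = outcome({k: base[k] + variance[k] for k in base}) - outcome(base)

print(individual)                 # {'rate': 40.0, 'hours': 30.0}
print(sum(individual.values()))   # 70.0
print(combined)                   # 76.0
```

The extra 6.0 is the rate-times-hours interaction term, which belongs to neither individual impact. In interacting trees, the gap between the sum of individual impacts and the combined impact is normal.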
6. Results Differ Between Users
Common symptoms
- Two users see different analysis results
- Charts differ despite using the same tree
What to check
- Access and position context: Users may be analysing different data scopes based on their position.
- View and field selection: Confirm both users are using the same view and analysis settings.
- Recent data changes: Check KPI audit logs for recent updates.
7. Analysis Results Change Unexpectedly
If analysis outputs change without re-running the analysis:
What to check
- KPI value updates
- Idea changes affecting inputs
- Tree edits made by other users
Remember:
- Analysis reflects the current state of data and structure
- Exported analysis results do not update automatically
Key Point to Remember
Value Hound analysis reflects the current data and model state. Unexpected results are usually caused by data changes, field selection, or model structure — not calculation errors.
What Happens Next
If issues persist:
- Re-run analysis with simplified settings
- Validate KPI values and tree structure
- Use Variance Analysis to isolate effects
For interpretation guidance, refer back to:
Next Steps
To learn about frequently asked questions, see: