Public Value Framework

We use our Public Value Framework to assess everything we do against how well it will achieve outcomes that deliver public value.

Assured alignment

Criteria

  • Clarity of goals: How well does the project* address the published goals of HE?
  • Appropriate KPIs: Is performance measurable at different points in the delivery chain?
  • Links to other work: Are interdependencies recognised and any consequent risks managed?
  • Quality of track record: Are any concerns about project* staff addressed appropriately?
  • Measures drive good behaviour: Might the measures create perverse incentives?

Scoring

Clarity of goals

  1. The project* is over- or under-ambitious compared with comparators and has no obvious connection with the outcomes of the HE Logic Model
  2. The project* appears to address the HE Logic Model outcomes but the goals are limited to outputs rather than outcomes
  3. The project* has realistic outcome-focused goals but the connection with the HE Logic Model outcomes is unconvincing
  4. The project* has challenging but realistic outcome-focused goals that connect plausibly with the outcomes of the HE Logic Model

Appropriate KPIs

  1. There is little use of performance measures and confusion about the relationship between outputs and outcomes
  2. Performance indicators are well spread along the delivery chain but some or all are unmeasurable
  3. Performance indicators are measurable but concentrated at one end or the other of the delivery chain
  4. Measurable performance indicators are identified at different points on the delivery chain

Links to other work

  1. Critical interdependencies are not appreciated or left unaddressed
  2. Interdependencies are poorly understood, even if those recognised are minimised
  3. Interdependencies are well understood but inadequately mitigated
  4. Interdependencies are well understood and associated risks are convincingly minimised

Quality of track record

  1. Recent poor performance in the delivery chain has not been recognised and addressed
  2. Recent poor performance in the delivery chain is acknowledged but unconvincingly addressed
  3. Doubts about recent performance in the delivery chain are recognised and addressed convincingly
  4. Recent performance of the whole delivery chain engenders confidence

Measures drive good behaviour

  1. Measures are solely internal, infrequent and likely to distort behaviours
  2. Measures are internal and infrequent but drive appropriate behaviour
  3. Performance measures engage stakeholders only at the beginning and end but will inform delivery and drive appropriate behaviours
  4. Measures are frequent enough to inform delivery, track stakeholder perceptions and drive the right behaviours

Scoring table

  Criteria                        Score (out of 4)
  Clarity of goals                /4
  Appropriate KPIs                /4
  Links to other work             /4
  Quality of track record        /4
  Measures drive good behaviour   /4
  Average Score                   /4
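
As an illustration only, the short Python sketch below shows one way a completed scoring table could be tallied. It assumes, as the table implies, that each criterion is marked from 1 to 4 and that the Average Score is the simple mean of the five criterion marks; the scores used in the example are hypothetical.

  # Minimal sketch, assuming each criterion is marked 1-4 and the
  # Average Score is the simple mean of the five criterion marks.
  def dimension_score(criterion_scores):
      """Return the average score (out of 4) for one framework dimension."""
      if not all(1 <= s <= 4 for s in criterion_scores):
          raise ValueError("Each criterion must be scored between 1 and 4")
      return sum(criterion_scores) / len(criterion_scores)

  # Hypothetical marks for the Assured alignment criteria
  assured_alignment = {
      "Clarity of goals": 3,
      "Appropriate KPIs": 4,
      "Links to other work": 2,
      "Quality of track record": 3,
      "Measures drive good behaviour": 4,
  }
  print(dimension_score(list(assured_alignment.values())))  # 3.2 out of 4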

Appropriate resourcing

Criteria

  • Measurement of unit costs: Has good use been made of unit costs to allow easy assessment of efficiency?
  • Detailed resource plan: Does the financial plan show how costs will be met, including unexpected costs?
  • Timely information to managers: Will decision-makers be given the right information at the right time?
  • Optimised funding mix: Have funding options been properly explored?
  • Awareness of knock-on costs: Has the risk of cost-shifting been assessed and mitigated?

Scoring

Measurement of unit costs

  1. No comparison of costs will take place during the project to assess value for money
  2. Measures will not allow unit costs to be analysed without considerable extra effort
  3. Measures will allow unit costs to be analysed but no active use is made of them
  4. Unit costs will be reviewed at intervals to enable efficiency to be maximised

Detailed resource plan

  1. The project* cannot show how it will meet its objectives with the allocated budget and relies on additional resources to cope with unexpected costs
  2. The project* has a weak financial plan with little or no contingency planning in case of unexpected costs
  3. The project* has a sound financial plan but no contingency planning in case of unexpected costs
  4. The project* includes a detailed plan for how it will meet its objectives with the allocated budget and contingency plans for unexpected costs

Timely information to managers

  1. Financial management information is unavailable to decision-makers in time to inform decisions
  2. Lagging management information is to be provided but does include financial measures
  3. Timely management information is to be provided but is weak on financial measures
  4. Measures are included to ensure that timely and consistent financial management information is available to decision-makers

Optimised funding mix

  1. No evidence is provided that the project* is optimally funded
  2. The evidence that the project* is optimally funded is weak and the risks of external sources have not been included
  3. A review of potential funding models has been carried out to ensure that the project* is optimally funded but the risks of external sources have not been included
  4. A detailed review of potential funding models, including the risks of external funding, has been carried out to ensure that the project* is optimally funded

Awareness of knock-on costs

  1. The risk of shifting costs to other public bodies is not appreciated
  2. The risk of shifting costs to other public bodies has been acknowledged but neither quantified nor shared with stakeholders
  3. The risk of shifting costs to other public bodies has been considered but not shared with stakeholders
  4. The risk of shifting costs to other public bodies has been considered and shared with stakeholders

Scoring table

  Criteria                         Score (out of 4)
  Measurement of unit costs        /4
  Detailed resource plan           /4
  Timely information to managers   /4
  Optimised funding mix            /4
  Awareness of knock-on costs      /4
  Average Score                    /4

Public support

Criteria

  • Stakeholder needs: Have key stakeholders been identified clearly and are their needs well understood?
  • Public perceptions: Are the drivers of public support understood and is support evidenced?
  • User experience: To what extent does the project* demonstrate an appreciation of the value of great user experiences?
  • Public participation: Has enough effort been made to enable public participation?
  • Stakeholder influencing: How will stakeholders be made aware of the project* and its results?

Scoring

Stakeholder needs

  1. The project* appears unaware of its key stakeholder groups and what they want
  2. The project* identifies only a sub-set of its key stakeholder groups and what they want
  3. The project* identifies its key stakeholder groups but not what they want
  4. The project* identifies its key stakeholder groups and what they want

Public perceptions

  1. The project* demonstrates no awareness of public opinion or the public are expected to be hostile to it
  2. The project* demonstrates a weak understanding of the drivers of public support and lacks evidence of public approval of the goals
  3. The project* demonstrates a good understanding of the drivers of public support but lacks evidence of public approval of the goals
  4. The project* demonstrates a good understanding of the drivers of public support and includes evidence of public approval of the goals

User experience

  1. No apparent understanding of the link between better user experiences and better outcomes, or of how improved user experiences might be achieved
  2. The link between better user experiences and better outcomes is poorly understood and the way the project* delivers better experiences is not clear
  3. The link between better user experiences and better outcomes is understood but the way the project* delivers better experiences is not clear
  4. The link between better user experiences and better outcomes is understood and the way the project* delivers better experiences is clear

Public participation

  1. Public participation is less than for similar initiatives without a convincing justification
  2. Public participation is less than for similar initiatives but there is a convincing explanation
  3. Public participation is not benchmarked against similar initiatives but efforts have been made to maximise it given the circumstances
  4. Public participation is benchmarked against similar initiatives and maximised in the circumstances

Stakeholder influencing

  1. Actions to influence stakeholders’ perceptions are absent or unconvincing
  2. Actions to influence stakeholders’ perceptions are included but there is a poor track record of influencing
  3. Actions to influence stakeholders’ perceptions are unconvincing but there is a good track record of influencing
  4. Actions to influence stakeholders’ perceptions are convincing and based on a good track record of influencing

Scoring table

  Criteria                  Score (out of 4)
  Stakeholder needs         /4
  Public perceptions        /4
  User experience           /4
  Public participation      /4
  Stakeholder influencing   /4
  Average Score             /4

Capacity development

Criteria

  • Well-designed evaluation: Is a proportionate evaluation built into the project* to enable learning about what works and for whom?
  • Use of new technologies: Has the project* looked for opportunities to experiment and improve?
  • Clear accountability: Is everyone involved going to be clear about their roles and responsibilities?
  • Cross-boundary collaboration: Has the potential for collaboration been properly explored?
  • Resilience: Is it clear how the system is going to be fitter for the future as a result of the project*?

Scoring

Well-designed evaluation

  1. Little evaluation taking place and conducted remotely from the front-line staff
  2. Little evaluation taking place but conducted with the front-line staff
  3. Limited monitoring and evaluation within the project* but conducted throughout the delivery chain
  4. Both performance monitoring and evaluation (that involves front-line staff) are integrated into the project*

Use of new technologies

  1. The project* has not prioritised new technologies to improve outcomes
  2. The project* gives little attention to using new technologies to improve outcomes
  3. The project* gives limited attention to using new technologies to improve service delivery, reduce costs or improve outcomes
  4. The project* has clearly considered the use of new technologies to improve service delivery, reduce costs or improve outcomes

Clear accountability

  1. Nominal overall responsibility but implausible accountability and focused on process rather than outcomes
  2. Nominal overall responsibility but plausible accountability and focused on outcomes
  3. Main people in delivery chain have clear responsibilities and accept accountability for progress towards outcomes
  4. All key people in the delivery chain have clear responsibilities and accountability for outcomes/progress towards targets

Cross-boundary collaboration

  1. No track record of effective collaboration across boundaries and no plans to share collaboration lessons
  2. Poor track record of effective collaboration across boundaries but commitment to learning and sharing collaboration lessons
  3. Plans include cross-boundary collaboration but patchy track record of effective collaboration across boundaries mitigated by commitment to learning and sharing collaboration lessons
  4. Cross-boundary collaboration is woven into design and delivery, including sharing lessons, and staff have a good track record

Resilience

  1. Focus only on short-term results with no capacity-building measures
  2. Some capacity-building and resilience measures but internally focused
  3. Capacity-building and resilience measures are largely internally focused but also consider the wider system
  4. The project* includes measures to build system capacity and as a result is resilient to changes in staffing

Scoring table

  Criteria                       Score (out of 4)
  Well-designed evaluation       /4
  Use of new technologies        /4
  Clear accountability           /4
  Cross-boundary collaboration   /4
  Resilience                     /4
  Average Score                  /4

*The term 'project' has been used here but is interchangeable with 'programme' for these purposes.