Evaluation of the Regional Scale Programme and the National Innovation Collaborative - executive summary
Published 21 October 2021
In December 2020, NHSX commissioned Ipsos MORI, working in partnership with the Strategy Unit, to undertake the independent evaluation of the Regional Scale Programme (RSP) and the National Innovation Collaborative (NIC). This is the executive summary of the final evaluation report prepared by Ipsos MORI. It sets out a summary of key findings across the objectives of the evaluation.
The RSP is leading work across England to accelerate the deployment of new care pathways supported by technologies that enable remote monitoring. The programme, which is delivered in partnership with the NHS and care systems locally, was supported by £10.5 million implementation funding. This was then supplemented by an £18 million investment to support licence costs as part of a specific COVID-19 response. This funding supported 24 projects across the seven regions of England.
The RSP is supported by the NIC. This is a programme of support that aims to enable collaboration and, in turn, the rapid sharing of learning and best practice in digital transformation across the NHS and care system. The Academic Health Science Network (AHSN) Network was commissioned by NHSX to support the NIC.
Through implementing these technologies in health and care settings, the RSP and NIC aim to realise benefits related to system efficiencies, improved experience for staff and service users, and improved health outcomes.
The evaluation focused on the processes, outcomes and short-term benefits of the RSP and NIC. The work was structured into the following three phases:
- A scoping phase, involving consultation with the programme team, regional leads and the AHSN Network to establish an underlying theory of change for the programme.
- A main phase, comprising interviews with stakeholders involved in the oversight and management of the 24 projects.
- A consolidation phase, comprising interviews with additional stakeholders at the project level, interviews with technology suppliers, an evidence scan of secondary literature and final analysis of the programme’s management information (MI).
Fieldwork with patients and service users who have used remote monitoring technologies has also been completed. The findings from this aspect of the evaluation will be reported separately.
At the time of evaluation fieldwork, the short timescales since project inception (September 2020) coupled with the pressures of the COVID-19 pandemic meant that many of the 24 projects were at a relatively early stage of development. As a result, some anticipated longer-term benefits of the programme (such as improved health outcomes, improved clinical safety, improved patient experience and time savings for staff) had not been realised and/or evidenced by projects. Anticipation of this limitation by the programme and evaluation teams informed the evaluation's focus on programme processes and short-term benefits.
Self-reported digital maturity levels
The programme engaged health and social care organisations with a wide range of self-reported digital maturity levels.
Notably, the level of digital maturity among care providers participating in the programme was particularly low.
Among organisations with low self-reported digital maturity, the pandemic was a key driver of participation. Interviewees suggested that the urgent service challenges stemming from the pandemic resulted in a culture of increased openness to digital platforms (both among staff and service users), which facilitated their participation.
The variety of organisations willing and able to implement remote monitoring solutions suggests that the programme has identified a significant unmet need in health and social care. Indeed, a wider review of the evidence suggests that action on remote monitoring technologies at this national scale has not been attempted before.
Levels of adoption
All funded projects achieved some level of adoption of remote monitoring (by patients, service users or staff) by March 2021.
It should be noted, however, that the scale of adoption varied considerably by project. By June 2021, 79,643 patients and other users had been onboarded onto the projects nationally.
For the majority of projects, the introduction or scaling of remote monitoring involved a fundamental redesign of models of care, including where care was delivered, who delivered it and referral pathways. Given the challenging context in which projects were delivering these major changes, implementation progress met the expectations of regional and project leads.
However, the success of further scaling of the digital platforms has varied. In many cases, the initial usage of the digital platform was limited to single localities or organisations, with wider scaling of the project continuing at a slower pace.
Tools for successful implementation
The evaluation identified several characteristics of health and social care organisations which supported successful implementation of remote monitoring.
Nearly all projects reported that engagement with local organisations was a key enabler to successful project delivery. Engagement was often facilitated by a clinical digital champion (a role that was often supported by implementation funding) who was responsible for approaching target organisations to demonstrate the value of the digital platforms. Regular engagement with stakeholders, for example via working groups, was also deemed important by a number of projects.
Effective leadership of the project team was highlighted as another key component of successful project delivery. Specifically, multi-disciplinary leadership teams (including those in management, clinical and operational roles), which were trusted by senior management to make decisions independently, were seen as crucial to implementing the project within the short timeframes of the programme. Where projects experienced challenges in delivery, this was sometimes attributed by interviewees to a lack of clear leadership.
Despite the programme's ambition to align with improvement science, this was not a commonly discussed theme in interviews. Greater alignment with improvement science could potentially strengthen both engagement and leadership.
Barriers to delivery
A range of barriers relating to both technological and human factors were reported by projects.
To address these barriers, projects often relied on implementation funding, enabled by the flexibility of the funding model adopted by the RSP.
For example, in organisations with low levels of self-reported digital maturity, projects reported that significant investment was required to train staff and secure their engagement. In some cases, projects were unaware of the resources provided by the NIC to support these tasks. For example, data sharing agreement templates – which some projects identified as a barrier – were available on the Innovation Collaborative workspace on the FutureNHS collaboration platform for the duration of the programme.
Despite the challenges presented by low self-reported digital maturity, projects also reported that the presence of legacy digital systems was not necessarily beneficial. In these cases, projects described dedicating additional time and resources to convincing organisations to switch to the new system. This challenge was particularly salient where systems were not interoperable, requiring organisations to switch between multiple systems.
Securing staff engagement was also identified as a challenge by some projects. Staff perceptions of the usability and effectiveness of the digital platforms were mixed. Where digital platforms were not felt to deliver their promised functionality, staff raised concerns about increased workload and were more likely to disengage from the project.
Effects of regional devolution
The highly devolved design of the RSP meant that programme resources were directed to those projects which were considered by regional stakeholders to be most capable of success.
This approach is reflected in the highly varied activities, condition focus and technology solutions used across the 24 projects. Many interviewees from organisations overseeing projects reported that this local tailoring had helped engage local organisations, and this had driven rapid implementation.
However, the extent to which project teams were devolved from the central programme, and the large role of the regional organisations, appeared to have caused some inefficiencies. For example, there were reports of delays to funding being received by some of the project teams, despite it having been released in a timely manner by NHSX. Furthermore, ensuring successful communication across the system was found to be challenging, which may have contributed to some projects being unaware of the full national support offer.
Awareness of available support
Projects that engaged with the full range of support felt it had increased their chances of success.
The key areas of support mentioned by projects were the implementation and licence funding, the opportunities for collaboration provided by the programme, and the Spark Dynamic Purchasing System (DPS). However, some of the organisations interviewed were unaware of the support offers available beyond the RSP funding. Ensuring awareness of the offer across all tiers of the programme, especially among project-level teams, is therefore advised.
The implementation funding was the most recognised area of support and was viewed as central to scaling remote monitoring. Indeed, a strong theme across interviews was that without this aspect of the programme, no significant progress would have been made. The licence funding was also viewed as valuable, although concerns were expressed about how to identify future funding sources once the initial agreement ends.
Among those who had accessed it, the contribution the NIC support made to enabling collaboration was particularly noted, and this helped kick-start some projects. Given this feedback, the ‘Collaborative’ model is certainly worth considering in future large-scale programmes (particularly where the wider context is less challenging, in theory leaving projects more time to engage with the support).
Finally, the Spark DPS, which was developed by NHSX to enable rapid procurement of digital platforms, had limited uptake by projects. Those who used it reported that it gave them confidence in the digital platforms being implemented. However, others reported that greater curation of the digital platforms listed on the Spark DPS would have increased its usefulness to projects, especially given the short timescales of the programme. As a result of these challenges, many projects used an alternative procurement route or procured solutions directly.
Short-term benefits
The programme should also be encouraged by the variety of short-term benefits that were evidenced by both health and social care organisations.
Benefits for patients, service users, staff and local systems of health and care were evident across most projects. The evidence we have collected is mostly early-stage and qualitative, but this is to be expected given the timing of the evaluation. However, several projects were able to offer more robust data (drawing on an early quantitative analysis). This offers encouraging signs as to the potential longer-term benefits of the RSP.
Although robust empirical evidence of the impact of remote monitoring on reducing pressures on secondary care is not widely available, there are encouraging early signs that the projects are realising the intended benefits. Few projects were able to evidence reductions in pressure on secondary care at the time of the evaluation; as a longer-term benefit, this is more challenging to evidence at such an early point in the intervention. Nevertheless, one interviewee described how a project had led to “huge strides [in care homes], in terms of the number of patients they’re keeping in the home, the number of conveyances to hospital going down, the number of avoidable admissions almost being zero, which is huge for some homes”. Based on this insight, coupled with evidence of longer-term benefits from the evidence scan, it is anticipated that these positive impacts will be confirmed in time, assuming continued progress with delivery. Indeed, several of the project teams interviewed have plans in place to conduct robust analyses once sufficient data is available later this year. In the longer term, reduced pressures on secondary care should lead to improved capacity, productivity and efficiency across health care services.
Across the evaluation, we have gathered encouraging evidence that the programme has contributed to improved communication and collaboration between staff in different organisations. We have found that the technology itself has acted as a tool to improve the efficiency of communication between services. One interviewee highlighted how this had saved “clinical time … they don’t have to do another phone call or another email [and] communication wise it’s probably saved – even for our admin teams – up to an hour a day”. It should be noted that although these projects have increased the efficiency of communication between staff, they may also reduce the amount of contact that staff have with each other in the long term. To mitigate this risk, some interviewees suggested that equitable access to information reinforces the interconnectedness of health and social care services, creating a sense of inclusivity that supports collaboration.
Emergent evidence on the resourcing and workforce impact of the technology solutions is mixed. In some cases, we heard that implementing the technologies was time-consuming, while other interviewees reported that they had saved time in the longer term. The finding of increased resource requirements warrants further evaluation.
In the most mature projects, substantial qualitative evidence was provided which offers some confidence that the technology solutions were beginning to have positive effects on patients’ and service users’ understanding of their condition. We also heard cases of people being empowered through the new technology solutions to take a more active role in managing their condition. Projects reported, for example, that the technologies increased patients’ and service users’ familiarity with their own health status, which is a predictor of improved healthcare behaviours and outcomes.
However, the evaluation also observed that because patients and service users were supported to stay in their homes primarily to aid pandemic resilience, rather than to improve activation levels, no formal expectation was placed on projects to measure these outcomes.
Future plans and risks
Projects have plans to sustain their activities, but there are risks to these which may require programme action.
Interviewees described two main models for sustaining and building on the progress made during the life of the programme. The first is to build on the learning that project teams have gathered over the past few months and apply this (and the technology) in other service areas. Others spoke of embedding longer-term approaches to using remote monitoring technologies in their organisational strategies.
The second approach, described by most projects, is to identify further financial support to continue the work over a longer period as part of their sustainability plan. This demonstrates that the case for investment in these technologies needs to be made as part of organisations’ core budgets; towards the end of the evaluation timescales, there were examples of projects securing this investment. It also shows the importance of continued evaluation, so that the longer-term benefits can be adequately evidenced to inform value propositions.
The evaluation offers a range of recommendations at two levels: the policy/strategic level (senior NHSX decision-makers); and the programme level (the programme team). They are set out below in short form, and in longer form in the main report.
Strategic and policy level
NHSX should endeavour to publicise future programmes early, to ensure the sector is aware of upcoming funding opportunities.
Several aspects of this programme’s design should be considered for inclusion in future technology uptake initiatives, including:
- implementation funding with flexibility on how it is used
- using regional teams to target projects based on local need
- a focus on supporting collaboration across organisations
- building stakeholders’ capacity to understand the relative benefits of the digital tools available.
A formal communications strategy should be considered for future programmes. The objectives of the strategy should include ensuring that organisations that take part in the programme are aware of any programme support made available.
The evaluation findings endorse NHSX’s planned support for social care providers focused on improving digital and technological maturity in this sector. Individual programmes of support on specific innovations need to be joined up (most probably at the integrated care system (ICS) level) to create a coherent offer to the sector.
For future NHSX programmes, an evaluation strategy should be prepared routinely as a core programme foundation document. Common outcomes should be specified, data collection expectations made clear and built into projects’ delivery plans, and the role of evaluation in sustaining activity beyond the programme life should be included.
A phased approach to funding should be adopted for future programmes of a similar nature. This could include a three-month seed funding phase for project scoping and set-up, followed by a main stage where the bulk of implementation and licence funding is distributed.
NHSX should ensure that patient input is gathered in the design phase of new programmes and that, where appropriate, programme objectives focused on the potential value of the programme to patients’ and service users' wellbeing, health, or levels of activation are prioritised.
NHSX should take stock of the different forms of evaluative work currently underway across the programme, to ensure that longer-term evaluation needs are identified and addressed.
Programme level
The programme team should continue to collect and share emerging results from the RSP’s project-level evaluations through the FutureNHS website or other appropriate channel.
The programme team should continue to review the quality of benefits data, and the benefits register in conjunction with any future impact evaluation work planned, to ensure data is collected against anticipated long-term benefits of the programme.
Going forward, regional and sub-regional stakeholders would benefit from clearer guidance on how digital platforms should be assessed during procurement. Alternatively, the programme should provide expert resources to undertake in-depth assessments of suppliers’ value propositions as part of the procurement process.
Reporting requirements and data requests for future initiatives should be commensurate with the progress that can reasonably be expected from projects funded.
For future programmes focussing on adoption and rapid scaling of remote monitoring, the Digital Technology Assessment Criteria (DTAC) should be used for baseline assessment, coupled with a local specification. If a framework or DPS is established in the future, consideration should be given to the ability to award directly or narrow down suppliers in an agile way.
By referring to sustainability plans provided by projects, the programme team should identify common dependencies that may affect the sustainability of projects, and whether any action from the programme or regions may help.
NHSX should ensure that projects develop and share plans to apply improvement science methods. The evaluation identified that the rapid pace of implementation was felt by some projects to prevent the use of these methods. Ensuring improvement science is embedded in plans from an early stage may assist in overcoming this barrier.