Our national Supported People EPR usability survey
Following the launch of the second phase of the national EPR usability survey on 27 October, Dermot Ryan describes in this joint blog post how the survey forms part of a wider programme of support to help organisations and local systems achieve What Good Looks Like, and Rachel Dunscombe, former CIO of Salford Royal, describes the impact the survey had on people and processes when it was first conducted at the trust in 2016.
Dermot Ryan: The big picture
We’re inviting clinicians in acute settings to share their experience of using electronic patient records in practice. It’s the second phase of our national Supported People EPR usability survey.
The survey is part of our programme of assessments covering the seven success measures of our What Good Looks Like framework, which we published earlier this year to set out a common vision for good digital practice. The assessments will support local systems and organisations to baseline their level of digital maturity and identify opportunities to prioritise local digital investment.
Aligned to our success measure for “Supported people”, the EPR usability survey will help organisations measure whether current digital and data tools and systems are fit for purpose in supporting staff to do their jobs well.
This is, by some margin, the biggest EPR usability survey ever conducted, adding to what we are learning from the more than 6,000 clinicians from mental health, community and ambulance services who completed the survey during the first phase this summer.
At the national level, the results will help us identify where there may be common challenges that we can help with. This is likely to include discussing directly with EPR suppliers how they might improve or upgrade their products to address unmet needs. This is part of our ongoing work to support organisations and local systems to achieve What Good Looks Like.
At the organisational level, the survey will provide data and insights that will help you understand how to improve the clinician’s experience. All trusts achieving a minimum response rate of 30 will get a bespoke dashboard showing their results, and further support to help them interpret and act on those results.
I hope that you’ll encourage your colleagues to take just eight minutes out of their busy schedules to complete this survey. My colleague Rachel Dunscombe describes some of the clear benefits to organisations of undertaking this kind of assessment.
Rachel Dunscombe: The trust perspective
I’ve got a special interest in this project for three reasons:
First, because as Chief Executive of the NHS Digital Academy, I’m delighted to see a national agency putting user experience, and all we can learn from it, centre stage in its work – it’s a particularly important learning opportunity to support the What Good Looks Like pillars.
Second, because as an associate of KLAS (NHSX’s research partner for this work), I’ve been personally involved in interpreting the international body of research this survey is built on and have seen the benefits it can bring to organisations around the world.
Third, because as a former CIO at Salford Royal, the first trust in the country to use this survey, I know the impact it has had on our hospital and I’m proud to see it happening on a national scale.
The Salford experience
When I first agreed to participate in the Arch Collaborative – a coalition of international health organisations that has worked with KLAS for over a decade to understand and track EPR usability – I did so through the prism of Salford Royal’s strong focus on quality improvement.
I suppose it can feel a bit intimidating ‘putting yourself out there’, particularly as we were the first trust in the country to participate in this research, but my view was that whether we came top or bottom of the pile, two things would happen: first, I would know where we could act on areas for improvement; and second, I would know where we could celebrate and share the good things.
And that’s exactly what we found. Some areas of our practice ranked very well, others less so. The research exposed the clinical specialities that were not receiving a good enough experience from the EPR, opening up opportunities for us to drill down to learn what was going wrong and what we could do about it.
Looking back, it helped that we could do this from a place of safety, knowing that our results wouldn’t be published by name for the world to see, and that we could therefore act with confidence within the Arch Collaborative, sharing our own beacons of good practice with others while learning from the best elsewhere around the world. So this is really data that starts conversations – with our own staff and with counterparts around the world.
Key lessons learnt
The survey also allowed me to understand, using a proven methodology and science, where best to spend each pound on improving the EPR experience. Indeed, when we delved into the statistics, it quickly became apparent where there was more work to do: on improving our governance, on delivering more targeted engagement with particular specialities to help them get more out of our system, and on understanding and acting on the specific experiences of the community health professionals within our organisation.
One of the biggest lessons for me was that you can make the vast majority of EPR systems deliver an outstanding user experience; you can also make even the shiniest, most advanced system fall flat if you don’t invest time and effort in the right things. And as we found in Salford (echoed in KLAS’s international findings too), the majority of the factors that make an EPR system a success fall under the organisation’s control.
Optimising resources
Having this research done on a national scale gives us all the chance to contribute to something quite special. Every day, NHS professionals perform millions of individual keystrokes to operate the IT systems that make our health services run efficiently. These processes consume hundreds of thousands of hours of clinical time each year.
What I particularly like about it is that this is the NHS helping itself, creating an environment where we can share good practice and build with confidence on our own experiences and that of others. At Salford, the insights we gained were integral to our work as a Global Digital Exemplar, informing our direction of travel and priorities.
So my advice is to embrace this opportunity. It’s a brilliant chance to celebrate what you do well, identify where you can learn from others and apply a proven improvement science to support your clinicians to the full. And the difference this can make, as we saw in Salford, is enormous.
The NHSX national usability survey is now open to all clinicians working in acute hospital settings.