NHS App - test result redesign

Making test results accessible and understandable for all patients

Skill Areas

  • Service Design

  • Design Research

  • Design Sprint Facilitation

  • Prototyping & Testing

  • Accessibility & Usability Testing

  • Strategic Roadmapping

Background

NHS England wanted to improve the patient experience of accessing and understanding test results through the NHS App. Many patients struggled to access and interpret their results, causing delays in care, stress for patients and added pressure on frontline workers.

Output and outcome

  • 9 rounds of testing

  • Redesigned UI for better navigation & understanding

  • New features such as test history graphs, normal-range indicators and support links

  • 47% increase in repeat visits to redesigned screens & 85% positive satisfaction rate

  • £9 million in savings, estimated by the NHS Benefits team, from reduced clinical follow-ups


Project intro

Project length: 14 months (2023/2024)

Client: NHS England (National Health Service)

Team composition: 18 members, including:

  • 4 User-Centered Designers covering content, research, service design, product

  • Business analysts, delivery manager, product owner, developers, data analysts, QAs

Methodology: Agile, with iterative sprints

Tools: Figma, Miro, Jira, Qualtrics, Confluence

My role:
As Senior Service Designer and Researcher, my responsibilities included:

  • Conducting both qual and quant research

  • Mapping user journeys and identifying pain points

  • Leading workshops to co-create strategy and design ideation

  • Facilitating service blueprinting for end-to-end delivery

  • Managing cross-functional collaboration with IT providers and other stakeholders

  • Tracking and managing KPIs, feedback loops and the research repository

Challenges

Patients struggled with interpreting their test results on the NHS App, which led to unnecessary anxiety and follow-up appointments. Challenges included:

  • Inconsistent and unclear patient experience due to many factors, including complex UX/UI in the App

  • Diverse patient needs and digital literacy levels

  • Complex stakeholder ecosystem (clinicians, IT providers, policy, government, etc.)

  • Limited transparency between legacy systems and third-party providers

Fig: Test result screen at start of project

Approach

Discovery:

  • Conducted interviews and surveys with patients, clinicians, and stakeholders

  • Mapped user journeys and personas to identify patient pain points and opportunities for improvement

Ideation:

  • Facilitated workshops using design thinking frameworks to co-create potential solutions

  • Used prioritisation methods to identify the most critical features

Implementation & iteration:

  • Developed wireframes and high-fidelity prototypes

  • Usability and A/B testing informed continuous iteration of the test results screens

  • Integrated feedback loops to ensure ongoing refinement post-launch

Strategic integration:

  • Ran pilot programmes with existing and potential partners

  • Used patient feedback to influence future digital initiatives and policy recommendations

Fig: Iterative design process


Key methods used

  • During the pre-research phase, I established recruitment criteria, carefully selecting participants who represented the app's diverse user base. I sourced participants through various channels, including user panels, networks and communities, recruitment agencies, and bulletin boards. Once participants were in place, I collaborated with the team to define research goals and hypotheses for each round, ensuring we had a clear focus on the areas of the app we wanted to evaluate. I also created discussion guides tailored to these goals and, when necessary, worked closely with the product and content designers to prepare prototypes, user flows, and specific tasks for participants to complete.

    Before each session of user testing, interviews, and accessibility testing, I conducted tech calls with participants to ensure a smooth testing experience, confirming that all technology was set up correctly.

  • I conducted nine rounds of usability and accessibility testing to ensure the app met user needs. I facilitated sessions and observed user interactions to gather insights.

    My testing approach was tailored to the specific usability and accessibility criteria for each round. I included participants with a range of disabilities, such as visual, cognitive, and hearing impairments, as well as people with long-term conditions and users of assistive technology. I also made sure we tested with participants with varying levels of experience of testing and results services.

  • I conducted tree testing using Optimal Workshop to evaluate the effectiveness of different content and information architecture options in helping patients navigate to their test results. The goal was to identify the content and architecture that best supported navigation and understanding for users.

    In one case, we tested navigation to GP- and hospital-ordered test results with 164 participants. The results showed that one content style significantly outperformed the others in helping participants locate their test results and understand their navigation path. This quantitative data was further validated in usability testing rounds, where the same content type proved to be the most effective.

  • I conducted 10 surveys using Qualtrics, targeting patients, GPs, and other clinical staff. These surveys were distributed through various channels such as pop-ups within the App, email links, and primary care bulletins. The in-App surveys were particularly effective, often receiving up to 2000 responses within a few days, allowing us to gather substantial quantitative data on the service.

    I also ran pre- and post-release surveys to gauge general usability and measure changes in user experience before and after updates, and we used in-App banners with survey links to collect feedback on improvements after releasing MVPs and iterations. The feedback from these surveys was triangulated with findings from other research methods and assessed during prioritisation and ideation sessions, ensuring a comprehensive understanding of user needs and the impact of our design changes.

  • Facilitated workshops with NHS App teams, users, clinicians, and stakeholders to collaboratively design solutions that balanced user needs with technical constraints.

  • For qualitative research, I led debriefing sessions immediately after each test or interview, where notetakers and observers shared initial impressions and observations. I then analysed the findings using affinity mapping and rainbow charts to identify patterns and insights from the notes.

    For quantitative research, I used Qualtrics and Mural to organize and categorize data, calculate key statistics, and visualize correlations between variables. I identified trends and conducted statistical and inferential analyses using additional tools as needed. This helped me present clear, actionable insights and validate our hypotheses.

    I presented these findings to key stakeholders, offering recommendations for improvements in design, content, and navigation. I also supported ideation sessions to drive user-centered improvements, to make the NHS App more accessible and user-friendly based on research.

  • I maintained a user needs repository to ensure we were tracking and prioritizing robust user needs, not just usability concerns or solution-focused requirements. These needs served as a benchmark for measuring success, aligning with KPIs, and tracking progress over time. We used the repository for prioritization sessions, scenario planning, risk analysis, and strategic foresight, ensuring that user needs were at the center of decision-making. It also helped us evaluate whether features and functionalities were effectively addressing those needs, ensuring that the app's development remained user-focused.

  • I designed artifacts to define and illustrate findings, helping us understand the system and map user journeys. This included creating personas and journey maps to visualize user experiences.

    I mapped the entire test results journey to align the digital experience with healthcare workflows, ensuring seamless integration between patients and clinicians.

    I created detailed system maps that highlighted touchpoints across multiple platforms, including GP and hospital systems, for a cohesive user experience.

  • Used insights from research to influence policy changes and help scale the solution across NHS regions.

  • Our process was fast-paced, with many ideas generated. I collaborated with the UCD team to prioritize these ideas based on strategic goals, UX and tech effort, clinical and tech risks, user needs, the risk of not implementing, and the criticality of pain points. This approach helped us streamline next steps and define the project roadmap. We conducted this prioritization through a series of workshops.
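
To illustrate how a weighted prioritisation like this can be scored, here is a minimal sketch in Python. The criteria weights, idea names and scores are hypothetical examples for illustration only; the actual prioritisation was carried out collaboratively in workshops rather than with a fixed formula.

    # Minimal sketch of a weighted prioritisation matrix (hypothetical weights and scores).
    # Each idea is scored 1-5 per criterion; effort and risk are scored inversely (5 = low).
    CRITERIA_WEIGHTS = {
        "user_need_criticality": 0.30,
        "strategic_fit": 0.20,
        "ux_and_tech_effort": 0.20,
        "clinical_and_tech_risk": 0.15,
        "risk_of_not_implementing": 0.15,
    }

    ideas = {
        "Test history graph": {
            "user_need_criticality": 5, "strategic_fit": 4, "ux_and_tech_effort": 3,
            "clinical_and_tech_risk": 3, "risk_of_not_implementing": 4,
        },
        "Normal-range indicator": {
            "user_need_criticality": 4, "strategic_fit": 4, "ux_and_tech_effort": 4,
            "clinical_and_tech_risk": 3, "risk_of_not_implementing": 3,
        },
    }

    def weighted_score(scores):
        """Combine the criterion scores for one idea into a single weighted total."""
        return sum(weight * scores[criterion] for criterion, weight in CRITERIA_WEIGHTS.items())

    # Rank ideas from highest to lowest weighted score.
    for name, scores in sorted(ideas.items(), key=lambda item: weighted_score(item[1]), reverse=True):
        print(f"{name}: {weighted_score(scores):.2f}")

In practice the scores came out of the workshop discussion itself; the value of the matrix was in making trade-offs explicit, not in the arithmetic.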


Process visuals

Research findings

During the project, we built a comprehensive User Needs Repository and Insights Repository to track and organise key findings. Through a combination of discovery research and user testing, we were able to validate assumptions and generate actionable insights that informed our design decisions.

  • Personas: We developed detailed personas representing patients with different conditions and access needs, helping us ensure that our designs were inclusive and relevant across different user groups.

  • As-is journeys & blueprints: We mapped current user journeys to identify pain points and inefficiencies, which later informed the service blueprint for the redesigned test result system.

  • Discovery insights: Our initial research uncovered significant issues around user accessibility, usability, and understanding of test results. These findings were foundational in reshaping the test result experience.

  • User testing: After implementing the initial designs, we conducted usability testing with patients and clinicians, refining the solution based on their feedback. These iterative tests and feedback loops helped us ensure that the final solution was both intuitive and impactful.


Figs: Repository, user needs, scenario with persona


Design and implementation

Key improvements to the test result screens include:

  • Improved navigation through refined labels and user flows based on testing

  • Test result history screen added, allowing users to view results over time on a graph for better interpretation

  • Visual indication bars show where individual results lie within the normal range, using amber and green indicators (a simplified sketch of this logic follows this list)

  • Lab Tests Online links added to explain what the different tests measure

  • Improved information hierarchy, including result cards with healthcare professional comments and dates
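
To make the range-indicator behaviour concrete, here is a minimal sketch in Python of the underlying logic; the field names, colour rules and example values are assumptions for illustration, not the NHS App's actual implementation.

    # Minimal sketch: position a result on its reference-range bar and pick a colour band.
    # Field names, colour rules and the example values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class TestResult:
        value: float        # measured value
        range_low: float    # lower bound of the normal (reference) range
        range_high: float   # upper bound of the normal (reference) range

    def indicator(result: TestResult):
        """Return the marker position along the bar (0-1, clamped) and a colour band."""
        span = result.range_high - result.range_low
        position = (result.value - result.range_low) / span
        position = max(0.0, min(1.0, position))  # keep the marker on the bar
        in_range = result.range_low <= result.value <= result.range_high
        return position, ("green" if in_range else "amber")

    # Example: haemoglobin of 140 g/L against a 130-170 g/L reference range
    print(indicator(TestResult(value=140, range_low=130, range_high=170)))  # (0.25, 'green')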

Fig: Changes implemented to the test result screens


Key takeaways

  1. Early user and stakeholder involvement is key: Engaging patients, clinicians, and stakeholders early in the process allowed us to uncover critical issues with accessibility and user comprehension. This helped shape the solution from the start.

  2. Iterative testing leads to better outcomes: The continuous cycle of design, user testing, and refinement allowed us to address real user pain points and make informed adjustments. Frequent testing and iteration led to a more polished, user-friendly product.

  3. Balancing stakeholder needs is challenging: Working with a large and diverse group of stakeholders, including clinicians, IT teams, third-party providers, and policymakers, required clear communication and frequent contact to make sure everyone’s needs were met.

  4. Agility and flexibility matter: In healthcare, adapting the design process and priorities to the changing needs and goals of the NHS helped ensure that the project stayed relevant and responsive. The prototype also evolved constantly as new insights emerged, and we had to stay agile in response.

  5. Research and design documentation is important: Building a user needs repository and continuously collecting insights throughout the project proved critical for making data-driven decisions. This repository allowed us to advocate for user-centered solutions and establish a strong research foundation. Documenting the reasons behind design changes was also key to managing a fast-evolving product: it gave the team the ability to quickly identify the source of changes, understand why they were made, and stay consistent.

  6. Involve team members in the design process: Bringing developers and other team members into the design process early proved invaluable. Their deep understanding of the systems and of technical and policy constraints helped ensure that our designs were not only innovative but also feasible. Their insights often sparked creative solutions that improved the design.

  7. Use the expertise of clinicians and subject matter experts: Clinicians and other subject matter experts played a crucial role in guiding our approach. Their firsthand experience of healthcare helped us design solutions that were both practical and aligned with medical best practices. Collaborating with them early on allowed us to identify opportunities and limitations we wouldn’t have otherwise considered.
