NHS England · 2021 – 2022

Clinical Decision
Support System

Role Principal Product Designer
Scope Research · Design · Delivery
Team 4 designers · 4 PMs · 30+ engineers
Platform Web · Clinical workstations
Clinical interface — decision support dashboard

AI-assisted diagnostics for frontline clinicians

NHS England processes over 1.2 million patient contacts every 24 hours. Frontline clinicians — GPs, A&E doctors, nurses — make hundreds of consequential decisions each shift, often under significant time pressure and with incomplete information. Missed diagnoses and delayed referrals remain a major challenge across the system.

I led the design of a clinical decision support system that uses AI to surface relevant diagnostic pathways, flag potential risk indicators, and recommend evidence-based next steps — all integrated into the existing clinical workflow rather than sitting alongside it as a separate tool.

23%
Reduction in missed high-risk referrals in pilot cohort
91%
Clinician confidence score in AI-generated pathway suggestions
8 min
Average time saved per complex consultation in usability trials

High stakes, high variance, zero margin for ambiguity

Clinical environments are unlike any other design context. Users range from newly qualified junior doctors to consultants with 30 years of experience. Cognitive load is extreme. Interruptions are constant. And every decision can have irreversible consequences for a patient.

Previous attempts at clinical AI tooling in the NHS had failed for predictable reasons: they added steps rather than removing them, they surfaced too many alerts (causing alert fatigue), and they didn't account for the diversity of clinical judgment. Our research found that experienced clinicians were actively bypassing decision support tools because they felt patronising. Any new system would need to respect clinical expertise, not attempt to replace it.

Contextual research — clinical observation sessions & workflow mapping

Designing with clinicians, not for them

The design process was deeply embedded in clinical settings. I spent over 120 hours observing consultations across three NHS trusts — A&E departments, GP surgeries, and acute wards — before a single wireframe was produced. This was essential: the workflow nuances that emerged could not have been captured through interviews alone.

Co-design workshops with clinicians across experience levels produced a set of non-negotiable principles: the system must never block a clinical action, AI suggestions must be immediately dismissable, every flag must include its reasoning, and the visual design must not compete for attention with the patient record.

We built and tested eight prototype iterations over six months, using simulation lab sessions with standardised patient scenarios. Each round surfaced critical interaction failures — particularly around how urgency was communicated and how the system behaved when clinical judgment disagreed with the AI recommendation. The final model treats the AI as a peer offering a second opinion, not an authority issuing instructions.

A tool clinicians trust enough to rely on

The system launched as a pilot across four NHS trusts and demonstrated a 23% reduction in missed high-risk referrals within the first six months. Crucially, voluntary adoption was high — 84% of clinicians in the pilot cohort chose to use it regularly, compared to 30–40% adoption rates typical of mandated NHS digital tools.

The accessibility framework and alert severity design language developed for this project were subsequently adopted by NHS Digital as recommended standards for AI-assisted clinical tools across the health service. The co-design methodology is now published as an NHS guidance document for future clinical AI deployments.
