9th December 2025
Presented by Dr Justin Green
What is it?
OpenPredictor (OP) is an AI-powered platform that supports surgical decision-making and preoperative optimisation. It uses routinely collected patient data to predict perioperative risk, prioritise waiting lists, and guide clinicians in preparing patients for surgery. By integrating with hospital systems, OP automates risk stratification and workflow coordination to identify high-risk patients early and allocate resources efficiently.
Why should I care?
Delays in identifying high-risk patients and inconsistent preoperative assessments can lead to cancellations, inefficiencies, and patient harm. With record surgical waiting lists, OpenPredictor uses AI to flag high-risk patients early and enable timely optimisation. This helps reduce late cancellations, improve theatre utilisation, and ensure teams focus where it matters most.
Does it impact healthcare?
Embedding OP into surgical pathways improves patient prioritisation, preparation, and management before elective surgery, aligning with national goals to reduce waiting times and enhance safety. Its success in orthopaedics is now being extended to other specialties, showcasing how explainable AI can be safely integrated into clinical practice for data-driven efficiency.
About the speaker:
Dr Justin Green, MBBS, PGCert MedEd, MBA
Justin is the CEO and co-founder of OPCI, a MedTech start-up leveraging AI for digital triage. A former orthopaedic registrar and clinical data scientist, he is pursuing a PhD in Artificial Intelligence at Newcastle University with a focus on ethics and responsible AI in healthcare. He also co-leads the Musculoskeletal interest group at The Alan Turing Institute and contributes to digital skills initiatives across Northeast England.
11th November 2025
Presented by Dr Sarim Ather, Consultant Musculoskeletal Radiologist, Director of Oxford Clinical AI Research (OxCAIR) and member of the Royal College of Radiologists AI advisory committee.
What is it?
A practical walkthrough of how to monitor AI tools after deployment: risk categorisation, safety metrics, drift detection, reader monitoring, incident handling, and vendor obligations. I’ll map these to the RCR guidance and emerging NHS assurance pathways.
Why should I care?
Most AI risk emerges in real use. Robust surveillance protects patients, reduces operational surprises, and preserves trust. You’ll leave with a checklist, example KPIs, and templates you can adopt immediately.
Does it impact healthcare?
Yes — better surveillance improves diagnostic quality, reduces harm from model drift, and speeds up scaling from pilots to sustained service, aligning clinical, IG, and procurement teams.
22nd October 2025
Presented by Dr Rhidian Bramley, Clinical Lead for AI, Greater Manchester Cancer Alliance
Consultant Radiologist, The Christie NHS Foundation Trust
What is it?
Chest X-ray (CXR) is the most widely used imaging test and the first-line investigation for suspected lung cancer. This talk will explore how artificial intelligence can support CXR interpretation, and how automation in reporting workflows — such as AI triage, prioritisation of worklists, and identifying high-confidence normal CXRs — can streamline diagnostic pathways and ensure that patients most at risk are seen first.
Why should I care?
Missed cancers and delays in chest X-ray reporting are a major patient-safety concern. With demand increasing and radiology capacity under pressure, AI and automation together can add value not only by flagging suspicious findings but also by automating the flow of cases, reducing turnaround times and identifying cost savings where human intervention is not required.
Does it impact healthcare?
Yes. Embedding automation into reporting processes directly supports the National Optimal Lung Cancer Pathway target of reporting suspected cancers within 24 hours. By combining AI detection with automated prioritisation and identification of high-confidence normal CXRs, radiology departments can redesign services for earlier diagnosis, improved efficiency, and reproducible safety. These lessons extend across diagnostic imaging, with implications for the wider adoption of healthcare AI.
17th September 2025
Presented by Jenny Partridge, Innovation Manager, Health Innovation Kent Surrey Sussex
What is it?
Ambient Voice Technology (AVT), also known as ambient scribes, is rapidly gaining traction among healthcare professionals as a tool to record their consultations and assist with record keeping. While over 150 AVT products were once available in the healthcare market, new NHS England regulations introduced in June 2025 narrowed the field to fewer than 10 compliant solutions (although this number is once again on the rise).
Why should I care?
There is widespread interest in using tools to help reduce the admin burden faced by healthcare workers. AVT is being widely trialled across many different healthcare settings, but with its growing adoption comes critical questions: is AVT governed appropriately? How secure is it? What happens to sensitive patient recordings? How easy is it to use?
Health Innovation Kent Surrey Sussex has led two independent evaluations of AVT implementation into health and social care settings. In this webinar, they’ll share key findings, practical insights, and lessons learned from being at the forefront of AVT deployment.
Does it impact healthcare?
Yes – AVT has the potential to significantly reduce cognitive and administrative load, enhance the accuracy and quality of clinical notes, and allow professionals to focus more fully on patient care, ultimately improving patient experience and outcomes.
30th July 2025
Presented by Haris Shuaib, CEO of Newton's Tree
As AI becomes increasingly embedded in clinical workflows, hospitals face a growing challenge: how to safely evaluate, monitor, and scale these technologies across complex environments.
This talk explores how dedicated AI platforms can bridge the gap between innovation and clinical trust - enabling robust testing, real-world validation, and continuous post-deployment monitoring.
Drawing on case studies, we’ll examine how AI platforms help move from one-off pilots to sustainable, system-wide adoption - ensuring transparency, safety, and accountability at every stage.
11th June 2025
Presented by Prof Alicja Rudnicka, Professor in Statistical Epidemiology in the Population Health Research Institute (PHRI), City St George’s, University of London
What is it?
The presentation will explore the potential of artificial intelligence (AI)-based algorithms for analysing retinal images to transform screening and secondary prevention using diabetic eye disease as the use case. The evaluation of AI algorithms as medical devices in real-world settings introduces new challenges in study design regarding algorithmic fairness, equitable evaluations and ultimately trustworthiness.
Why should I care?
Evaluations of new AI algorithms licensed as medical devices are growing. There is a need to understand the importance of contextual factors in real-world evaluations if we want to unlock the potential benefits of AI health technologies safely and at speed. We need rapid evaluations in the intended healthcare setting prior to live/pilot implementation of AI-based algorithms for analysing medical images.
Does it impact healthcare?
Yes, we need to ensure that software as a medical device that incorporates AI technology serves the intended healthcare setting for the desired purpose. Using real-world evaluation studies will not only educate relevant stakeholders but it will also build trust in these innovations. Only then can AI be developed to best serve patients and healthcare providers in order to create useful solutions to assist service delivery, accuracy, efficiency and cost-effectiveness.
13th May 2025
Presented by Dr Angus Ramsay, Principal Research Fellow, Department of Behavioural Sciences and Health, University College London
Dr Rachel Lawrence, Qualitative Research Fellow, University College London
Dr Kevin Herbert, Health Economist, University of Cambridge
What is it?
Artificial Intelligence (AI) may support accurate diagnosis, reduced errors, and increased efficiency in radiology services. However, little is known about implementing AI for diagnostics in real-world settings. Our team is analysing implementation, experiences, and impact of AI for chest diagnostics, as deployed in over 60 NHS hospital organisations through NHS England funding initiatives.
Why should I care?
We will present learning on procurement and early deployment of AI tools to support chest diagnostics. We will also discuss how we are currently analysing real world use of these tools, including staff, patient, and carer experiences, and impact on service delivery and resource use.
Does it impact healthcare?
Yes. Our findings demonstrate that procurement and deployment of AI tools are complex processes, influenced by social and technical factors, and requiring significant resources - including time, capacity, and expertise. These are likely to influence any similar efforts to implement AI at scale in NHS diagnostic services.
14th April 2025
Presented by Prof Alastair Denniston, Director of CERSI-AI
Professor of Regulatory Science and Innovation, University of Birmingham
What is it?
We are used to the idea that regulators ask manufacturers for evidence relating to their product before it can be put on the market. But what about the other way round? Is it reasonable for manufacturers – or society more generally – to ask regulators to provide evidence that their approaches are appropriate for the technology being evaluated: effective and safe, but also proportionate and efficient?
The concept of ‘regulatory science’ underscores the need for using scientific methodology to innovate, evaluate, and iterate our regulatory frameworks to ensure they are based on evidence rather than narrative or politics.
Why should I care?
If we want to unlock the potential benefits of AI health technologies safely and at speed, we need our regulatory systems to be evidence-based and underpinned by scientific methodology. The UK Government has recently awarded funding to CERSI-AI, a national centre of excellence to support regulatory science and innovation in AI and Digital Health Technologies.
Does it impact healthcare?
Yes, we need to ensure that our regulatory systems are smart enough to meet the challenges and opportunities of emerging technologies like AI, enabling beneficial innovation and early adoption that benefits patients whilst also protecting against harm and ensuring trust.
10th March 2025
Presented by Charlie Stephens, NHS England’s Head of Commercial for IRLSS (Innovation, Research, Life Science Strategies) and MedTech.
What is it?
My talk is about innovation and procurement: how the two are not always natural bedfellows, how to procure innovative solutions, and how to use innovative procurement techniques to achieve best value.
Why should I care?
There has never been a more opportune time to reconsider how we procure for innovation, because on 24 February 2025, the rules that shape how public bodies buy goods and services changed. The Procurement Act 2023 was introduced with the aim of improving and streamlining the way procurement is done. It is intended to be more flexible for prospective suppliers of all sizes, particularly small businesses, start-ups and social enterprises. These are all positive developments which could unlock the benefits of innovation for the NHS; however, to reap these potential rewards we need to change the way we procure accordingly.
Does it impact healthcare?
The impact on healthcare is potentially huge and positive, and I look forward to discussing these opportunities during the webinar.
13th February 2025
Presented by Dr Matthew Dolman, Complex Care GP in Somerset
What is it?
Brave AI by Bering Ltd. is an AI tool that analyses primary care data to predict patients at risk of unplanned hospital admissions, enabling proactive and personalised health and care interventions.
Why should I care?
It helps health and care professionals provide early, personalised care, reducing unplanned hospital admissions, emergency visits, and strain on NHS resources.
Does it impact healthcare?
Yes, it has been shown to reduce falls, ambulance callouts, and emergency visits in pilot programmes, improving patient outcomes and healthcare efficiency. It was piloted clinically in Somerset with great success and is now being used clinically in several sites across the region. It is also technically ready for use in 21 sites across the South West of England.
15th January 2025
Presented by Dr Tafsir Ahmed, National Medical Director's Clinical Fellow to the Care Quality Commission (CQC)
What are the uses of AI in General Practice?
AI is used in GP to improve administrative efficiency, to support diagnosis and treatment planning, and to process clinical information using generative AI and large language models (LLMs).
Does it impact healthcare?
AI is already widely utilised in GP – with one in five GPs using ChatGPT in clinical practice (BMJ survey, 2022), and one in four doctors using AI regularly in clinical practice (The Alan Turing Institute and GMC, 2024). AI is identified as a potential solution to the NHS's issues in Lord Darzi's 2024 report and is likely to take a pivotal role in the upcoming NHS 10-year plan.
Why should I care?
The Care Quality Commission (CQC) is currently undertaking a scoping project on stakeholder perspectives on AI use in GP to inform its regulatory approach. Please join the webinar to be informed of the current landscape of AI in GP and the provisional themes the CQC has drawn from stakeholder conversations. Most importantly, take your chance to be part of the discussions as a stakeholder and shape the future of AI deployment, use, and concurrent regulatory and safety approaches in clinical practice.
Join the incubator community to receive e-mails about upcoming events!
Or email us at ai.incubator@uhb.nhs.uk