Regulation of Health Technologies: an art or a science?
Presented by Prof Alastair Denniston, Director of CERSI-AI
Professor of Regulatory Science and Innovation, University of Birmingham
Monday 14th April 2025
What is it?
We are used to the idea that regulators ask manufacturers for evidence relating to their product before it can be put on the market. But what about the other way round? Is it reasonable for manufacturers – or society more generally – to ask regulators to provide evidence that their approaches are appropriate for the technology being evaluated: effective and safe, but also proportionate and efficient?
The concept of ‘regulatory science’ underscores the need to use scientific methodology to innovate, evaluate, and iterate our regulatory frameworks, ensuring they are based on evidence rather than narrative or politics.
Why should I care?
If we want to unlock the potential benefits of AI health technologies safely and at speed, we need our regulatory systems to be evidence-based and underpinned by scientific methodology. The UK Government has recently awarded funding to CERSI-AI, a national centre of excellence to support regulatory science and innovation in AI and Digital Health Technologies.
Does it impact healthcare?
Yes. We need to ensure that our regulatory systems are smart enough to meet the challenges and opportunities of emerging technologies like AI, enabling beneficial innovation and early adoption that benefits patients whilst also protecting against harm and ensuring trust.
Useful links:
https://www.cersi-ai.org/
https://www.digitalregulations.innovation.nhs.uk/
https://www.digitalhealthincubator.ai/webinars/safe
NHS England - Procurement and Innovation
Presented by Charlie Stephens, NHS England’s Head of Commercial for IRLSS (Innovation, Research, Life Science Strategies) and MedTech.
Monday 10th March 2025
What is it?
My talk is about innovation and procurement: how the two are not always natural bedfellows, how to procure innovative solutions, and how to use innovative procurement techniques to achieve best value.
Why should I care?
There has never been a more opportune time to reconsider how we procure for innovation, because on 24 February 2025 the rules that shape how public bodies buy goods and services changed. The Procurement Act 2023 was introduced with the aim of improving and streamlining the way procurement is done; it is intended to be more flexible for prospective suppliers of all sizes, particularly small businesses, start-ups and social enterprises. These are all positive developments which could unlock the benefits of innovation for the NHS. However, to reap these potential rewards we need to change the way we procure accordingly.
Does it impact healthcare?
The impact on healthcare is potentially huge and positive; I look forward to discussing these opportunities during the webinar.
Brave AI Deployment in the South West, England
Presented by Dr Matthew Dolman, Complex Care GP in Somerset
Thursday 13th February 2025
What is it?
Brave AI, developed by Bering Ltd., is an AI tool that analyses primary care data to identify patients at risk of unplanned hospital admissions, enabling proactive and personalised health and care interventions.
Why should I care?
It helps health and care professionals provide early, personalised care, reducing unplanned hospital admissions, emergency visits, and strain on NHS resources.
Does it impact healthcare?
Yes, it has been shown to reduce falls, ambulance callouts, and emergency visits in pilot programmes, improving patient outcomes and healthcare efficiency. It was piloted clinically in Somerset with great success and is now being used clinically in several sites across the region. It is also technically ready for use in 21 sites across the South West of England.
Useful links:
https://leap-hub.ac.uk/training-courses/
Perspectives on AI use in General Practice
15th January 2025
Presented by Dr Tafsir Ahmed
What are the uses of AI in General Practice?
AI is used in general practice to improve administrative efficiency, to support diagnosis and treatment planning, and for information and clinical processing with generative AI and large language models (LLMs).
Does it impact healthcare?
AI is already widely utilised in general practice, with one in five GPs using ChatGPT in clinical practice (BMJ survey, 2024) and one in four doctors using AI regularly in clinical practice (The Alan Turing Institute and GMC, 2024). AI is identified as a potential solution to the NHS’s problems in Lord Darzi’s 2024 report and is likely to take a pivotal role in the upcoming NHS 10-year plan.
Why should I care?
The Care Quality Commission (CQC) is currently undertaking a scoping project on stakeholder perspectives on AI use in general practice to inform its regulatory approach. Please join the webinar to hear about the current landscape of AI in general practice and the provisional themes drawn from stakeholder conversations. Most importantly, take the chance to be part of the discussion as a stakeholder and shape the future of AI deployment, use, and the accompanying regulatory and safety approaches in clinical practice.
AI Training for the Healthcare Workforce
Presented by George Onisiforou, NHS England
NHS Fellowship in Clinical AI
Presented by Beatrix Fletcher, Guy's and St Thomas' NHS Foundation Trust
MSc AI Implementation (Healthcare), launching in 2025/26
Presented by Dr Jeffry Hogg, Dr Slava Jenkin and Dr Anirban Dutta, University of Birmingham
10th December 2024
What AI training needs are there for the healthcare workforce?
An NHS England report from 2022 highlighted five roles that different parts of the healthcare workforce will take around AI. One key need raised was for ‘Embedders’, who have the capabilities to lead AI implementation efforts in their place of work.
Why should I care?
The University of Birmingham is launching a new educational programme to expand the limited opportunities for people from varied backgrounds to gain the skills and knowledge required for these Embedder roles.
Does it impact healthcare?
Healthcare policy makers and academics have pointed to workforce readiness as a key barrier to responsible AI innovation for several years. Embedders often begin work on AI implementation projects two years or more before those projects affect care, so the impact of this training gap will persist even after it is addressed.
This two-part webinar will begin with an overview of the NHS workforce needs for AI use from one of the authors of the 2022 NHS England report. The second part will cover training opportunities to meet these needs, highlighting new courses from faculty at the University of Birmingham.
Assuring Artificial Intelligence in Healthcare
21st November 2024
Presented by: Anusha Jose on behalf of Dr Adam Byfield
What is it?
An overview of NHS England’s AI Quality Community of Practice (AIQ CoP) and case studies demonstrating their work.
Why should I care?
Traditional assurance techniques often don’t work for AI, while industry-standard AI assurance techniques often don’t work for healthcare. As AI now appears throughout healthcare organisations and systems, it is more important than ever that this technology is sufficiently assured before use.
Does it impact healthcare?
Absolutely! The AIQ CoP exists specifically to support and encourage the detailed, technical assurance of a wide range of healthcare AI, both administrative and clinical.
Using Artificial Intelligence (AI) Responsibly
8th October 2024
Presented by: Dr Lucia De Santis, Lucy Gregory, Professor Carl Macrae, Dr Joe Alderman, Professor Alastair Denniston, Rebecca Boffa, Martin Nwosu, Russel Pearson, Gemma Warren, and Moritz Flockenhaus
Highlights include:
Talks and discussions with the AI and Digital Regulations Service (a collaboration between NICE, MHRA, HRA & CQC)
Evaluating an AI model for your population
Insights from an NHSE commissioned review of AI
Locally tailored economic evaluation of clinical AI
Safety monitoring in automated systems
Recording, presentation slides and agenda are available here: Using Artificial Intelligence (AI) Responsibly 8th Oct 2024 – HDR UK Midlands
Why is the Intended Purpose Statement critical, and what should it contain?
10th September 2024
Presented by: Dr Russel Pearson
What is an Intended Purpose Statement?
An intended purpose statement is a document which clearly sets out what a medical device is approved to do, for whom, and in what settings. This is just as important for medical devices that use AI as for any other medical device.
Why should I care?
AI technologies are developed and tested for very specific tasks. Even if we use them for similar but different tasks, they may not perform as well as expected, bringing risks of error or harm.
Does it impact healthcare?
Yes. It has a big impact on AI companies, as it is the foundation on which they build their products and have them validated. It is also really important for those providing healthcare, so that they understand how AI can be used in the care they provide.
Recording unavailable.
For more information visit: Crafting an intended purpose in the context of software as a medical device (SaMD) - GOV.UK and Intended Use — Hardian Health
Clinical AI monitoring: Medical algorithmic audit case studies
24th June 2024
Presented by: Dr Aditya (Adi) Kale
What is it?
Medical algorithmic audits are a particular way in which a healthcare provider can check that the clinical AI tools that they use continue to work well. Ideally, this process is done collaboratively with input from both the company that develops the AI tools, and the organisation delivering care (e.g. a hospital).
Why should I care?
One of the differences between clinical artificial intelligence and other health technologies is its tendency not to perform consistently across different clinical settings. Even in a single setting, performance tends to change over time. That means that if robust monitoring approaches are not in place, AI-enabled healthcare may, without us realising, not actually be helping patients in the way we expect.
Does it impact healthcare?
Yes, there are many AI-enabled healthcare pathways serving patients right now. The only way that we can know that they are doing a good job is by regular monitoring of their performance and trying to solve any problems that we find.
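As a rough illustration of the kind of check a medical algorithmic audit might automate, here is a minimal Python sketch that compares a tool's locally measured sensitivity against the baseline reported in its pre-deployment validation study and raises a flag when performance drifts. All names, figures and thresholds below are hypothetical, not taken from the talk.

def sensitivity(predictions, labels):
    # Fraction of truly positive cases (label == 1) that the tool flagged.
    positives = [p for p, y in zip(predictions, labels) if y == 1]
    return sum(positives) / len(positives) if positives else None

def drift_alert(baseline, current, tolerance=0.05):
    # Flag when locally measured performance falls more than `tolerance`
    # below the validation-study baseline.
    return current is not None and (baseline - current) > tolerance

# Hypothetical monthly audit data: 0/1 tool outputs and ground-truth labels.
baseline_sensitivity = 0.92
monthly_predictions = [1, 1, 0, 1, 0, 0, 1]
monthly_labels = [1, 1, 1, 1, 0, 0, 1]

current = sensitivity(monthly_predictions, monthly_labels)
if drift_alert(baseline_sensitivity, current):
    print(f"Audit flag: sensitivity {current:.2f} is below baseline {baseline_sensitivity:.2f}")

In practice an audit would examine more than one metric and stratify results by patient subgroup, but the core loop is the same: measure locally, compare against the expected baseline, and investigate any gap.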
Defining AI Safety for Healthcare and Beyond: Unique Risks and Considerations
14th May 2024
Presented by Professor Ibrahim Habli
As Artificial Intelligence (AI) promises to reshape healthcare, a clear understanding of AI safety is essential. This talk proposes a comprehensive definition of AI safety within the healthcare domain, examining how challenges like under-specificity and opacity can introduce new risks for patients. It advocates for proactive safety measures and transparent risk management to ensure AI's potential is realised without compromising patient safety or undermining trust in the healthcare system.
Recording available on request: ai.incubator@uhb.nhs.uk
A Primer on NHS SBS Procurement Frameworks and what they mean for clinical AI adoption
18th April 2024
Presented by Hamza Hussain
What are procurement frameworks?
Procurement frameworks are agreements that enable NHS organisations to buy services and goods from suppliers that have met markers of safety and usefulness.
Why should I care?
Of the estimated £30 billion the NHS spends on goods and services from other companies each year, around £25 billion is spent through procurement frameworks. It is much easier for trusts and ICBs to adopt clinical AI products that are listed on a procurement framework, and it is easier for manufacturers to sell products listed on a framework without having to go through a full open procurement process.
Does it impact healthcare?
Yes. Procurement frameworks comply with procurement legislation, ensuring that healthcare projects are legally compliant. They can also shorten the timelines for implementing clinical projects and support the adoption of the most promising clinical AI products.
Recording available on request: ai.incubator@uhb.nhs.uk
From data subjects to design partners: exploring participatory principles in health AI research
27th March 2024
A collaborative event with Data Science for Health Equity (DSxHE) as part of the Alan Turing Institute's AI UK fringe.
This interactive online session brought together researchers from the STANDING Together project and a few special guests for a free-flowing discussion delving into the intricacies of participatory practices in healthcare AI research. What works? What doesn’t? And when research budgets are tight and time is limited, why should this be a priority?
How to get regulatory approval for an LLM-enabled medical device
18th March 2024
Presented by Dr Hugh Harvey
What is an LLM?
Large language models, or LLMs, are a type of artificial intelligence trained on very large amounts of text, which allows them to generate new text by predicting sequences of words. Examples include ChatGPT and Bard.
Why should I care?
In a short time, LLMs have already become part of many people’s routine work. LLMs have often improved the speed at which people can do certain tasks, and sometimes how well they can do them.
Does it impact healthcare?
Not yet. There has been a lot of interest and investment in using LLMs to improve healthcare, but it has not been done before, and those responsible for their use will have to deal with new challenges, including regulation.
For more information, email us at ai.incubator@uhb.nhs.uk