10th December 2024
Presented by George Onisiforou, NHS England
NHS Fellowship in Clinical AI Presented by Beatrix Fletcher, Guy's and St Thomas' NHS Foundation Trust
MSc AI Implementation (Healthcare) launching in 2025/26 Presented by Dr Jeffry Hogg, Dr Slava Jenkin and Dr Anirban Dutta, University of Birmingham
What AI training needs are there for the healthcare workforce?
An NHS England report from 2022 highlighted five roles that different parts of the healthcare workforce will take on around AI. One key need raised was for 'Embedders': people with the capabilities to lead AI implementation efforts in their place of work.
Why should I care?
The University of Birmingham is launching a new educational programme to expand the limited opportunities for people from varied backgrounds to gain the skills and knowledge required for these Embedder roles.
Does it impact healthcare?
Healthcare policy makers and academics have pointed to workforce readiness as a key barrier to responsible AI innovation for several years. Embedders often start their work on AI implementation projects two years or more before it affects care, so the impact of this training need will persist even after it is met.
This 2-part webinar will begin with an overview of the NHS workforce needs for AI use from one of the authors of the 2022 NHS England report. The second part will focus on training opportunities to meet these needs with a focus on new courses from faculty at the University of Birmingham.
21st November 2024
Presented on behalf of Adam Byfield by Anusha Jose, Ben Wallace, and William Poulett, AI Quality Community of Practice, NHS England
What is it?
An overview of NHS England’s AI Quality Community of Practice (AIQ CoP) and case studies demonstrating their work.
Why should I care?
Traditional assurance techniques often don’t work for AI while industry standard AI assurance techniques often don’t work for healthcare. As AI is now appearing everywhere throughout healthcare organisations and systems, it is more important than ever that this technology is sufficiently assured before use.
Does it impact healthcare?
Absolutely! The AIQ CoP exists specifically to support and encourage the detailed, technical assurance of a wide range of healthcare AI, both administrative and clinical.
8th October 2024
Presented by: Dr Lucia De Santis, Lucy Gregory, Professor Carl Macrae, Dr Joe Alderman, Professor Alastair Denniston, Rebecca Boffa, Martin Nwosu, Russel Pearson, Gemma Warren, and Moritz Flockenhaus
Highlights include:
Talks and discussions with the AI and Digital Regulations Service (a collaboration between NICE, MHRA, HRA & CQC)
Evaluating an AI model for your population
Insights from an NHSE commissioned review of AI
Locally tailored economic evaluation of clinical AI
Safety monitoring in automated systems
Recording, presentation slides and agenda are available here: Using Artificial Intelligence (AI) Responsibly 8th Oct 2024 – HDR UK Midlands
10th September 2024
Presented by: Dr Russel Pearson
What is an Intended Purpose Statement?
An intended purpose statement is a document which clearly sets out what a medical device is approved to do, for whom, and in what settings. This is just as important for medical devices that use AI as for any other medical device.
Why should I care?
AI technologies are developed and tested for very specific tasks. Even if we use them for similar but different tasks, they may not perform as well as expected, bringing risks of error or harm.
Does it impact healthcare?
Yes. It has a big impact on AI companies, as it is the foundation of their work to build their products and have them validated. For those providing healthcare it is also really important, so that they understand how an AI product can be used in the care they provide.
Recording unavailable.
For more information visit: Crafting an intended purpose in the context of software as a medical device (SaMD) - GOV.UK and Intended Use — Hardian Health
5th August 2024
Presented by Dr Kavitha Vimalesvaran, Cardiac Imaging Fellow at Imperial College London
What is it?
We explore the role of AI in healthcare through the lens of clinical trials and research. Unlike traditional pharmaceutical trials, AI clinical trials must address the dynamic nature of AI algorithms and their ability to improve over time with new data. Additionally, these trials require new technology or AI agents to be seamlessly integrated into existing clinical workflows and structures.
Why should I care?
AI has the potential to transform healthcare by enhancing diagnostic accuracy, streamlining workflows, and improving patient outcomes. For healthcare professionals, researchers, and policymakers, understanding how AI is evaluated and implemented through clinical trials is crucial.
Does it impact healthcare?
Yes. The integration of AI in clinical practice has broad implications for healthcare. AI can lead to earlier and more accurate diagnoses, personalised treatment plans, and improved patient monitoring and management. By enhancing the efficiency and effectiveness of healthcare delivery, AI holds the promise of reducing costs and improving patient outcomes across various medical fields. Understanding the successful integration of AI through clinical trials is essential for realising its full potential in transforming healthcare.
24th June 2024
Presented by: Dr Aditya (Adi) Kale, University of Birmingham
What is it?
Medical algorithmic audits are a particular way in which a healthcare provider can check that the clinical AI tools that they use continue to work well. Ideally, this process is done collaboratively with input from both the company that develops the AI tools, and the organisation delivering care (e.g. a hospital).
Why should I care?
One of the differences between clinical artificial intelligence and other health technologies is its tendency to perform differently across clinical settings. Even in a single setting, performance tends to change over time. Without robust monitoring in place, AI-enabled healthcare may stop helping patients in the way we expect, without us realising.
Does it impact healthcare?
Yes, there are many AI-enabled healthcare pathways serving patients right now. The only way that we can know that they are doing a good job is by regular monitoring of their performance and trying to solve any problems that we find.
14th May 2024
Presented by Professor Ibrahim Habli
As Artificial Intelligence (AI) promises to reshape healthcare, a clear understanding of AI safety is essential. This talk proposes a comprehensive definition of AI safety within the healthcare domain, examining how challenges like under-specificity and opacity can introduce new risks for patients. It advocates for proactive safety measures and transparent risk management to ensure AI's potential is realised without compromising patient safety and undermining trust in the healthcare system.
Recording available on request: ai.incubator@uhb.nhs.uk
18th April 2024
Presented by Hamza Hussain
What are procurement frameworks?
Procurement frameworks are agreements that enable NHS organisations to buy services and goods from suppliers that have met markers of safety and usefulness.
Why should I care?
Of the estimated £30 billion the NHS spends on goods and services from other companies each year, around £25 billion is spent through procurement frameworks. It is much easier for trusts and ICBs to adopt clinical AI products listed on a procurement framework, and easier for manufacturers to sell products listed on one, without having to go through a full open procurement process.
Does it impact healthcare?
Yes. Procurement frameworks are in line with procurement legislation, ensuring that healthcare projects are legally compliant. They can also shorten the timelines for implementing clinical projects and support the adoption of the most promising clinical AI products.
Recording available on request: ai.incubator@uhb.nhs.uk
27th March 2024
A collaborative event with Data Science for Health Equity (DSxHE) as part of the Alan Turing Institute's AI UK fringe.
This interactive online session brought together researchers from the STANDING Together project and a few special guests for a free-flowing discussion delving into the intricacies of participatory practices in healthcare AI research. What works? What doesn’t? And when research budgets are tight and time is limited, why should this be a priority?
18th March 2024
Presented by Dr Hugh Harvey
What is an LLM?
Large language models, or LLMs, are a type of artificial intelligence trained on very large amounts of text, which allows them to generate new text by predicting sequences of words. Examples include ChatGPT and Bard.
Why should I care?
In a short time, LLMs have already become part of many people's routine work. Using LLMs has often made certain tasks faster, and sometimes improved how well they can be done.
Does it impact healthcare?
Not yet. There has been a great deal of interest and investment in using LLMs to improve healthcare, but this has not been done before, and those responsible for their use will have to deal with new challenges, including regulation.
Join the incubator community to receive e-mails about upcoming events!
Or email us at ai.incubator@uhb.nhs.uk