As mentioned in our new year blog, PHE is working on new guidance to help developers of artificial intelligence (AI) understand the process for incorporating their new technologies into screening programmes.
We are developing guidance documents for AI developers.
The UK National Screening Committee (UK NSC) has approved interim guidance for those proposing to use AI in breast screening mammography. We will also shortly produce general guidance about when and how companies should engage with PHE about their AI products.
AI developers should get in touch with the UK NSC evidence team to request the interim AI in mammography guidelines. Feedback on this interim guidance is welcomed. We would also like to hear about what you’re developing and researching. This will help us to produce our guidance and plan for the future.
A brief introduction to AI
AI refers to a range of activities that computers can do, where they think and learn in a similar way to humans. It includes things like:
- understanding human language
- learning how to do complex tasks like playing chess
- developing expertise in areas such as medical diagnosis
- replicating how people speak and see
- robotics and automation (performing tasks with less help from humans)
AI in screening
There have been many significant advances in AI over the past few years. In screening, this could mean computers in future being used to:
- find cancers on a mammogram or to review breast biopsies
- identify people at higher risk of a condition from their medical notes
- help staff to organise and deliver services more efficiently, perhaps by learning where and when clinics should be run
Many developers will want to work with screening programmes to undertake development and research. This may include:
- using data to train AI systems
- using expert knowledge to test and develop tools
- collaborating with local services on research projects
The government has voiced its support for the use of AI in the health service and recently funded a large study into the use of automated mammogram reading in the East Midlands.
PHE Screening is also keen to support and encourage AI developers. However, screening can do harm as well as good, which is why changes to screening programmes are guided by recommendations from the UK NSC. Any potential change to a screening programme – including the addition of new AI technologies – would need to be reviewed by the UK NSC.
Code of conduct
Because of the complex ethical and practical challenges that face AI developers, the government has published a code of conduct for data-driven health and care technology. This sets out 10 principles for safe and effective digital technology.
AI developers should follow this code to ensure they’re developing products which could be used in national screening programmes. In return, the government is committed to encouraging innovation and making it easier for them to work with the NHS.
Use of data
Personal medical data held by the NHS has the potential to support the development of cutting edge new AI technologies. However, this data is confidential and can only be shared with researchers and companies in line with the General Data Protection Regulation (GDPR).
Where data is held by PHE, it will be shared only when approved by the PHE Office of Data Release (ODR). The ODR will establish if there is a legal basis to share it and ensure this is done in a safe and secure manner.
These protections are to ensure the public feel confident their data is being held safely and in confidence.
PHE is planning to invite suppliers to come and show us what they’ve been working on that could be relevant for screening. We’ll blog about this planned day when we have more details.
PHE Screening blogs
PHE Screening blogs provide up to date news from all NHS screening programmes. You can register to receive updates direct to your inbox, so there’s no need to keep checking for new blogs. If you have any questions about this blog article, or about population screening in England, please contact the PHE screening helpdesk.
Comment by Irene Stratton
Maybe it would have been helpful to mention the use of AI in interpretation of digital retinal images as well. This has been approved in the USA and is used routinely in the Netherlands - with the images not being read by technician graders unless the software decides that the images are ungradable.
There has been an HTA project looking at the performance of two of these software packages on routine images (Tufail, Adnan; Rudisill, Caroline; Egan, Catherine; Kapetanakis, Venediktos V.; Salas-Vega, Sebastian; Owen, Christopher G.; Lee, Aaron; Louw, Vern; Anderson, John; Liew, Gerald; Bolter, Louis; Srinivas, Sowmya; Nittala, Muneeswar; Sadda, SriniVas; Taylor, Paul; Rudnicka, Alicja R. (2016) Automated diabetic retinopathy image assessment software: diagnostic accuracy and cost-effectiveness compared with human graders. Ophthalmology).