
From supporting colonoscopies to detecting strokes: the diverse applications of AI in healthcare

In September, 42 AI projects joined the NHS AI Lab through its AI in Health and Care Award programme.

The programme is run in partnership with the Accelerated Access Collaborative and NHSX, and forms part of the NHS Artificial Intelligence Lab announced by Matt Hancock in 2019. £140m is to be invested over three years to accelerate the testing and evaluation of AI technologies. More recently, the call for projects for round 2 of the programme closed.

In this article we take a look at some of the projects that joined round 1 of the programme. From early-stage developers to established suppliers, here are some of the ways AI is currently being used, and could be used, in health and care:

Odin Vision – CADDIE

A spin-out company from UCL was, back in September, awarded funding to progress the development of the CADDIE artificial intelligence system.

The company behind the project, Odin Vision, has been developing the technology to help doctors detect and characterise early signs of bowel cancer.

The CADDIE system analyses live colonoscopy video to identify and characterise polyps; the system itself is based on academic research by UCL AI expert Professor Danail Stoyanov.

According to Odin Vision, ‘25% of polyps can be missed’ during diagnosis due to the ‘challenging’ nature of detection. The system therefore provides AI decision support by ‘highlighting areas of the large bowel with visual characteristics consistent with different types of colonic polyps.’
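
To illustrate the general idea, the sketch below shows what frame-by-frame decision support of this kind might look like. It is not Odin Vision's code: the detector, video source, and drawing choices are all placeholder assumptions.

```python
import cv2  # OpenCV for video capture and on-screen drawing

def load_polyp_detector():
    # Stand-in for a trained detection model; a real system would run a
    # neural network here. Returns (box, label, score) tuples per frame.
    return lambda frame: []

detector = load_polyp_detector()
cap = cv2.VideoCapture("colonoscopy_feed.mp4")  # hypothetical file or live feed

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Highlight each region whose visual characteristics match a polyp type
    for (x, y, w, h), polyp_type, score in detector(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, f"{polyp_type} {score:.2f}", (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("AI decision support overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```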

Ultromics – EchoGo

The AI solution EchoGo, developed by Ultromics, helps to predict heart disease in patients and is being implemented in NHS hospitals across England.

The system analyses ultrasound scans of the heart using machine learning algorithms, aiming to improve the efficiency of heart disease diagnosis in order to prevent heart attacks.

EchoGo is a cloud AI service to automate echocardiography analysis in order to allow clinicians to save time in their workflows. The cloud service automates cardiac measurements from echocardiograms and is said to deliver ‘near instant’ reports with ‘precise’ results.
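
One of the standard measurements such a pipeline automates is the left-ventricular ejection fraction, which is a simple ratio once the end-diastolic and end-systolic volumes have been extracted from the scan. The snippet below shows that final calculation only; EchoGo's actual pipeline is proprietary, and the volume extraction step is assumed to have happened upstream.

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Left-ventricular ejection fraction (%) from end-diastolic and
    end-systolic volumes, one of the standard echo measurements."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV, EDV > 0")
    return 100.0 * (edv_ml - esv_ml) / edv_ml

print(ejection_fraction(120.0, 50.0))  # ~58.3%, within the normal range
```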

Developer Ultromics is a health-tech company based in Oxford, spun out from Oxford University.

As of December 4th, EchoGo is available through the NHS supply chain framework, and development of the system has been taking place across 30 NHS sites since 2017.

Kheiron – Mia

In the UK, mammograms are read by two independent radiologists (‘double reading screening’) in order to determine whether the patient needs to return for further assessment; the AI system Mia has been studied as a potential replacement for one of these readings.

When tested across 40,000 mammograms, Mia as a second radiologist showed almost identical accuracy in cancer detection, while reducing the reading time for the second human by 81.9%.

Kheiron states that Mia with one human reader provides a ‘meaningful solution capable of addressing the workforce crisis in breast screening’. The UK always screens for breast cancer with two radiologists, as opposed to the US, which uses only one.

The Mia system suggests a decision for the entire case, just as an expert mammographer would. Mia also highlights regions of interest, marking relevant areas on a scan with a low false positive rate.
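
The double-reading workflow itself is straightforward to sketch: two independent reads, with any disagreement sent to arbitration. Below is a minimal illustration of that logic, assuming Mia's output is substituted as the second read; the arbitration policy shown is a made-up placeholder, not Kheiron's.

```python
from enum import Enum

class Decision(Enum):
    RECALL = "recall for further assessment"
    ROUTINE = "routine rescreening"

def double_read(first_reader: Decision, second_reader: Decision,
                arbitrate) -> Decision:
    """Simplified double-reading workflow: agreement is final, and
    disagreement goes to arbitration (e.g. a third reader or consensus).
    In the configuration studied, `second_reader` would be Mia's output."""
    if first_reader == second_reader:
        return first_reader
    return arbitrate(first_reader, second_reader)

# Toy arbitration policy (hypothetical): err on the side of recall.
outcome = double_read(Decision.RECALL, Decision.ROUTINE,
                      arbitrate=lambda a, b: Decision.RECALL)
print(outcome)
```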

FindAir – AsthmaAI

FindAir has developed an asthma AI system to identify the main triggers of asthma attacks in patients. The system uses data gathered by FindAir’s monitoring system and provides ‘real-time predictions’ of asthma and COPD exacerbations.

The AI software operates in the background on a patient’s smartphone, while a device dubbed ‘FindAir ONE’ is mounted on top of the inhaler’s canister.

The patient uses their inhaler in the same manner as usual and FindAir ONE gathers information from each use. This gives the patient and their GP a deeper understanding of what causes asthma exacerbations, potentially helping them avoid future attacks.

The collected information feeds a model that informs the patient about what exacerbates their symptoms; grass pollen and air quality in the patient’s area are two variables that the system can advise on.
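
As a rough illustration of how such variables might be combined, the toy model below scores exacerbation risk with a logistic function over reliever use, pollen, and air quality. The features mirror the variables named above, but the weights and the model form are invented for the example and are not FindAir's.

```python
import math

def exacerbation_risk(reliever_uses_24h: int, pollen_index: float,
                      air_quality_index: float) -> float:
    """Toy logistic risk model. The weights below are made up for
    illustration; they are not FindAir's parameters."""
    z = (-3.0
         + 0.45 * reliever_uses_24h   # heavy reliever use raises risk
         + 0.02 * pollen_index        # grass pollen in the patient's area
         + 0.015 * air_quality_index) # poor local air quality
    return 1.0 / (1.0 + math.exp(-z))

# A day with heavy reliever use, high pollen and poor air quality
print(f"{exacerbation_risk(6, 80, 120):.0%}")  # ~96%
```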

Ufonia – Autonomous Telemedicine 

Ufonia have partnered with Buckinghamshire Healthcare NHS Trust, Oxford University Hospitals NHS Trust, and Imperial College Healthcare NHS Trust to roll out their autonomous telemedicine system.

The system uses AI to monitor patient health through a ‘conversation with a medical device voice chat bot’ for clinical follow up.

Ufonia have been working with Oxford AHSN since 2017 and built a system to assess outcomes after knee surgery using the ‘Oxford Knee Score’.
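
The Oxford Knee Score itself is a standard patient-reported questionnaire: 12 items, each answered on a 0–4 scale, summed to a total from 0 (worst) to 48 (best). A voice assistant administering it only needs to collect the 12 responses and score them, as in this minimal sketch:

```python
def oxford_knee_score(responses: list[int]) -> int:
    """Score the Oxford Knee Score questionnaire: 12 items, each answered
    0 (worst) to 4 (best), giving a total from 0 to 48."""
    if len(responses) != 12:
        raise ValueError("the OKS has exactly 12 questions")
    if any(not 0 <= r <= 4 for r in responses):
        raise ValueError("each response is scored from 0 to 4")
    return sum(responses)

# Responses collected over the phone by the voice assistant (illustrative)
print(oxford_knee_score([3, 4, 2, 3, 4, 3, 2, 4, 3, 3, 4, 2]))  # 37
```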

The project is currently projected to run until April 2021 and is focused on follow-up for patients after cataract surgery.

Ufonia state that the system is accessible to all patients and requires no new digital technology skills, just ‘a conversation on the phone’, and is scalable enough to support ‘care pathways across the world.’

The system frees staff from repetitive tasks, allowing them to deliver higher-value care where it is most needed.

Neuronostics – BioEP

The BioEP platform is an AI-based system for faster, more accurate diagnosis of epilepsy and for monitoring treatment with anti-epileptic drugs (AEDs).

The system works by creating models of the brain ‘using short segments of electroencephalogram recordings’, according to Neuronostics, where computer simulations can quickly ‘reveal the ease with which seizures can emerge and form the basis of the BioEP seizure risk score.’
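
In the same spirit as the published work behind BioEP (though greatly simplified, and not Neuronostics' code), the sketch below infers a functional network from a short EEG segment, simulates noisy bistable dynamics on it, and reports the fraction of time nodes spend in the seizure-like state. All parameters are illustrative assumptions.

```python
import numpy as np

def seizure_risk_score(eeg: np.ndarray, steps: int = 20_000, dt: float = 0.01,
                       noise: float = 0.35, coupling: float = 0.6,
                       seed: int = 0) -> float:
    """Greatly simplified sketch of a 'network ictogenicity'-style score;
    not Neuronostics' method. `eeg` is channels x samples."""
    rng = np.random.default_rng(seed)
    w = np.abs(np.corrcoef(eeg))   # functional network from channel correlations
    np.fill_diagonal(w, 0.0)
    n = w.shape[0]
    x = -np.ones(n)                # all nodes start in the resting well (x = -1)
    ictal_time = 0
    for _ in range(steps):
        # Bistable double-well dynamics with diffusive coupling and noise
        drift = x - x**3 + coupling * (w @ x - x * w.sum(axis=1))
        x = x + dt * drift + noise * np.sqrt(dt) * rng.standard_normal(n)
        ictal_time += np.count_nonzero(x > 0)   # x > 0: seizure-like well
    return ictal_time / (steps * n)

# Illustrative input: an 8-channel, 20-second segment at 256 Hz (random here)
eeg = np.random.default_rng(1).standard_normal((8, 20 * 256))
print(f"risk score: {seizure_risk_score(eeg):.2f}")
```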

Neuronostics partnered with the University of Birmingham, where co-founder Professor John Terry is Director of the Centre for Systems Modelling & Quantitative Biomedicine.

BioEP has previously achieved 72% diagnostic accuracy based on a 20-second EEG recording; in the current clinical pathway, the same recording would usually yield inconclusive results.

Professor Terry explains: “We build personalised models of the brain using EEG that is routinely collected when seeking to diagnose epilepsy.”

“From these models the risk of epilepsy can be quickly determined. In contrast, multiple EEG recordings are often required to reach a clinical diagnosis at present.”

“This is expensive, time-consuming, and exposes people with suspected epilepsy to risk.”

Deontics – Cognitive Computing

Deontics claims to be a pioneer in the cognitive computing branch of AI, which looks at how humans assess evidence and subsequently make decisions. Deontics differentiates between ‘standard’ clinical decision support systems (CDSS) and ‘cognitive’ CDSS: the former relies on ‘if-then’ rules engines and decision trees, which Deontics states ‘cannot deal with uncertainty or complex non-linear disease management.’

Cognitive CDSS, on the other hand, is said to be dynamic and capable of ‘situationally-adaptive execution’, utilising ‘cognitive principles’ and relying on ‘augmentation rather than if-then logic’.

This in turn creates a ‘GPS’ for the clinician, guiding them through the clinical decision-making process for a patient in any clinical environment, including primary, secondary, or tertiary care.
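
The distinction can be illustrated with a toy contrast: an ‘if-then’ rule fires a fixed branch from a hard threshold, while an argumentation-style system weighs evidence for and against each candidate option and recommends the best-supported one. The example below is a generic sketch of that contrast, with invented findings and weights; it is not Deontics' engine.

```python
def rule_engine(psa: float) -> str:
    # 'If-then' style: a single hard threshold decides the branch.
    return "refer" if psa > 4.0 else "monitor"

def argumentation_cdss(findings: dict[str, bool]) -> str:
    # Each present finding contributes a signed weight for each option;
    # the option with the strongest net support wins. Weights are made up.
    arguments = {
        "refer":   {"psa_rising": +2, "family_history": +1, "frail": -2},
        "monitor": {"psa_rising": -1, "family_history": 0, "frail": +2},
    }
    support = {
        option: sum(w for finding, w in args.items() if findings.get(finding))
        for option, args in arguments.items()
    }
    return max(support, key=support.get)

print(rule_engine(5.1))                                         # refer
print(argumentation_cdss({"psa_rising": True, "frail": True}))  # monitor
```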

September of this year saw Deontics win a Phase III AI in Health & Care award.

The award will be used to evaluate the capability of Deontics AI Clinical Decision Support (CDS) in stratifying cancer patients and supporting prostate cancer multidisciplinary team meetings.

It will also evaluate the benefits of giving patients access to the technology for shared decision making.

Aidence – Veye

Veye Chest is a deep learning medical device which is used for the detection and analysis of pulmonary nodules on CT chest scans and ‘can be used in screening settings and routine clinical practice’ according to Aidence.

Aidence state that Veye Chest can be integrated into any PACS (picture archiving and communication system).

On the software dashboard, the chest scan appears on the left-hand side, with a summary of Veye’s findings on the right-hand side; the radiologist can toggle individual findings on and off.

The system can detect solid and sub-solid nodules between 3mm and 30mm in size and can quantify nodules in terms of diameter and volume with 3D visualisation.

Veye Chest can also provide a growth assessment and the volume doubling time of nodules.
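
Volume doubling time is a standard quantity: given two volume measurements a known number of days apart, VDT = days × ln 2 / ln(V2/V1). The helper below computes it; Aidence's own implementation is not public, so this is simply the textbook formula.

```python
import math

def volume_doubling_time(v1_mm3: float, v2_mm3: float, days: float) -> float:
    """Volume doubling time (days) of a nodule measured at v1 and then v2,
    `days` apart: VDT = days * ln(2) / ln(v2 / v1). Textbook formula, not
    taken from Aidence's implementation."""
    if v1_mm3 <= 0 or v2_mm3 <= 0 or days <= 0 or v2_mm3 == v1_mm3:
        raise ValueError("need positive volumes, elapsed time, and growth")
    return days * math.log(2) / math.log(v2_mm3 / v1_mm3)

# A nodule growing from 100 to 150 mm^3 over 90 days
print(f"{volume_doubling_time(100, 150, 90):.0f} days")  # ~154
```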

Aidence claims that Veye Chest has a detection sensitivity of 90% and aims to reduce the risk of medical errors through deep learning techniques.

Brainomix – e-Stroke Suite (e-ASPECTS/e-CTA)

Brainomix have developed an AI platform that supports treatment decisions by assessing signs of stroke on non-contrast CT scans.

The e-ASPECTS system automates and standardises the Alberta Stroke Programme Early CT Score (ASPECTS), measuring the volume of ischemic signs.

The software provides a heat map from the assessment of non-contrast CT scans to indicate areas of detected hypodensity. Regions of the brain showing signs of hypodensity are outlined in red so that they are clearly presented to the clinician.

The volume of hypodensity is also highlighted in pink and the measurement displayed. Scan results can be made available in PACS, in an email notification, and/or on a mobile application.
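
A common heuristic for early ischemic hypodensity, and one simple way such a map could be produced in principle, is to compare each voxel's density in Hounsfield units against its mirror position in the opposite hemisphere. The toy sketch below does exactly that on a synthetic slice; it is an illustration of the idea, not Brainomix's method.

```python
import numpy as np

def hypodensity_map(ct_slice: np.ndarray, threshold_hu: float = 4.0) -> np.ndarray:
    """Toy sketch, not Brainomix's algorithm: flag voxels markedly less
    dense (in Hounsfield units) than their mirror position in the opposite
    hemisphere. Assumes the slice is already midline-aligned."""
    mirrored = ct_slice[:, ::-1]   # left-right flip across the midline
    return (mirrored - ct_slice) > threshold_hu

# Synthetic 6x6 'slice': one subtle hypodense patch on the left side
ct = np.full((6, 6), 35.0)         # normal grey matter ~35 HU
ct[2:4, 1:3] -= 8.0                # hypodense region
print(hypodensity_map(ct).astype(int))
```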

The e-CTA software is also a decision support platform, using AI and large-scale data analytics to detect large vessel occlusions (LVOs) and perform collateral assessments.

The software automates the CTA-CS collateral score, which ‘predicts a patient’s response to thrombectomy’. The software has ‘been used to select patients for intervention up to 12-hours from symptom onset.’

As with the e-ASPECTS software, e-CTA provides visual reports with a colour heat map, where detected LVOs are circled in red, and results are available in the same formats.

To read more about the other innovations in the programme, please see: