A team of researchers from Amsterdam UMC is planning to build models to enable greater access to AI systems in the healthcare sector.
Most of our medical information is unstructured. A conversation with a GP, a recommendation from a pharmacist, or even a consultation with a specialist: none of this information is recorded in an organised manner. While this is not a problem for doctors, it is an insurmountable problem for AI algorithms.
To overcome this issue, a team from Amsterdam UMC is leading a project to find ways to “help” AI reach its full potential by giving it access to this information. The aim is to “tackle the important challenges that hinder its use in clinical practice,” said Assistant Professor Iacer Calixto. “We need to devise methods that are human-centred and responsible by design if we want these methods to be implemented in practice.” The project will rely on Natural Language Processing (NLP) techniques, the same ones that already underpin ChatGPT.
Currently, the nature of this unstructured information means that software like ChatGPT cannot easily be used in the medical sector, even though many believe it could prove valuable there. Improving data entry could free up crucial time that doctors and nurses could spend on patient care instead.
For this project, patient safety and privacy are crucial. “Protecting the privacy of our patients is a top priority at Amsterdam UMC, and that is no different when we are developing, testing, or using AI algorithms,” said Mat Daemen, vice-dean of Research at Amsterdam UMC. To achieve this, the project will use “synthetic” patient records based on simulated information. These records resemble real patient records and can be used to facilitate healthcare and research while still protecting the information of “real” patients.
“One of the main bottlenecks of doing research in healthcare is access to high-quality data to train and validate machine learning models. Part of our project will generate synthetic patient records that include not only structured but also unstructured data, such as free-text highlights from a consultation with a GP. These synthetic records, though not from real patients, can still be very useful to enable easier access to high-quality healthcare data for researchers and clinicians,” said Calixto.
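To illustrate the idea, a minimal sketch of what a synthetic record mixing structured fields with a free-text note could look like. This is not the project's actual method; every field name, diagnosis code, and note template below is invented for illustration.

```python
import random

# Toy vocabulary of ICD-10-style codes (illustrative only).
DIAGNOSES = {"J45": "asthma", "I10": "hypertension", "E11": "type 2 diabetes"}

# Hypothetical template for an unstructured GP note.
NOTE_TEMPLATE = ("Patient presents with symptoms consistent with {dx}. "
                 "Advised follow-up in {weeks} weeks.")

def make_synthetic_record(rng: random.Random) -> dict:
    """Generate one synthetic patient record: structured fields plus free text."""
    code = rng.choice(sorted(DIAGNOSES))
    return {
        "age": rng.randint(18, 90),        # structured field
        "icd10_code": code,                # structured field
        "gp_note": NOTE_TEMPLATE.format(   # unstructured free text
            dx=DIAGNOSES[code], weeks=rng.choice([2, 4, 6])
        ),
    }

record = make_synthetic_record(random.Random(42))
print(record)
```

Real synthetic-data generation would of course use learned generative models rather than templates, but the shape of the output (structured codes alongside free text) is the point the quote above is making.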
Another important obstacle to the widespread use of AI in health sectors around the world is language. Software like ChatGPT is trained predominantly on English text. The team now wants to build models trained on Dutch medical records, making them easier for professionals to use on the wards or in the treatment room.
The team believes this is a bold project, whose results will benefit the entire Dutch healthcare system, including other hospitals and university medical centres. “This project is an important addition to the efforts of many experts in Amsterdam UMC and in the Amsterdam region to introduce and use AI tools in a human-centred and responsible way,” concluded Daemen.