Accurate data and records are essential when treating patients in hospitals. A single mistake can be fatal, for example if a patient's allergies are not noted in the record. Now a report has found that Whisper, the artificial intelligence (AI) transcription model developed by OpenAI and used to record patient consultations, is prone to hallucinating, putting patients at risk.
Whisper powers an application developed by Nabla that records conversations between doctors and patients before producing a transcription. According to an Associated Press report, as many as 1% of the transcriptions produced contain hallucinations, resulting in inaccurate patient records. The 1% figure may seem small, but Nabla has recorded over 7 million conversations, which works out to roughly 70,000 inaccurate transcripts.
More worryingly, researchers found that Whisper also invented diseases that do not exist. The problem is compounded by the fact that the original audio recording is deleted for privacy reasons once the transcript is produced, making it almost impossible to verify or correct any mistake in what then becomes a permanent record.
Whisper has previously drawn controversy for OpenAI, as it is believed to have been trained on audio from YouTube videos without the permission of Google or the original video owners. In a related development, Nabla said it has been informed of the issue and is working to resolve it through an application update.