In the past year since ChatGPT first opened to new accounts, it seems that all businesses have been looking for ways to incorporate it into their practice, not only to increase efficiency and save money, but perhaps also to appear forward-looking and to join the trend rather than miss out. I've mentioned that medicine and healthcare delivery are among the areas where those in the executive suite have been looking to bring the technology into their business, or practice. It's helpful to review that artificial intelligence is not just chatbots or Natural Language Processing. It is also computer vision and deep learning, which itself comprises classification, modeling, and prediction. We've seen how advances in computer vision have helped radiologists, cardiologists, and neurologists interpret their visual clinical data. Time-sequence models, such as the LSTM, have helped predict usage trends, which is of value to accountants and planners.

The New England Journal of Medicine, in its article on Artificial Intelligence in U.S. Health Care Delivery, summarized the state of the technology in healthcare in 2023. As I have discussed before, one area where it has been implemented is insurance reimbursement, to help with claims tracking and audits. The article discusses how modeling has helped improve efficiencies in operating room utilization. In the clinical world, the uses of A.I. discussed in the article include deep learning models that predict sepsis, or predict clinical outcomes in the ICU or emergency department, looking for factors that predict readmission or death. These are the low-hanging-fruit scenarios, and mainly represent the application of data science techniques to various deep learning architectures, rendering operations more efficient.
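To make the LSTM idea concrete, here is a minimal sketch of a single forward step of an LSTM cell in plain Python. The weights are toy values I made up for illustration (nothing here is trained), but the gating arithmetic is the standard mechanism that lets these models carry memory across a time series such as monthly utilization figures.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One forward step of a single-unit LSTM cell.

    W holds input weights (w*), recurrent weights (u*), and biases (b*)
    for the forget (f), input (i), candidate (g), and output (o) gates.
    """
    f = sigmoid(W["wf"] * x + W["uf"] * h_prev + W["bf"])   # forget gate
    i = sigmoid(W["wi"] * x + W["ui"] * h_prev + W["bi"])   # input gate
    g = math.tanh(W["wg"] * x + W["ug"] * h_prev + W["bg"]) # candidate value
    o = sigmoid(W["wo"] * x + W["uo"] * h_prev + W["bo"])   # output gate
    c = f * c_prev + i * g   # new cell state: keep some memory, admit some new
    h = o * math.tanh(c)     # new hidden state, used for the prediction
    return h, c

# Toy weights (illustrative only, not trained)
W = {k: 0.5 for k in ("wf", "uf", "bf", "wi", "ui", "bi",
                      "wg", "ug", "bg", "wo", "uo", "bo")}

# Run the cell over a short rising "utilization" series
h, c = 0.0, 0.0
for x in [0.2, 0.4, 0.6, 0.8]:
    h, c = lstm_step(x, h, c, W)
print(round(h, 3))
```

In a real system the weights would be learned from historical data, and libraries such as PyTorch or TensorFlow provide vectorized, multi-unit versions of this cell; the point of the sketch is only how the gates blend old state with new input.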
The article mentioned the slow adoption of A.I. in healthcare delivery. The reasons for this? One is the "variability and heterogeneity" of the data, which is understandable. This includes data generated by the numerous sensors and imaging modalities (sometimes with audio as well as visual components), and the massive corpus of text information (both handwritten and printed). Before any data can be used by an A.I. system, it must be preprocessed: everything must be translated into a form that the learning architecture, and a database, can process. This usually means conversion to vectors (or tensors), but the question then becomes: whose format shall be used?

There is now a trend toward vector databases, which would be especially helpful in the medical world, since the old system of classifying things by human-created categories is laborious and slow. Furthermore, it would be difficult to correlate a patient whose disease was given one of the R-category ICD-10 codes with patients given more specific and precise codes. Vector (or tensor) databases promise to hold multimedia data and enable trainable queries that can find hidden associations between conditions. At present, I suspect that most stored information is human-entered and represents only a small subset of the information generated in patient encounters. All other data is still likely held in formats standard to its media, such as PDF, JPG, WAV, or MP4 files. EHR vendors, such as Epic, still store patient data on SQL-based servers; however, Oracle is one of their database providers, and Oracle is investing in vector databases, so the technology may change.
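The core operation a vector database offers can be sketched in a few lines. Assume, hypothetically, that each patient encounter has already been reduced to a vector by some trained encoder (the hard part, which I'm skipping); then finding "similar" encounters is just a nearest-neighbor search by cosine similarity, rather than a lookup by a human-assigned category code. The record names and vectors below are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical embeddings: each encounter summarized as a vector.
# In practice these would come from a trained encoder, not be hand-written.
records = {
    "encounter_001": [0.9, 0.1, 0.0],
    "encounter_002": [0.8, 0.2, 0.1],
    "encounter_003": [0.0, 0.1, 0.9],  # quite different from the other two
}

def nearest(query, db):
    """Return record ids ranked by cosine similarity to the query vector."""
    return sorted(db, key=lambda k: cosine(query, db[k]), reverse=True)

query = [0.85, 0.15, 0.05]
ranking = nearest(query, records)
print(ranking)
```

A production vector database does the same ranking over millions of high-dimensional vectors using approximate nearest-neighbor indexes, and the "trainable" part lives in the encoder that produces the vectors, not in the search itself.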
At this time, I think it's still difficult to know how far chatbots and Natural Language Processing will go in the clinic toward performing the duties of clinicians. A promising effort was made previously with the app Babylon, but it failed spectacularly. Sadly, it was going down in flames just as the new chat technology was in ascendance. I suspect the developers were unable to convince investors to give them more time to incorporate the new transformer model into their Natural Language Processing, just as it was showing the world how awesome it was compared to previous chatbot technology.