Weekly Business Insights 01/02/2024
2024 outlook: AI focus will shift to specialized machine learning that addresses specific business problems. How GPT and LLMs could usher in an era of 'intelligence-based medicine'. AMA: AI needs governance policies prior to adoption and physician input on the front-end.
2024 outlook: AI focus will shift to specialized machine learning that addresses specific business problems.
2024 will see a boom in artificial intelligence in healthcare that will lead to greater scrutiny of AI's various processes. The AI boom will result in a skills gap and a need for more specialist IT training. And while the use of AI in healthcare grows, it won't be generative AI like ChatGPT. In healthcare, the focus will shift from general AI to more specialized, contextual AI and machine learning systems that address specific business problems effectively.
Specialized AI systems can be developed to address precise medical challenges such as disease diagnosis, treatment planning and patient management. Unlike general AI, these specialized solutions can be tailored to adhere to medical protocols, understand medical billing and codes, understand healthcare regulations, and ensure patient safety, making them more suitable for healthcare applications.
Healthcare IT leaders will discover they can solve many of their business challenges using purpose-built applications – 90% of which originate from the need for access to, and human-like understanding of, their own data and processes.
How GPT and LLMs could usher in an era of 'intelligence-based medicine'
A lot of technology has come and gone in the health space. But with artificial intelligence, especially in the year-plus since ChatGPT captured the attention of the general public in a big way, "this time feels a bit different."
Whatever the use case or type of automation technique – clinical or operational applications, broad or narrow AI, predictive analytics or generative AI and LLMs – "AI is no longer a futuristic concept." It's a daily reality in our hospitals and clinics.
AI has great potential – but with that comes great responsibility to deploy it correctly. There are significant issues around patient safety, model transparency, privacy and security, efficacy, regulation, education and more that urgently need to be addressed.
How GPT and LLMs could usher in an era of 'intelligence-based medicine' | Healthcare IT News
AMA: AI needs governance policies prior to adoption and physician input on the front-end
Whatever is promulgated for AI must be safe before it's brought to the market. Physicians must be brought in on the front-end of development to make sure AI works. Currently, much needs to be done to bring physicians into the loop. AI and machine learning can improve outcomes, reduce administrative tasks and improve management, but over-reliance on these tools is a risk if poor implementation makes existing inefficiencies worse.
Clinical decisions influenced by AI must include human intervention points. Also, implementation of AI should avoid exacerbating pain points for physicians. Physicians have experienced the burden of new healthcare technology: electronic health record implementation has topped the physician dissatisfaction list for years.
Above all else, healthcare AI must be designed and deployed in a manner that is ethical, equitable, responsible and transparent – with governance policies in place prior to adoption and physician input on the front-end.