Artificial intelligence, medicine and education: Balancing innovation, opportunity and risk

Discussion about artificial intelligence (AI) is everywhere. Stories about the latest developments can, at times, read like optimistic science fiction or like predictions of doom. The reality of AI on the ground is much more nuanced. For example, at University Health Network in Toronto, where I am chief medical officer, we have used AI in planning radiation treatment for cancer for a decade. It has reduced planning time per patient from hours to minutes.i It has also streamlined the work of oncologists, radiation therapists and physicists without radically disrupting their lives.ii With quality control and monitoring mechanisms in place, its use is safe and benefits patients through enhanced treatment capacity.

More AI applications are appearing at our hospitals all the time, including in enhanced cardiac and intensive care and, most recently, as part of an early warning system across our inpatient services that alerts the clinical team to patients whose vital signs signal potential deterioration. These transitions are complex and require new skills and competencies from our team. To guide us, we have appointed an AI scientist to our executive team.iii The clinical adoption of AI raises many considerations, including legal and ethical issues, mitigation of risks and the impact on patients and staff.

AI and education: Reduced exam creation workload, testing times

The adoption of AI specifically in education will be equally transformative. Generative AI successfully challenging medical certification exams has certainly focused many minds.iv The Royal College commissioned a Task Force on Artificial Intelligence in 2019 and has since hosted three research-focused panels on the future of AI in medical education. The most recent focused on AI and assessment: an international panel highlighted recent innovations and discussed emerging possibilities and challenges at the intersection of AI and assessment. It was the first of many discussions to come.

Watch a recording of the September 18, 2024, Research Forum, “At the Intersection of Assessment & Generative Artificial Intelligence: The Future is Now.”


AI is already shaping exams. Because examination processes are data-rich and computer-administered, AI has been introduced successfully in areas such as automated item generation (AIG) and computer adaptive testing (CAT). These innovations promise to reduce the workload of exam creation, as well as testing time for candidates. And, if exam creators employ sufficient human vetting of computer-generated materials and monitor the conduct and results of computer-administered tests, these will be valuable, low-risk uses of AI.
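To make the idea of adaptive testing concrete, here is a minimal sketch of a CAT loop under a Rasch (one-parameter logistic) model. Everything in it is an illustrative assumption — the item bank, the simple gradient update and the selection rule — not any examining body's actual implementation.

```python
import math

def p_correct(ability, difficulty):
    """Rasch (1PL) model: probability a candidate answers an item correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def next_item(ability, item_bank, used):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate -- the most informative item under the 1PL model."""
    candidates = [i for i in range(len(item_bank)) if i not in used]
    return min(candidates, key=lambda i: abs(item_bank[i] - ability))

def update_ability(ability, difficulty, correct, step=0.5):
    """Illustrative gradient step on the log-likelihood of the response."""
    observed = 1.0 if correct else 0.0
    return ability + step * (observed - p_correct(ability, difficulty))

def run_cat(item_bank, answer_fn, n_items=6):
    """Administer n_items adaptively and return the ability estimate."""
    ability, used = 0.0, set()
    for _ in range(n_items):
        i = next_item(ability, item_bank, used)
        used.add(i)
        ability = update_ability(ability, item_bank[i], answer_fn(item_bank[i]))
    return ability

# Hypothetical item bank (difficulties in logits) and a simulated candidate
# who reliably answers items easier than +1.0 correctly.
bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
estimate = run_cat(bank, answer_fn=lambda d: d < 1.0)
```

Because each item is chosen near the current estimate, the test converges on the candidate's level with far fewer items than a fixed-form exam — which is where the promised reduction in testing time comes from.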

A bit more challenging is the rapidly evolving field of natural language processing, which reignites the possibility of using written tests and may even herald a renaissance in essay writing. But as professor Shiphra Ginsburg, MD, FRCPC, PhD, MEd, outlined in her panel presentation, there is more nuance and context in rating language than in rating numerical data. She went on to point out that while AI is already being used effectively for language-based tasks (e.g., creating meeting minutes and summarizing patient encounters through ambient listening), the use of AI to judge the quality of medical notes, essays or perhaps even letters of recommendation still requires careful study.

The rise of learning analytics

Because AI can integrate multiple kinds of data, the field of learning analytics is flourishing. Professor Martin Pusic, MD, FRCPC, PhD, a long-time advocate of developmental learning curves, foresees AI assisting in the collection, integration and interpretation of multiple sources of information to illustrate how individual learners are progressing over time compared with expected trajectories. This approach will enhance models of competence-based education, which rest on the premise that competence develops over time. The rise of learning analytics is anticipated to help overcome the tradition of holding most (or all) high-stakes assessment at the end of learning rather than doing what learning science shows is more effective: continuous low-stakes testing with continuous feedback.
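A developmental learning curve of the kind described above is often modelled with the classic power law of practice, performance ≈ a · trials^b. The sketch below fits that model by ordinary least squares in log-log space; the practice data and the projection are invented for illustration, not real learner analytics.

```python
import math

def fit_power_law(trials, scores):
    """Fit performance = a * trials**b by linear regression on logs."""
    xs = [math.log(t) for t in trials]
    ys = [math.log(s) for s in scores]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical learner data: error rate (%) falling with practice.
trials = [1, 2, 4, 8, 16]
errors = [40.0, 28.0, 20.0, 14.0, 10.0]
a, b = fit_power_law(trials, errors)

# Project the curve forward to compare the learner's trajectory with an
# expected one -- the core move in learning analytics.
predicted_at_32 = a * 32 ** b
```

Once fitted, a learner's curve can be compared against an expected trajectory to flag those progressing faster or slower than their peers.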

Another innovation in this area comes from mobile-assisted language learning apps, which anyone with a smartphone can easily access. Medical educators are starting to explore this approach as well, given the strong evidence that spaced repetition helps with learning and memory. After all, studying medicine requires mastering a new language and repeatedly applying basic scientific concepts, not unlike understanding the grammar of science.
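The scheduling behind spaced repetition can be sketched in a few lines. This toy scheduler uses a Leitner-style rule — double the interval on success, reset to one day on a lapse — which is an illustrative simplification, not the algorithm of any particular language-learning app.

```python
from datetime import date, timedelta

def review(card, correct, today):
    """Return an updated card after one review.

    A card is a dict with 'interval' (days until next review) and 'due'.
    A correct answer doubles the interval; a lapse resets it to one day.
    """
    interval = card["interval"] * 2 if correct else 1
    return {"interval": interval, "due": today + timedelta(days=interval)}

# Example: a new fact reviewed correctly three times, each on its due date.
card = {"interval": 1, "due": date(2024, 9, 18)}
for _ in range(3):
    card = review(card, correct=True, today=card["due"])
```

The widening gaps (2, 4, then 8 days) are the point: each review lands just as the memory starts to fade, which is what makes spaced repetition efficient for the volume of material medicine demands.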


Preventing harms

But we should also heed the warnings of knowledgeable critics (including recent Nobel Prize laureate Geoffrey Hinton v) to exercise caution and take steps to prevent harms. Some harms we are not yet aware of; others have already been identified, including biased databases, excessive computer autonomy, hallucinations and, most importantly, an overemphasis on efficiency at the expense of the human presence necessary for competent care and effective healing. For me, however, the greatest risk is creating health care environments that overemphasize data collection and make learners and clinicians feel they are under constant surveillance.

All health care institutions have difficult but important decisions to make about AI, including the Royal College, which is leading a national conversation about the use of AI in medicine and education. We look forward to hosting future panels at the intersection of AI and medical education; I hope you will join us on this journey.

Brian

Brian Hodges, MD, FRCPC, PhD
President


Brian Hodges is the 47th President of the Royal College of Physicians and Surgeons of Canada. He is the executive vice-president of education and chief medical officer at the University Health Network. A professor in the University of Toronto’s Temerty Faculty of Medicine and at the Dalla Lana School of Public Health, as well as a senior fellow at Massey College, Dr. Hodges is also a practising psychiatrist.


i Conroy, L., Winter, J., Khalifa, A., Tsui, G., Berlin, A. and Purdie, T.G., 2024. Artificial Intelligence for Radiation Treatment Planning: Bridging Gaps from Retrospective Promise to Clinical Reality. Clinical Oncology. 

ii Gillan, C., Milne, E., Harnett, N., Purdie, T.G., Jaffray, D.A. and Hodges, B., 2019. Professional implications of introducing artificial intelligence in healthcare: an evaluation using radiation medicine as a testing ground. Journal of Radiotherapy in Practice, 18(1), pp.5-9. 

iii https://www.uhn.ca/corporate/News/Pages/UHN_becomes_first_Canadian_hospital_to_appoint_Chief_AI_Scientist.aspx

iv Kung TH, Cheatham M, Medenilla A, Sillos C, De Leon L, Elepaño C, Madriaga M, Aggabao R, Diaz-Candido G, Maningo J, Tseng V. Performance of ChatGPT on USMLE: Potential for AI-assisted medical education using large language models. PLOS Digit Health. 2023 Feb 9;2(2):e0000198. doi: 10.1371/journal.pdig.0000198. PMID: 36812645; PMCID: PMC9931230. 

v https://mitsloan.mit.edu/ideas-made-to-matter/why-neural-net-pioneer-geoffrey-hinton-sounding-alarm-ai