April 3, 2025, David Mitchell (Kansas City, Missouri) — Whether you are medical school faculty or a residency program director, how you feel today about the use of artificial intelligence in academia might be irrelevant.
What will matter in the long run is how you adapt your teaching to the technology.
“I’ve ‘low-key’ learned more from AI than I have in class,” a medical student recently told Winston Liaw, M.D., M.P.H., chair of health systems and population health sciences at the University of Houston’s Tilman J. Fertitta Family College of Medicine.
Although the student may have wanted that admission to be anonymous, the sentiment didn’t surprise Liaw, a mainstage speaker March 25 during the Residency Leadership Summit in Kansas City, Missouri.
“AI is here,” said Liaw, who spoke to a packed ballroom (and an overflow audience watching on large-screen monitors). “It’s not going away, and it’s being used extensively by our students.”
ChatGPT was banned in some academic settings soon after it was introduced in late 2022 amid fears that the technology could be used to write essays and answer test questions.
That reaction wasn’t much different from how mathematicians felt about the introduction of calculators in the 1970s, Liaw said. Educators initially feared students would become too reliant on the devices before eventually deeming them essential learning tools.
“ChatGPT is here to stay,” Liaw said. “I suggest you embrace it.”
Some embraces may be warmer than others.
Liaw highlighted a 2023 Lancet Oncology study that found AI-supported screening detected 20% more breast cancers than routine double reading of mammograms by two radiologists, without increasing the rate of false positives.
However, ChatGPT didn’t fare as well in a more recent study that examined its accuracy in following advanced cardiovascular life support guidelines for cardiac arrest and bradycardia. The chatbot’s median accuracy across the steps of the cardiac arrest algorithm was 85%, but it fell to 30% for bradycardia.
Artificial intelligence is only as intelligent as the data and information available to it. That’s why Liaw urged family medicine educators to become involved in the development of AI tools.
The alternative, he said, is to repeat history.
“Electronic health records were built without consideration for the complexity of primary care,” said Liaw, a family physician who noted EHRs have added to documentation burden, reduced face time with patients and contributed significantly to physician burnout. “To change our path, we need your help.”
Products like Google’s NotebookLM allow users to upload materials — from notes to entire textbooks — to generate study guides, quizzes, frequently asked questions and even interactive podcasts.
Liaw, however, said family physicians are needed to work with developers, train AI and tailor it to primary care. He and colleagues, in conjunction with the Society of Teachers of Family Medicine and the American Board of Family Medicine, already are developing an Artificial Intelligence and Machine Learning for Primary Care curriculum for medical students, primary care residents, and practicing primary care clinicians.
The free curriculum has two video modules available, and three more are in development. Liaw urged residency programs interested in piloting the curriculum to contact him.
He said chatbots can be customized with content that primary care physicians know and trust. Key questions educators need to ask as they consider implementing such tools: Does the technology help students become better doctors? Will it help patients live better lives?
Another key question: When should learners be allowed to use AI?
Liaw said he recently spoke with residents who did not want to use AI for note writing because they wanted to master the skill themselves.
“They need to learn,” said Liaw, who is a member of the Advancing AI and Digital Health for Primary Care Initiative, a collaboration of the AAFP and Rock Health. “We need guardrails about when to use it or not.”
Liaw said learners should achieve competencies with artificial intelligence before using it in practice. Such competencies include being able to describe a tool, knowing how to use it to improve diagnostic accuracy, understanding when and how it should be used, being able to communicate with patients about why it is being used, and mitigating potential harms.
Liaw shared the story of a patient who made an appointment through a chatbot after another AI tool flagged her as being at high risk for heart failure complications. AI tools also summarized her visit and sent prescriptions to her pharmacy, and an AI analysis using facial recognition suggested the patient was depressed. Further discussion revealed she had not been taking her medications as prescribed.
“Automation is coming,” Liaw said. “It’s going to affect how we deliver care and how we teach.”
The Residency Leadership Summit drew more than 1,300 residency program directors, program coordinators and other stakeholders. The 2026 summit will be held March 4-6 in Dallas.