The Hazards of Using ChatGPT: Additional Comments

Amnuay Kleebayoon | Viroj Wiwanitkit

PRiMER. 2024;8:12.

Published: 2/19/2024 | DOI: 10.22454/PRiMER.2024.203721

To the Editor:

We would like to respond to the publication "The Hazards of Using ChatGPT: A Call to Action for Medical Education Researchers"1 by identifying another issue with the use of ChatGPT in medical education. While the letter already discusses several important limitations and potential unwanted consequences of ChatGPT, additional points merit consideration. Relying on ChatGPT without taking other variables into account, such as physical examination findings, can yield incorrect or inappropriate information. It is critical to recognize that ChatGPT's output is sometimes wrong and that the tool can have difficulty distinguishing fake sources from genuine ones. Furthermore, its knowledge is restricted to data available up to September 2021; because more recent training data are lacking, the application may generate new content that is incorrect.

Previous replies from other authors address some important points,2-4 but we believe several difficulties remain to be addressed. This highlights the need for a comprehensive strategy in medical education that draws on multiple sources of information, including clinical trials, to support informed decision-making. Furthermore, the tool's assessment of learners may be limited by its lack of audiovisual input: nonverbal cues such as tone and facial expression, which can indicate comprehension or the lack of it, are ignored. This underscores the importance of using multiple channels of communication in medical education to gain a thorough understanding of learners' abilities and needs. Nevertheless, the letter rightly highlights the necessity of evaluation and notes that previously published AI competencies should guide the integration of ChatGPT into classroom instruction. Instructors and students can use this framework to explain the tool, analyze the evidence supporting it, select appropriate indications for its use, operate it efficiently, communicate its results, and detect any undesirable outcomes. This approach can also help identify knowledge gaps that need further inquiry. As a result, ChatGPT can be used in medical education in a more informed and responsible manner.

Guidance for the responsible use of ChatGPT should also highlight the possibility of disclosing protected health information during a chat with an AI used for clinical decision support. Such guidance should stress the importance of carefully considering the privacy and confidentiality implications of using computational tools in medical education, as well as ensuring that suitable safeguards for sensitive patient data are in place. We propose using the previously defined AI competencies5 to facilitate proper interpretation, evaluation, and communication of tool output. Extensive study and assessment are needed to identify and manage hazards and to close knowledge gaps. Furthermore, up-to-date approaches and a large training set are necessary to reduce bias and error in the use of AI. Overdependence on a single primary data source poses risks and may result in errors. The use of AI also presents ethical problems, since it can have unforeseen and undesirable results. Setting moral guidelines and limits is essential to prevent misuse of large language model-based artificial intelligence as these models evolve.

References
  1. Liaw W, Chavez S, Pham C, Tehami S, Govender R. The hazards of using ChatGPT: a call to action for medical education researchers. PRiMER. 2023;7:27. doi:10.22454/PRiMER.2023.295710
  2. Horton JA, Ally I. Response to "exploring the applications of ChatGPT in family medicine medical education". PRiMER. 2023;7:28. doi:10.22454/PRiMER.2023.940827. Choi HS, Song JY, Shin KH, Chang JH, Jang BS. Radiat Oncol J. 2023;41(3):209-216.
  3. Liaw W, Chavez S, Pham C, Tehami S, Govender R. The hazards of using ChatGPT: a call to action for medical education researchers. PRiMER. 2023;7:27. doi:10.22454/PRiMER.2023.295710
  4. Hanna K. Exploring the applications of ChatGPT in family medicine education: five innovative ways for faculty integration. PRiMER. 2023;7:26. doi:10.22454/PRiMER.2023.985351
  5. Kleebayoon A, Wiwanitkit V. Artificial intelligence, chatbots, plagiarism and basic honesty: comment. Cell Mol Bioeng. 2023;16(2):173-174. doi:10.1007/s12195-023-00759-x

Lead Author

Amnuay Kleebayoon

Affiliations: Private Academic Consultant, Samraong, Cambodia


Viroj Wiwanitkit

Affiliations: Center for Global Health Research, Saveetha Medical College, Saveetha Institute of Medical and Technical Sciences, Chennai, India
