Ideally, educators should use the best available evidence to make decisions about their practices as teachers, scholars, and policymakers. However, the rapid increase of scholarly literature in medical education poses a major challenge. Knowledge syntheses (aka reviews), which contextualize and integrate information into a single resource, have become essential tools for navigating this information overload. This article presents an overview of knowledge synthesis in medical education, starting by defining it and providing an overview of the general steps. It then examines four key types of syntheses: systematic reviews, scoping reviews, meta-syntheses, and realist reviews, providing examples of each and, when possible, pointing to reporting guidelines and resources for conducting each type. The article then addresses common methodological pitfalls, including inadequate time planning, limited collaboration with end-users, insufficiently actionable findings, and narrow search strategies. The article concludes by presenting emerging innovations, such as artificial intelligence-supported methodologies, living reviews, and alternative knowledge translation activities.
As the volume of medical education literature grows, educators face challenges in interpreting, contextualizing, and applying relevant findings in their daily practices, whether in classrooms, clinics, or conference rooms. Knowledge syntheses, commonly known as literature reviews, provide rigorous methods for identifying, appraising, and integrating this expanding body of publications. By combining findings from multiple publications and offering evidence-informed guidance, knowledge syntheses help bridge academic evidence and real-world application. Moreover, by distilling nuanced and technical research into accessible formats, such as summary tables, conceptual models, graphical representations, or practical recommendations, syntheses empower individuals to apply findings more effectively.1 Considering these benefits, medical education scholars have asserted that knowledge syntheses are as important as, if not more important than, primary studies.2
In this era of exponential knowledge growth, knowledge synthesis has emerged as a critical methodological approach within medical education.2 Reflecting this growth, the number of knowledge syntheses published in medical education has increased by over 2,000% between 1999 and 2019.3 Moreover, compared to other publication types, knowledge syntheses are highly cited, frequently featured on social media, and heavily downloaded, indicating that they play an important role in the field’s discourse.3
This article provides an overview of knowledge synthesis in medical education to familiarize readers with the methodology, raise points of concern, and highlight future opportunities. It is structured to describe: (a) what knowledge synthesis is; (b) types of knowledge syntheses, (c) common challenges; and (d) future directions.
WHAT IS KNOWLEDGE SYNTHESIS?
Knowledge synthesis is the process of integrating existing findings from multiple publications to develop a more comprehensive understanding of a topic. Unlike traditional literature reviews, which often appear as introductions of academic articles and may be selective or narrative in nature,4 knowledge syntheses follow structured, transparent methodologies explicitly designed to promote reproducibility and enhance the reliability of findings. Knowledge synthesis can take many forms,5,6 such as systematic reviews, scoping reviews, and realist reviews, each with distinct methodologies tailored to specific types of questions, data, and objectives. Whether focused on measuring effectiveness, mapping the field, or exploring how and why interventions work, each synthesis type contributes unique value to medical education.
A knowledge synthesis typically follows a multistep process. First, researchers define a clear, focused research question, which can be guided by frameworks such as PICO (Population, Intervention, Comparison, Outcome)7 or SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type),8 depending on the synthesis type. Next, researchers develop and execute a literature search, often across multiple information resources, to identify relevant publications. The literature search is followed by screening of titles, abstracts, and full texts against predefined inclusion and exclusion criteria. Next is data extraction, in which key information from the selected studies is systematically collected. The extracted data are then analyzed and synthesized using quantitative and/or qualitative approaches. Finally, findings are interpreted in the context of the research question and existing literature, with implications for educational practice and policy discussed. While these steps provide a general structure, in practice they will vary based on the knowledge synthesis type being conducted.
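The screening step lends itself to a simple illustration. The sketch below is hypothetical (the record fields, criteria, and example records are invented for this illustration, not drawn from any cited review); it shows how predefined inclusion and exclusion criteria might be applied to candidate records while logging each decision, which supports transparent, reproducible reporting:

```python
# Hypothetical sketch of title/abstract screening: each record is
# checked against predefined criteria, and exclusion reasons are
# logged (e.g., for a PRISMA flow diagram).

def screen(records, inclusion, exclusion):
    """Apply predefined criteria; return (included, excluded) lists.

    Each excluded record is paired with the name of the first
    exclusion criterion it triggered, or a note that it failed
    the inclusion criteria.
    """
    included, excluded = [], []
    for rec in records:
        failed = next(
            (name for name, rule in exclusion.items() if rule(rec)), None
        )
        if failed is None and all(rule(rec) for rule in inclusion.values()):
            included.append(rec)
        else:
            excluded.append((rec, failed or "inclusion criteria not met"))
    return included, excluded

# Invented criteria for a hypothetical review of simulation-based training
inclusion = {
    "mentions topic": lambda r: "simulation" in r["abstract"].lower(),
}
exclusion = {
    "published before 2010": lambda r: r["year"] < 2010,
}

records = [
    {"title": "Simulation in nursing", "abstract": "A simulation study.", "year": 2018},
    {"title": "Older work", "abstract": "Early simulation research.", "year": 2005},
    {"title": "Unrelated", "abstract": "A curriculum survey.", "year": 2020},
]

included, excluded = screen(records, inclusion, exclusion)
print([r["title"] for r in included])  # ['Simulation in nursing']
print(len(excluded))                   # 2
```

In practice, screening is typically done in dedicated software and by two independent reviewers; the point here is only that the criteria are fixed in advance and every decision is traceable.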
TYPES OF KNOWLEDGE SYNTHESES
Knowledge synthesis encompasses a diverse array of types, each developed to serve different purposes, accommodate various research questions, and engage with different evidence types. Understanding the types of knowledge syntheses is essential, because choosing the right type ensures that the conclusions drawn are both valid and useful.
Across fields, researchers have identified 25 knowledge synthesis types.6 Specific to medical education, medical education journals have published 21 types of knowledge syntheses, with systematic reviews, scoping reviews, and narrative reviews being the most prevalent.3 While multiple types are conducted, this article provides a brief overview of systematic reviews, scoping reviews, meta-syntheses, and realist reviews, which are commonly used or emerging in medical education.
Systematic Reviews
Systematic reviews address focused research questions by comprehensively identifying, appraising, and synthesizing the evidence on a particular topic.9 These reviews seek to answer specific, often narrowly framed research questions related to effectiveness, causality, or measurement. For example, one systematic review examined the effectiveness of using virtual patients to provide feedback on clinical reasoning skills.10 Systematic reviews are especially valuable for informing evidence-based decision-making in educational policy and practice. Systematic reviews also can include a meta-analysis, a statistical technique that pools quantitative results across included studies to estimate the overall effect size of an intervention.11 For example, one systematic review with meta-analysis examined the effects of synchronous distance education versus traditional education, finding no significant difference in knowledge acquisition or satisfaction.12
While powerful, systematic reviews are also resource-intensive, requiring careful planning, methodological expertise, and significant time investment. Fortunately, the medical education field offers several resources to support their development, including step-by-step guides.13,14 Many journals require systematic reviews to follow the PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) statement, which provides a 27-item checklist to promote transparency and completeness in reporting.15
Scoping Reviews
In medical education, the publication of scoping reviews is on the rise.16 Researchers typically conduct a scoping review to (a) examine the nature of the literature on a topic, (b) identify gaps in the evidence base, (c) determine whether undertaking a systematic review is appropriate, and/or (d) summarize and disseminate the findings of the topic’s literature.17,18 For example, in a scoping review on artificial intelligence (AI), the authors described their rationale: “We chose a scoping review methodology, mapping out the landscape of the existing literature, to delineate the evidence bases, classify methodologies, and highlight thematic content ripe for systematic investigation.”19 As in this example, scoping reviews are often described as “mapping” a topic’s literature landscape.17 The importance of this mapping function is highlighted in an editorial from this journal: “Our discipline [family medicine education] needs more scoping reviews, because their value is unique. Without maps, how will we know where to go?”20
To create a complete map of a topic’s literature, scoping reviews are intentionally broad and flexible in the publication types they include. In addition to peer-reviewed articles, they may incorporate alternative publication types such as policy documents, theses, meeting abstracts, book chapters, or websites. Additionally, authors are increasingly involving stakeholders or community members in the review process to increase the usability and relevance of this mapping exercise. For example, in a scoping review on the hidden curriculum, the authors shared preliminary findings with clinical teachers to examine how the review’s findings aligned with their related lived experience.21 This engagement helps validate the findings and increases the scoping review’s practical impact.
Scoping reviews require a systematic and comprehensive approach,22 which can be resource- and time-intensive. To guide this process, many researchers rely on the widely used six-step framework developed by Arksey and O’Malley17 and updated by Levac and colleagues.23 However, for those conducting medical education scoping reviews, Mak and Thomas’ step-by-step guide provides practical, field-specific recommendations.24 Additionally, authors of scoping reviews should consult the PRISMA-ScR extension for scoping reviews, which facilitates thorough reporting.25
Meta-Syntheses
As the number of knowledge syntheses has grown, a new challenge has emerged. What was once a problem of too many primary studies has become one of too many knowledge syntheses. This abundance can make it difficult for educators and researchers to interpret and apply the growing body of synthesized evidence. In response, the meta-synthesis has gained traction. A meta-synthesis, not to be confused with a meta-analysis, integrates findings across multiple existing knowledge syntheses, providing a higher-level summary of evidence on a topic.26 For example, researchers examining the impact of continuing professional development on patient outcomes encountered 63 preexisting reviews. To make sense of them, they conducted a meta-synthesis, ultimately distilling the findings from the numerous individual reviews into a single comprehensive overview.27 This approach allowed the authors to identify patterns, highlight consistent findings, and uncover gaps in the evidence in one accessible synthesis.
One form of meta-synthesis that has gained popularity is the umbrella review, which is “a systematic collection of multiple systematic reviews and meta-analyses on a specific research topic.”28 By aggregating and comparing the conclusions of several reviews, umbrella reviews can identify areas of consensus, highlight inconsistencies, and assess the overall strength of evidence. For example, Onyura and colleagues investigated the evidence for curricular approaches by synthesizing the findings of 36 systematic reviews and concluding that educational interventions produce mixed effects on learning.29
Meta-syntheses are increasingly valuable tools for guiding decision-making on topics where the literature is abundant and increasingly complex. However, because meta-syntheses require the existence of multiple prior syntheses, they are less suited to emerging or underdeveloped topics.30 Additionally, while meta-syntheses are becoming more popular, no medical education-specific resources exist for those hoping to conduct one. Authors may find the seminal work by Aromataris et al useful, as it outlines best practices for conducting and reporting umbrella reviews, which can be extrapolated to other meta-synthesis types.31
Realist Reviews
A realist review, a theory-driven form of knowledge synthesis, aims to understand not just whether an intervention works, but how, for whom, and under what circumstances it works.32,33 Rooted in realist philosophy, this approach is particularly well-suited to complex interventions in dynamic settings (eg, clinical placements, academic health centers).34 Unlike many knowledge synthesis types, which focus on aggregating findings across studies, realist reviews explore the underlying mechanisms of interventions and how these interact with contextual factors to produce specific outcomes. The goal is to generate or refine a “program theory,” which is a conceptual model that explains how and why an intervention leads to specific results in certain contexts. To strengthen their program theory, authors often integrate community members into the review process, which ensures that the perspectives of those affected by the intervention are considered. For example, in a realist review on interprofessional education (IPE), the authors developed a program theory, which incorporated 124 contexts, mechanisms, and outcomes, and conducted listening sessions with community members to gauge their relevance. The resulting model led to a practical resource for educators developing, delivering, and assessing IPE.35
By focusing on real-world complexity, realist reviews can inform policy and practice in nuanced and complex settings where one-size-fits-all conclusions may be insufficient. To guide the conduct of realist reviews, researchers are encouraged to consult the RAMESES publication standards,33 which provide a structured approach for developing, conducting, and reporting realist reviews. Recent field-specific guidance also has emerged in medical education.36
COMMON CHALLENGES
Across the four knowledge synthesis types, several common challenges can undermine the quality and usefulness of the resulting work, though they may manifest in different ways depending on the approach. These challenges include underestimating the time needed, not including community members, producing findings not ready for real-world use, and not looking beyond usual information sources.
Underestimating the Time Needed
Conducting a knowledge synthesis is resource-intensive, requiring significant time, team coordination, and methodological expertise. For example, systematic reviews have been estimated to take on average 67 weeks37 and 881 person-hours38 to complete. While time estimates for other knowledge synthesis types are unavailable, they are likely comparable. Unfortunately, authors may underestimate the scope and time commitment. To avoid this pitfall, authors should start with a clear and focused research question. They also may consider registering a protocol of their knowledge synthesis to help them maintain scope and direction. For guidance on protocols, consult Pieper and Rombey’s work,39 which describes available protocol registers, registration benefits, and protocol characteristics. Engaging a librarian to assist with searching, and using project management tools, also can streamline the process. Finally, setting realistic timelines and budgeting sufficient time for each phase of the process can ensure that the knowledge synthesis remains manageable and methodologically sound.
Not Including Community Members
Involving community members (aka stakeholders) in a knowledge synthesis enhances the relevance, reach, and real-world applicability of the findings.40 It also fosters transparency, builds trust, and increases the likelihood that the synthesis will inform meaningful action or decision-making. Yet the inclusion of community members in knowledge syntheses remains the exception rather than the rule, occurring primarily in scoping and realist reviews despite its value across synthesis types.
Community members, such as patients, trainees, educators, clinicians, policymakers, or other end users, can contribute to knowledge syntheses as coauthors (when appropriate) or consultants throughout the process. Their involvement can help shape and refine the research question to ensure that it reflects real-world concerns, provide input on inclusion and exclusion criteria to capture diverse evidence forms, and assist in interpreting findings through the lens of lived experience. For example, in a scoping review on social media, the authors interviewed topical experts regarding whether the findings were consistent with their experience as social media researchers.41 Community members also may help identify important gray literature, codevelop dissemination strategies to reach intended audiences, and offer feedback on the clarity and accessibility of draft manuscripts. Despite these benefits, engaging community members can also introduce challenges, such as an extended project timeline, the need for training and support, and the difficulty of identifying and matching the right individuals to the appropriate knowledge synthesis steps.40 Thoughtful planning and clear communication can help address these challenges and facilitate meaningful community member engagement.
Lack of Readiness for Practice
Knowledge syntheses in medical education have been criticized for lacking relevance to, and for not being ready for translation into, practice.2,42 When syntheses fail to address educators’ practical realities, they risk becoming disconnected from the learning environments where they are meant to be used. Similarly, knowledge syntheses that lack clear, practical recommendations may leave educators unsure how to implement findings.43 For example, a knowledge synthesis that concludes “more research is needed” but does not help a reader make sense of the evidence presented is both frustrating and of limited use. One approach to address these issues is for authors to draw on the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework,44 which facilitates the translation of research into practice.45 While this framework can guide multiple knowledge synthesis steps, it is particularly helpful for data extraction. For this purpose, authors can use the published RE-AIM extraction template that facilitates extracting data relevant to the dimensions of the framework.46,47 For example, a review on diabetes health coaching used the RE-AIM template to “elucidate the critical aspects of an intervention to ensure the adoption, scaling, and maintenance of an intervention.”48
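To make the idea of framework-guided extraction concrete, the sketch below is a minimal hypothetical illustration of an extraction record organized around RE-AIM's five dimensions. The study ID, field values, and helper function are invented for this example; the published RE-AIM extraction template is far more detailed.

```python
# Hypothetical sketch: a data-extraction record keyed by the five
# RE-AIM dimensions, so translation-relevant details (and gaps in
# reporting) are made explicit for every included study.

RE_AIM_DIMENSIONS = (
    "reach", "effectiveness", "adoption", "implementation", "maintenance"
)

def new_extraction_record(study_id):
    """Blank extraction form for one included study."""
    record = {"study_id": study_id}
    record.update({dim: None for dim in RE_AIM_DIMENSIONS})
    return record

record = new_extraction_record("Smith-2021")       # invented study ID
record["reach"] = "142 residents across 3 programs"  # illustrative value
record["maintenance"] = "not reported"

# Dimensions still unextracted are easy to surface for follow-up.
missing = [d for d in RE_AIM_DIMENSIONS if record[d] is None]
print(missing)  # ['effectiveness', 'adoption', 'implementation']
```

Structuring extraction this way also makes it straightforward to report, per dimension, how many included studies addressed each aspect of implementation.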
Not Looking Beyond Usual Sources
Identifying medical education literature can be challenging due to its dispersion across biomedical, education, and social science journals, as well as conference proceedings and gray literature sources. Compounding this difficulty is the inconsistency in indexing terms used to describe relevant publications, which can hinder comprehensive searching.49,50 To ensure broad and inclusive retrieval, authors should look beyond usual sources and adopt a multiple database search strategy that includes MEDLINE, ERIC, EMBASE, PsycINFO, CINAHL, Scopus, and others, as appropriate. Additionally, preprint servers should be considered, because medical education literature is increasingly being disseminated as preprints prior to peer review.51 For example, medRxiv includes a dedicated category for medical education and can be a valuable source of emerging literature. Importantly, authors also should make intentional efforts to include non-English-language publications. Restricting knowledge syntheses to English-only sources, which is a common tendency,52,53 can result in a narrow understanding of educational practices, privileging certain systems, values, and assumptions.
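A multidatabase strategy is typically assembled from concept blocks: synonyms within a concept are OR'd together, and the concepts are then AND'd. The sketch below is a simplified, hypothetical illustration of that assembly (the concepts and terms are invented); real strategies also use database-specific field tags and controlled vocabulary such as MeSH, so a string like this must be translated for each database searched.

```python
# Hypothetical sketch: combining synonym lists (concept blocks)
# into a single boolean search string. Multiword terms are quoted
# so they are searched as phrases.

def build_query(concepts):
    """OR synonyms within each concept; AND the concepts together."""
    blocks = []
    for synonyms in concepts:
        quoted = [f'"{term}"' if " " in term else term for term in synonyms]
        blocks.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(blocks)

# Invented concept blocks for an illustrative search
concepts = [
    ["medical education", "health professions education"],
    ["feedback", "debriefing"],
]

query = build_query(concepts)
print(query)
# ("medical education" OR "health professions education") AND (feedback OR debriefing)
```

Keeping the strategy as structured concept blocks, rather than a hand-edited string, makes it easier to document the search for reporting and to re-run it consistently across MEDLINE, ERIC, EMBASE, and the other databases named above.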
FUTURE DIRECTIONS
Living Reviews
Medical education knowledge syntheses, like those in other fields,54 have been criticized for being infrequently updated,42 which means they can become outdated and inaccurate. One approach to addressing this is the living review, a form of knowledge synthesis that is continually updated to incorporate new evidence as it becomes available, ensuring that the review remains current and relevant over time.55 Unlike traditional reviews, which represent the available evidence at a single point in time, living reviews are particularly useful in rapidly evolving fields or when timely guidance is needed for policy or practice.55 For example, multiple living reviews are aimed at answering critical questions related to COVID-19.56 Living reviews can enhance the usefulness of syntheses by reducing the lag between evidence generation and use, but they also demand sustained resources, clear protocols for when and how updates are made, and effective communication strategies to signal changes to end users.57 Despite these challenges, living reviews hold promise for improving the responsiveness and relevance of medical education research, especially in areas where guidance must evolve alongside emerging data.
Knowledge Translation
Knowledge syntheses are traditionally disseminated as journal articles, which can be jargon-dense and inaccessible behind subscription paywalls, limiting their practical impact. To enhance uptake, authors should consider how to move beyond traditional dissemination by using knowledge translation strategies tailored to their audiences. For example, visual summaries such as infographics, conceptual frameworks, or summary tables can distill complex findings into formats more usable by time-constrained educators.1 Similarly, plain-language summaries can help bridge the gap between research and practice, making evidence more understandable to nonspecialist stakeholders, including learners or institutional leaders.58 Dissemination via podcasts and social media platforms like Twitter/X or LinkedIn also can expand reach and stimulate discussion.41 More interactive approaches, such as developing searchable databases of studies or cohosting workshops to interpret findings with stakeholders, can foster deeper engagement and cocreation of actionable outcomes.25 Toolkits and practical implementation guides are additional formats that translate findings into actionable steps.59
Artificial Intelligence
The use of AI to conduct knowledge syntheses can streamline resource-intensive tasks such as literature searching and screening, data extraction, and evidence summarization.60 For example, one author team reported completing a systematic review in 2 weeks using AI61 to search for and determine inclusion of articles and draft the manuscript, saving over a year compared with typical timelines for syntheses conducted by humans alone.37 While these efficiencies are compelling, AI has limitations, and individuals are encouraged to use it thoughtfully and appropriately.62 AI tools may introduce bias, lack contextual understanding, or misinterpret nuanced data, especially in the complex step of data extraction.63 Thus, adopting an approach in which AI supplements, but does not replace, human involvement is critical. Humans remain vital for interpreting context, assessing relevance, and ensuring methodological rigor. Without human oversight, AI-generated outputs may appear convincing but lack the depth, accuracy, or educational relevance needed for meaningful knowledge translation. Additionally, authors should select AI tools carefully, ensuring that what is selected is appropriate for their task. As the use of AI continues to evolve, researchers should consider using it while remaining attentive to both its possibilities and limitations.
Knowledge syntheses are vital to medical education. When conducted thoughtfully, they not only inform curriculum design, teaching strategies, and policy decisions but also bridge the gap between research and practice by translating complex evidence into actionable insights. To realize their full potential, knowledge syntheses must be inclusive, methodologically sound, and focused on relevance and impact. As the field evolves, integrating innovations such as community engagement, AI, and living reviews will help ensure that syntheses remain timely and applicable.
References
- Chapman E, Haby MM, Toma TS, et al. Knowledge translation strategies for dissemination with a focus on healthcare recipients: an overview of systematic reviews. Implement Sci. 2020;15(1). doi:10.1186/s13012-020-0974-3
- Gordon M, Carneiro AV, Patricio MF. Enhancing the impact of BEME systematic reviews on educational practice. Med Teach. 2015;37(8):789–790. doi:10.3109/0142159X.2015.1042437
- Maggio LA, Costello JA, Norton C, Driessen EW, Artino AR Jr. Knowledge syntheses in medical education: a bibliometric analysis. Perspect Med Educ. 2021;10(2):79–87. doi:10.1007/S40037-020-00626-9
- Maggio LA, Sewell JL, Artino AR Jr. The literature review: a foundation for high-quality medical education research. J Grad Med Educ. 2016;8(3):297–303. doi:10.4300/JGME-D-16-00175.1
- Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J. 2009;26(2):91–108. doi:10.1111/j.1471-1842.2009.00848.x
- Tricco AC, Soobiah C, Antony J, et al. A scoping review identifies multiple emerging knowledge synthesis methods, but few studies operationalize the method. J Clin Epidemiol. 2016;73:19–28. doi:10.1016/j.jclinepi.2015.08.030
-
- Cooke A, Smith D, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qual Health Res. 2012;22(10):1435–1443. doi:10.1177/1049732312452938
- Cumpston M, Flemyng E, Thomas J, et al. Introduction. In: Higgins JPT, Thomas J, eds. Cochrane Handbook for Systematic Reviews of Interventions. Cochrane; 2024. https://training.cochrane.org/handbook
- Jay R, Sandars J, Patel R, et al. The use of virtual patients to provide feedback on clinical reasoning: a systematic review. Acad Med. 2025;100(2):229–238. doi:10.1097/ACM.0000000000005908
- Deeks JJ, Higgins JPT, Altman DG, McKenzie JE, Veroniki AA. Chapter 10: Analysing data and undertaking meta-analyses. In: Higgins JPT, Thomas J, eds. Cochrane Handbook for Systematic Reviews of Interventions. Cochrane; 2024. https://training.cochrane.org/handbook
- He L, Yang N, Xu L, et al. Synchronous distance education vs traditional education for health science students: a systematic review and meta-analysis. Med Educ. 2021;55(3):293–308. doi:10.1111/medu.14364
-
- Maggio LA, Samuel A, Stellrecht E. Systematic reviews in medical education. J Grad Med Educ. 2022;14(2):171–175. doi:10.4300/JGME-D-22-00113.1
- Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. doi:10.1136/bmj.n71
- Maggio LA, Larsen K, Thomas A, Costello JA, Artino AR Jr. Scoping reviews in medical education: a scoping review. Med Educ. 2021;55(6):689–700. doi:10.1111/medu.14431
- Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32. doi:10.1080/1364557032000119616
- Thomas A, Lubarsky S, Durning SJ, Young ME. Knowledge syntheses in medical education: demystifying scoping reviews. Acad Med. 2017;92(2):161–166. doi:10.1097/ACM.0000000000001452
- Gordon M, Daniel M, Ajiboye A, et al. A scoping review of artificial intelligence in medical education: BEME Guide No. 84. Med Teach. 2024;46(4):446–470. doi:10.1080/0142159X.2024.2314198
- Phillips JP. Scoping reviews: mapping new terrain in family medicine education. Fam Med. 2020;52(4):241–242. doi:10.22454/FamMed.2020.477073
- Meyer R, Archer E, Smit L. The positive influence of the hidden curriculum in medical education: a scoping review. Med Sci Educ. 2025;35(3):1817–1826. doi:10.1007/s40670-025-02380-1
- Maggio LA, Thomas A, Durning SJ. Knowledge syntheses. In: Swanwick T, O’Brien BC, eds. Understanding Medical Education. 3rd ed. Wiley; 2018. doi:10.1002/9781119373780
- Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5(1):69. doi:10.1186/1748-5908-5-69
-
- Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–473. doi:10.7326/M18-0850
- Pollock M, Fernandes RM, Becker LA, Pieper D, Hartling L. Overviews of reviews. In: Higgins JPT, Thomas J, eds. Cochrane Handbook for Systematic Reviews of Interventions. Cochrane; 2024. https://training.cochrane.org/handbook
- Samuel A, Cervero RM, Durning SJ, Maggio LA. Effect of continuing professional development on health professionals’ performance and patient outcomes: a scoping review of knowledge syntheses. Acad Med. 2021;96(6):913–923. doi:10.1097/ACM.0000000000003899
-
- Onyura B, Baker L, Cameron B, Friesen F, Leslie K. Evidence for curricular and instructional design approaches in undergraduate medical education: an umbrella review. Med Teach. 2016;38(2):150–161. doi:10.3109/0142159X.2015.1009019
- Ioannidis JPA. Integration of evidence from multiple meta-analyses: a primer on umbrella reviews, treatment networks and multiple treatments meta-analyses. CMAJ. 2009;181(8):488–493. doi:10.1503/cmaj.081086
- Aromataris E, Fernandez R, Godfrey CM, Holly C, Khalil H, Tungpunkom P. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach. Int J Evid Based Healthc. 2015;13(3):132–140. doi:10.1097/XEB.0000000000000055
- Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review - a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(suppl 1):21–34. doi:10.1258/1355819054308530
- Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11(1):21. doi:10.1186/1741-7015-11-21
- Wong G, Greenhalgh T, Westhorp G, Pawson R. Realist methods in medical education research: what are they and what can they contribute? Med Educ. 2012;46(1):89–96. doi:10.1111/j.1365-2923.2011.04045.x
- Krystallidou D, Kersbergen MJ, de Groot E, et al. Interprofessional education for healthcare professionals. A BEME realist review of what works, why, for whom and in what circumstances in undergraduate health sciences education: BEME Guide No. 83. Med Teach. 2024;46(12):1607–1624. doi:10.1080/0142159X.2024.2314203
- Ajjawi R, Kent F. Understanding realist reviews for medical education. J Grad Med Educ. 2022;14(3):274–278. doi:10.4300/JGME-D-22-00334.1
- Borah R, Brown AW, Capers PL, Kaiser KA. Analysis of the time and workers needed to conduct systematic reviews of medical interventions using data from the PROSPERO registry. BMJ Open. 2017;7(2). doi:10.1136/bmjopen-2016-012545
- Pham B, Bagheri E, Rios P, et al. Improving the conduct of systematic reviews: a process mining perspective. J Clin Epidemiol. 2018;103:101–111. doi:10.1016/j.jclinepi.2018.06.011
-
- Motu’apuaka M, Whitlock E, Kato E, et al. Defining the benefits and challenges of stakeholder engagement in systematic reviews. CER. 2015;5:13. doi:10.2147/CER.S69605
- Chan TM, Dzara K, Dimeo SP, Bhalerao A, Maggio LA. Social media in knowledge translation and education for physicians and trainees: a scoping review. Perspect Med Educ. 2020;9(1):20–30. doi:10.1007/S40037-019-00542-7
- Maggio LA, Thomas A, Chen HC, et al. Examining the readiness of best evidence in medical education guides for integration into educational practice: a meta-synthesis. Perspect Med Educ. 2018;7(5):292–301. doi:10.1007/S40037-018-0450-9
- Onyura B, Légaré F, Baker L, et al. Affordances of knowledge translation in medical education: a qualitative exploration of empirical knowledge use among medical educators. Acad Med. 2015;90(4):518–524. doi:10.1097/ACM.0000000000000590
- Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327. doi:10.2105/ajph.89.9.1322
- Holtrop JS, Estabrooks PA, Gaglio B, et al. Understanding and applying the RE-AIM framework: clarifications and resources. J Clin Transl Sci. 2021;5(1). doi:10.1017/cts.2021.789
- Harden SM, Gaglio B, Shoup JA, et al. Fidelity to and comparative results across behavioral interventions evaluated through the RE-AIM framework: a systematic review. Syst Rev. 2015;4(1). doi:10.1186/s13643-015-0141-0
-
- Racey M, Jovkovic M, Alliston P, Sherifali D. Applying the RE-AIM implementation framework to evaluate diabetes health coaching in individuals with type 2 diabetes: a systematic review and secondary analysis. Front Endocrinol. 2022;13:1069436. doi:10.3389/fendo.2022.1069436
- Maggio LA, Ninkov A, Frank JR, Costello JA, Artino AR Jr. Delineating the field of medical education: bibliometric research approach(es). Med Educ. 2022;56(4):387–394. doi:10.1111/medu.14677
- Amar-Zifkin A, Ekmekjian T, Paquet V, Landry T. Algorithmic indexing in MEDLINE frequently overlooks important concepts and may compromise literature search results. J Med Libr Assoc. 2025;113(1):39–48. doi:10.5195/jmla.2025.1936
- Maggio LA, Costello JA, Artino AR Jr. Describing the landscape of medical education preprints on medRxiv: current trends and future recommendations. Acad Med. 2024;99(9):981–986. doi:10.1097/ACM.0000000000005742
- Stern C, Kleijnen J. Language bias in systematic reviews: you only get out what you put in. JBI Evid Synth. 2020;18(9):1818–1819. doi:10.11124/JBIES-20-00361
- Walpole SC. Including papers in languages other than English in systematic reviews: important, feasible, yet often omitted. J Clin Epidemiol. 2019;111:127–134. doi:10.1016/j.jclinepi.2019.03.004
- Garner P, Hopewell S, Chandler J, et al. When and how to update systematic reviews: consensus and checklist. BMJ. 2016;354:i3507. doi:10.1136/bmj.i3507
- Elliott JH, Synnot A, Turner T, et al. Living systematic review: 1. Introduction - the why, what, when, and how. J Clin Epidemiol. 2017;91:23–30. doi:10.1016/j.jclinepi.2017.08.010
- De Silva K, Turner T, McDonald S. Cochrane’s COVID-19 living systematic reviews: a mixed-methods study of their conduct, reporting and currency. Cochrane Evid Synth Methods. 2025;3(3). doi:10.1002/cesm.70024
- Thomas J, Noel-Storr A, Marshall I, et al. Living systematic reviews: 2. Combining human and machine effort. J Clin Epidemiol. 2017;91:31–37. doi:10.1016/j.jclinepi.2017.08.011
- Wilson MG, Lavis JN, Travers R, Rourke SB. Community-based knowledge transfer and exchange: helping community-based organizations link research to action. Implement Sci. 2010;5:33. doi:10.1186/1748-5908-5-33
- Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24. doi:10.1002/chp.47
- Blaizot A, Veettil SK, Saidoung P, et al. Using artificial intelligence methods for systematic review in health sciences: a systematic review. Res Synth Methods. 2022;13(3):353–362. doi:10.1002/jrsm.1553
- Clark J, Glasziou P, Del Mar C, Bannach-Brown A, Stehlik P, Scott AM. A full systematic review was completed in 2 weeks using automation tools: a case study. J Clin Epidemiol. 2020;121:81–90. doi:10.1016/j.jclinepi.2020.01.008
- van Dijk SHB, Brusse-Keizer MGJ, Bucsán CC, van der Palen J, Doggen CJM, Lenferink A. Artificial intelligence in systematic reviews: promising when appropriately used. BMJ Open. 2023;13(7). doi:10.1136/bmjopen-2023-072254
- Khalil H, Ameen D, Zarnegar A. Tools to support the automation of systematic reviews: a scoping review. J Clin Epidemiol. 2022;144:22–42. doi:10.1016/j.jclinepi.2021.12.005