When we work with small groups of learners in our home environments, we can easily find ourselves so deep in the weeds of local implementation that we miss the forest at the edge of the field. Occasionally, we need to take a breath, look up, set our gaze on the horizon, and see how our work fits into the national and international contexts. A scoping review is a way of mapping these larger contexts that form the landscape of a problem. Such a map can help us see the whole, from a perspective we miss when we are on the ground.
Competency-based medical education has become the new standard for implementing and assessing family medicine graduate medical education in the United States and Canada. Both countries have set national guidelines for competency expectations, and adherence to these guidelines has become the standard of educational practice.1 Many of us in residency programs in both countries are working hard to implement competency-based models in a way that is meaningful in our local circumstances. In this issue of Family Medicine, Craig Campbell, MD, and colleagues describe a map to help us with this implementation in both family medicine residencies and continuing professional development.
While there is a lot of research about developing the foundations for implementation of competency-based education, few studies have focused on actual implementation. Those that do have tended to focus on the fidelity of implementation—that is, whether curricula and evaluation were being implemented as intended. Most of these have focused on faculty development and the process of adapting old curricula to meet the new standard. Only one study actually described the outcomes of competency-based medical education. Further, the vast majority of studies were implemented at the program or institutional level; none evaluated the impact of competency-based education on individual learners. The authors also found no research examining competency-based education for practicing physicians.1
Taken as a whole, the findings of Campbell and colleagues suggest that much more work needs to be done to understand the value of competency-based medical education. We have been working to implement a new process based on an attractive theory, but we are doing so without evidence of improved learning or better-trained residents. We are in the middle of an international experiment, and we are not measuring its outcomes at the level that matters most. Working in the weeds, with research being conducted all around us, we could easily miss this important message. The map reminds us that as we work, we should not just describe processes but also evaluate impacts.
Scoping reviews differ from systematic reviews in key ways. While systematic reviews typically attempt to answer a focused research question, scoping reviews approach a broader problem. The nature and process of the literature search and review may evolve over the course of the scoping review as the authors attempt to follow the clues they find.2 Scoping reviews do not delve deeply into the quality of articles, and do not come to definitive, evidence-based conclusions. Rather, they describe the landscape of the research. What kinds of studies have been conducted? What questions have been asked? Where is the research being done? What are the key concepts and subtopics? What terms and definitions are being used? Perhaps most importantly, where are the barren patches of land, ripe for planting, that no one has noticed? Scoping reviews show us gaps in the literature, identifying new areas of study and innovation.3
In medical education, we are often challenged by limited time and resources to conduct research. Typically, educating learners is our primary goal, and the evaluation of our work is a lower priority. Many of us are physicians with little or no formal research training, and we may be intimidated by the research process. For all these reasons, it is easy for us to only imagine what we have already seen and conduct the same types of research over and over. As a result, many of our educational practices have little or no evidence to support their efficacy.
Scoping reviews are gaining in popularity. A search across all of PubMed for articles with “scoping” in the title shows exponential growth in the number of hits: more than 1,600 publications in 2019. Yet a simple PubMed search for “scoping” (in title or abstract) identified only one study previously published in Family Medicine,4 and none published in Annals of Family Medicine. In family medicine, perhaps more than any other discipline, we understand the value of taking a broad perspective, so we should do more of this work.
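For readers who wish to reproduce this kind of count, the sketch below shows one way to query PubMed's public E-utilities interface for titles containing "scoping," year by year. The endpoint and search syntax are standard NCBI E-utilities; the helper function name and year range are illustrative assumptions rather than part of the search described above, and the counts returned will depend on when the query is run.

```python
# A minimal sketch, assuming Python with the requests library installed.
# Queries NCBI's E-utilities "esearch" endpoint for PubMed records with
# "scoping" in the title, one publication year at a time.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def count_scoping_titles(year: int) -> int:
    """Return the number of PubMed records with 'scoping' in the title
    published in the given year ([pdat] restricts by publication date)."""
    params = {
        "db": "pubmed",
        "term": f"scoping[Title] AND {year}[pdat]",
        "retmode": "json",
        "retmax": 0,  # only the count is needed, not the record IDs
    }
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    return int(response.json()["esearchresult"]["count"])


if __name__ == "__main__":
    # Hypothetical year range chosen only to illustrate the growth curve.
    for year in range(2010, 2020):
        print(year, count_scoping_titles(year))
```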
Because of their breadth, scoping reviews are difficult to conduct and are likely to require a substantial investment of resources, mostly the time and patience of the investigators. Wandering around a large territory, trying to make sense of a diverse landscape, can be overwhelming; reviews like this are not for the faint of heart. At the same time, researchers who embark on a scoping adventure can look forward to unexpected discoveries.
From the work of Campbell and colleagues, we can conclude that more scholarship is needed in several specific areas. First, the field lacks a common framework defining what it means to be a competency-based medical education program. Development of a consensus definition will allow researchers to create tools and measures to evaluate such programs. Researchers should also examine new approaches to clinical teaching in a competency-based context and consider how competency-based education integrates with practice improvement. Educators should evaluate learners’ attainment of competencies as research outcomes, particularly focusing on learning at the individual level. Finally, more work is needed evaluating competency-based education as an approach to practicing physicians’ continuing professional development.1
Our discipline needs more scoping reviews because they offer something no other kind of review does. Without maps, how will we know where to go?