Implementing Competency-Based Medical Education in Family Medicine: A Narrative Review of Current Trends in Assessment

Natalia Danilovich, PhD, MD, MSc | Simon Kitto, PhD | David W. Price, MD | Craig Campbell, MD | Amanda Hodgson, MLIS | Paul Hendry, MD, MSc

Fam Med. 2021;53(1):9-22.

DOI: 10.22454/FamMed.2021.453158


Background and Objectives: The implementation of effective competency-based medical education (CBME) relies on building a coherent and integrated system of assessment across the continuum of training to practice. As such, the developmental progression of competencies must be assessed at all stages of the learning process, including continuing professional development (CPD). Yet much of the recent discussion has centered on residency programs. The purpose of this review is to synthesize the findings of studies spanning the last 2 decades that examined competency-based assessment methods used in family medicine residency and CPD, and to identify gaps in their current practices.

Methods: We adopted a modified form of narrative review and searched five online databases and the gray literature for articles published between 2000 and 2020. Data analysis involved mixed methods including quantitative frequency analysis and qualitative thematic analysis.

Results: Thirty-seven studies met inclusion criteria. Fourteen were formal evaluation studies that focused on the outcome and impact evaluation of assessment methods. Articles that focused on formative assessment were prevalent. The most common levels of educational outcomes were performance and competence. There were few studies on CBME assessment among practicing family physicians. Thematic analysis of the literature identified several challenges the family medicine educational community faces with CBME assessment.

Conclusions: We recommend that those involved in health education systematically evaluate and publish their CBME activities, including assessment-related content and evaluations. The highlighted themes may offer insights into ways in which current CBME assessment practices might be improved to align with efforts to improve health care.

Competency-based medical education (CBME) is an outcomes-based approach to the design, implementation, assessment, and evaluation of an educational program. It uses an organized framework of competencies that ultimately may lead to better care for patients.1 By definition, CBME demands a robust and multifaceted assessment system that embraces continuous evaluation of trainees.2,3 Accurate assessment of competence, or the “meaningful assessment of competence,”4 requires continuing, serial, and direct observation of workplace behavior and monitoring of progress based on feedback.3,5

Since the implementation of effective CBME depends on mounting high-quality assessment practices, building a coherent and integrated system of assessment across the continuum of training to practice is critical.6 As such, the developmental progression of competencies must be assessed at all stages of the learning process, including continuing professional development (CPD).6-11 Yet much of the recent discussion of CBME implementation has centered on residency programs,1,12-15 including family medicine residency programs in Canada16-19 and the United States.20-23 Our recent scoping review24 on the current state of CBME implementation in family medicine residency and CPD programs has also demonstrated the paucity of studies on the applicability or relevance of CBME to physicians in practice.

Family medicine was one of the earliest adopters of the CBME framework in Canada and the United States, with competency-based programs implemented in residency training on a nationwide scale.12,16,22,25,26 The challenge is that the studies that have focused on examining competency-based assessment tools (eg, identifying challenges, providing criteria, basic principles, and guidance for good assessment)4,6,11,27-30 have not paid sufficient attention to the overall implementation of assessment practices, particularly in family medicine. In other words, no studies have thematically and systematically synthesized the current family medicine literature pertaining to overall CBME assessment implementation and identified gaps in existing knowledge.

The purpose of this review was to synthesize the findings of studies spanning the last 2 decades that examined competency-based assessment methods used in family medicine residency and CPD, in order to identify gaps in current practices in implementing competency-based assessment tools. Given the breadth of this topic, we specifically focused on the following research questions:

  1. What competency frameworks were described in the included studies?
  2. What assessment methods and assessment systems were discussed?
  3. Was the purpose of the assessment formative, summative, or a combination?
  4. What types of educational outcomes, outcome frameworks, and outcome levels were described in the family medicine literature?

For the purpose of this paper, we focused primarily on CBME for use in CPD in family medicine. However, little has been written about practical CBME implementation within CPD in any specialty,11,31,32 particularly in family medicine.33 Therefore, we also searched the family medicine residency training literature to identify findings potentially pertinent and transferable to CPD. The findings will facilitate our understanding of how CBME implementation practices in residency training might inform the design and operationalization of CBME implementation strategies in CPD.


We used a narrative review approach34-40 to thematically synthesize the current literature and identify gaps in existing knowledge. The present review sets out to identify current studies on competency-based assessment implementation in family medicine residency and CPD.

To avoid potential pitfalls associated with narrative reviews, such as selection bias, lack of diversity in sources, or drawing conclusions based more on opinion than data, the preparation of narrative reviews must apply the methodological rigor of systematic reviews.35,37,40-42 Specifically, we employed a systematic approach to both steps: (1) selecting studies to be included, and (2) extracting information from primary articles.35,43,44 A systematic approach implies grouping and analyzing sources with similar findings and/or the same level of evidence. This can be done by placing data from the selected sources in tables and analyzing the data in the main body, without duplicating information.42

A modified form of narrative review, as described by Popay et al,45 Ferrari,37 and Baethge et al,40 was adopted whereby data extraction enabled synthesis of key data, while also allowing rich narrative description.37 The steps involved, outlined below, were adapted from Shaw and colleagues35 and modified by our research team.

Search Strategy

A preliminary search of the literature was undertaken to see what work in the area of interest had been published and to verify that no similar review had been published already.35,44,46 Our scoping search revealed no existing reviews of the state of knowledge regarding CBME assessment implementation in family medicine residency and/or CPD.

To improve the method of literature selection and to reduce the risk of suboptimal reporting,37 our modified search strategy employed components from a systematic review methodology (PRISMA), which involves screening titles and abstracts as well as data extraction techniques.47 The strategy (selecting the databases and defining the inclusion and exclusion criteria and the search terms) was developed by authors S.K., N.D., and the Health Sciences librarian A.H. The initial search (from 2000 through April 30, 2017) was conducted on April 28, 2017, and a follow-up search was done on May 5, 2020, using the following five electronic databases: MEDLINE, ERIC, PsycINFO, Embase, and EdSource. Additionally, we searched government-related and relevant professional organizations’ websites. As the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project started in 1999, we restricted our search to the literature published after 2000 to capture information that was relevant to CBME in family medicine. We included all original research, review articles, editorials/commentaries, and regulatory papers.48 The full search strategy and the specific search terms, which were identified through input from the research team and an academic librarian, are provided in the Supplemental Digital Appendix (https://journals.stfm.org/media/3555/hendry-supp-appendix1.pdf).

Inclusion and Exclusion Criteria

Studies that fulfilled the following criteria were included:

  1. focused exclusively on family medicine residency or CPD programs. Any undergraduate medical education article was excluded;
  2. discussed CBME;
  3. were American or Canadian in origin;
  4. published in English;
  5. available in a full-text version (the articles were requested through the library if immediate full-text versions were not discovered).

Study Selection

The first reviewer (N.D.) screened and read the titles and abstracts of all identified studies.43,44 Duplicates, studies that met the exclusion criteria, and studies that did not meet the inclusion criteria were removed. N.D. then screened full-text documents and excluded those not relevant. Questionable cases were read by the second reviewer (S.K.), and a joint decision was reached on whether to include them for further review.

Data Extraction

Step 1. We developed a first version of a standardized data extraction form based on the literature.4,5,18,49-55 We piloted the data extraction form with all team members three times.

Step 2. The final data extraction form consisted of two parts. We created the first part to draw key demographic characteristics from the articles (publication year, publication type, study design, year, author, country, target audience; see Table 1). The second part of the extraction form included the coding concepts developed through a review of the published literature and revised in consultation with the research team. The definitions for these coding concepts (competency framework, assessment method, assessment purpose, outcomes model, type of educational outcome, and level of outcome) are shown in Table 2.4,5,16,18,27,56-82 N.D. collected the data on general characteristics of the articles. Several reviewers independently extracted the key elements of CBME assessments; disagreements were first addressed through discussion, and if they could not be resolved, further checks were made. For any ambiguous items, the principal investigator (S.K.) reviewed the article and made the final decision. We tabulated and analyzed the data.

Data Analysis

We used an inductive approach36,44,83,84 to reflect upon the “landscape of events”85 of competency-based assessment in family medicine residency and CPD in a North American context. The themes identified were strongly linked to the raw data and were not necessarily related to the research questions posed.34,86 The expert/research team members discussed the results of the review to gain an overall understanding of the trends and nature of competency-based assessments evident in family medicine residency and CPD literature during the last 2 decades.36,84


Study Selection

The original search yielded 1,222 potentially relevant citations. After deduplication and relevance screening, 185 citations met the eligibility criteria based on title and abstract and the corresponding full-text articles were procured for review. After data characterization of the full-text articles, we retained 37 articles in the analysis (Figure 1), with 148 being excluded for one of the following reasons: no evidence of CBME assessment tools description (n=62); no evidence of the CBME concept used (n=30); irrelevant to family medicine field (n=27); oral/poster presentations (n=17); not a Canadian or US article (n=7); and undergraduate medical education (n=5).

Study Characteristics

The 37 studies5,18,20,33,55,60-63,66,67,69,87-111 were published between January 2000 and May 2020, with nearly half of the articles (17/37; 46%) published within the past 5 years (2015-2020). The general characteristics of the articles included in this study are reported in Table 1. The majority (21/37; 57%) of all studies originated from the United States. Research articles (24/37; 65%) and commentary/reflective papers (8/37; 22%) comprised most documents; articles characterized as editorial (2/37; 5%), review (2/37; 5%), and regulatory (1/37; 3%) were underrepresented. Among 24 studies eligible for classification by a type of research paradigm, 50% (12/24) used a quantitative approach, 38% (9/24) used a qualitative approach, and the remaining used mixed methods (3/24; 12%).

Almost all studies (36/37; 97%) were identified in published literature. Only one article was found in the gray literature; the article was published on the College of Family Physicians of Canada (CFPC) website. The articles were published in 13 different journals, although two-thirds of the papers (25/37; 67%) were concentrated in three journals. These included 37% (14/37) of studies published in Family Medicine, 19% (7/37) in Canadian Family Physician, and 11% (4/37) in Academic Medicine. More than half of the studies (23/37; 62%) reported residents being their target population followed by both faculty and residents (11/37; 30%), and family physicians (3/37; 8%).

Frequency of Coding Concepts Across the Data Set: Competency Framework, Assessment Method, Assessment Purpose, Type of Educational Outcome, Level of Outcome, and Stage of Assessment Implementation

Table 2 provides the definitions and summarizes coding frequencies of the eight coding concepts across the data set. The ACGME/ABMS framework was discussed most frequently (21/37; 57%), followed by the CanMEDS-FM (14/37; 38%) and Triple C frameworks (2/37; 5%). While the majority of the articles (20/37; 54%) focused on individual assessment methods, the remaining studies discussed assessment systems (13/37; 35%) as reviewed in Table 2. Just over half of the studies (20/37; 54%) focused solely on the formative purpose of assessment, but 13 (35%) papers discussed both formative and summative assessments. None of the studies concentrated exclusively on summative assessment.

The coding concept “outcomes model” was retrieved from one source only. We mapped the different types of educational outcomes (highlighted in Table 2) to the assessment frameworks of Moore et al,50 Miller,74 and Kirkpatrick and Kirkpatrick,75 as suggested by Price et al.73 The largest share of studies (17/37; 46%) targeted program-level outcomes, followed by individual-level outcomes (12/37; 32%). Eight papers (22%) described both levels of outcomes. We grouped all articles by stage of assessment method implementation, ranging from “development of assessment methods” to “implementation and initial evaluation of assessment methods” and “outcome and impact evaluation of assessment methods.” Nearly half of the studies (18/37; 49%) described assessment tools at their initial stage of implementation (eg, defining, designing, and planning of assessment instruments), while the other half (19/37; 51%) discussed partially implemented (5/37; 13%) or fully implemented and evaluated assessment methods (14/37; 38%).

Differences and Similarities Between Family Medicine Residency and CPD

Table 2 also highlights the frequency of the coding concepts among the three categories of articles: residency articles, CPD articles, and CPD/residency articles. Direct observation was the only assessment method described in CPD articles, while CPD/residency studies more often discussed multiple assessment tools (3/11; 27%). Residency articles focused equally on two assessment methods: direct observation (4/23; 17%) and the competency-based achievement system (4/23; 17%). The individual level of educational outcomes was used more often in CPD/residency studies (6/11; 55%), whereas program-level outcomes were most common in residency (12/23; 52%) and CPD studies (3/3; 100%). Finally, all CPD studies and almost half of residency studies (11/23; 48%) described assessment tools at their initial stage of implementation (eg, defining, designing, and planning of assessment instruments). In contrast, nearly half of CPD/residency papers (5/11; 46%) discussed fully implemented and evaluated assessment methods.

Emergent Themes

Through the process of iterative reading and discussion of the literature by the research team, three broad themes emerged: ways to improve assessment methods, assessors’ needs and challenges, and learners’ needs and challenges. A summary of the themes and subthemes along with frequency counts are presented in Table 3.


The purpose of this review was to synthesize the findings of studies spanning the last 2 decades that examined the competency-based assessment methods used in family medicine residency and CPD, in order to identify gaps in current practices in implementing CBME assessments. Our analysis shows that there is a very small body of published work on competency-based assessments in family medicine residency and CPD.

The following discussion highlights the three major implications of our analysis: (1) trends in competency-based assessment, (2) challenges to implementing competency-based assessment, and (3) key elements for supporting competency-based CPD.

Trends in Competency-Based Assessment

The paucity of articles (14/37; 38%) reporting reliability of assessment methods, intention to use them, and impact of assessment instruments on residents and faculty is not surprising, given the lack of frameworks that define a CBME program,112,113 inconsistency around the CBME language,114-116 the difficulties inherent in assessing competence,5 and a limited focus on a broad range of issues related to fidelity of CBME implementation.24 Articles that focused solely on formative assessment were most prevalent among all three categories of articles (Table 2). This may reflect that CBME programs are paying increasing attention to competencies beyond knowledge. It may also reflect the use of formative rather than summative in-training examinations in residency programs.

The most common types of educational outcomes were performance (23/37; 62%) and competence (8/37; 22%; Levels 5 and 4 of Moore et al’s pyramid50), suggesting that a portfolio of formative assessment techniques can be effective at measuring competence and performance within any proposed outcomes-based framework.50,100 In contrast, we found no studies on the assessment of patient and community health outcomes, which may be explained by multiple challenges related to their measurement (eg, lack of available data, confounding due to multiple interventions).117 While summative assessment techniques are well established and have been proven effective for measuring knowledge, these assessment methods were less commonly identified in our review (13/37; 35%). Given program directors’ need to sign off on their residents’ preparedness to enter practice, this percentage (35%) appears low from a residency perspective. From a CPD perspective, it is rather high, because outside of board certification/maintenance of certification, most CPD programs were not designed for summative evaluation, but rather to promote continuous improvement of practice.

Challenges to Implementing Competency-Based Assessment

Based on the thematic analysis of the literature, we identified that the Family Medicine educational community faces several challenges with CBME assessment (Table 3). Those challenges reflect some of the current trends in the general medical educational literature regarding competency-based assessment.4,11,118,119

We found that the most frequently raised concern for assessors was the need for standardized assessment training. There were calls for faculty development to focus on the effective use of field notes and observation, on improving interrater consistency and reliability, on how to effectively teach and evaluate the intrinsic roles (communicator, collaborator, leader, health advocate, scholar, and professional), and on serving as faculty advisors. These results are in line with many studies identifying gaps in the skills faculty require to consistently assess competencies as part of the redesign of residency training programs.28,120,121 Additionally, faculty overload was mentioned as a major challenge, particularly the difficulty of integrating assessments within workflow in a clinical setting, which is in agreement with previous studies.27,122

In contrast, the most common concern for learners was the need for better constructive feedback, followed by the need for residents to receive better orientation to expectations and processes prior to the start of sessions. It has been shown that learners actively seek out critical feedback to help them accomplish competency goals in order to advance through the stages of formal education.123

The theme of “ways to improve assessment methods” included several subthemes, of which “having end-user input for the evaluation of the assessment tools” was the most frequently cited. Feedback from residents, advisors, program directors, program and site administrators, and off-service preceptors was suggested as a means of improving the assessment tools. The active engagement of learners in their own assessment and feedback from faculty are both important, as they ensure the ultimate acceptability of an assessment tool by the key stakeholders.5,124-127

Key Elements for Supporting Competency-Based CPD

Finally, despite continuing discussion about the role of CPD in building CBME assessment across the continuum of training to practice,4,6,31,128 the main gap in the educational literature we uncovered was a total absence of studies related to the ongoing assessment of practicing physicians. This seems to be in line with the overall lack of formal evaluation studies in family medicine CPD reported previously by our group.33 Likewise, our recent scoping review has revealed a shortage of scholarship on CBME implementation practices within family medicine CPD.24 One elephant in the room is the disjointed nature of contemporary medical education129: the silos that exist between undergraduate, graduate, and continuing medical education. Given that the continuum-of-learning ideal has yet to be realized, we advocate for closer collaboration among the stakeholder organizations responsible for each level of medical training in order to design a comprehensive system of assessment.6,33,129 Additionally, the integration of efforts across the four domains (continuing education, knowledge translation, patient safety, and quality improvement) should aim to develop a systematic approach to competency assessment across the continuum of health professions education and practice.6,33,129 Without a structural basis for competency assessment across the continuum, the difficulties of evaluating competency-based assessment efforts will remain. We suggest that for CBME to be successfully applied to practice, strategies are needed to consistently integrate novel approaches to learning (direct observation, simulation, audit and feedback, multisource feedback, educational outreach visits, etc).
These new learning activities (which are both assessment activities and educational interventions) can provide physicians with ongoing assessment and feedback leading to clinical behavioral change.31 Unlike residents, physicians who have completed training often lack a formal system designed to provide personal support for learning and improvement. Recent recommendations that maintenance of certification systems evolve into systems of CPD that support learning and continuous improvement of practice are strategic attempts to address this specific concern.130,131 Traditional continuing medical education programs, the primary formal educational support structure for practicing physicians, are limited in their ability to change performance and patient outcomes. Practicing physicians need access to trusted practice data with the opportunity to review their data with a peer, coach, or mentor to enable identification of (and minimize resistance to) needed practice changes.132-135 Physicians will need access to resources and an infrastructure to support and motivate them to sustain practice change.136 While CBME is primarily directed toward individuals, improving health outcomes will likely require a team-based strategy along with educational and clinical care systems in order to provide the conditions enabling continuous improvement.

Strengths and Limitations

This is the first review study to examine the nature and trends of competency-based assessment methods in family medicine. Its strengths include the assembly of key content and methodological experts from diverse backgrounds.35-39 We acknowledge several limitations of this review. Our review was based on a small set of articles, which speaks to the limited number of publications in the field. In addition, the shortage of CPD articles restricted our ability to compare residency and CPD studies, so there may be interesting differences or similarities between the two stages of the learning continuum. Lastly, although we sought to search multiple databases including the gray literature, the scope of the search was limited to articles in English published in Canada and the United States, which have similar residency programs in family medicine.137


In this narrative review, we attempted to inform future approaches and research by analyzing and synthesizing the findings of publications that described available CBME assessment tools in family medicine residency and CPD. Our analysis shows: (1) a very small body of published work currently exists around competency-based assessments in family medicine; (2) a lack of studies on assessment methods among practicing physicians (a gap in ongoing assessment in clinical practice); and (3) common themes that may offer insights into how current assessment practices might be improved to ensure alignment with modern conceptions of health professional education for the ultimate goal of improved health care. We recommend that those involved in family medicine education strive to systematically evaluate and publish their CBME activities, including assessment-related content and evaluations. In addition, one important avenue for future research is to explore residents’ and faculty members’ perceptions of how new approaches to curriculum design and evaluation affect their learning and teaching. Finally, we suggest that in building a coherent and integrated system of assessment, evaluation of the key contextual factors across the continuum of education to practice is of increasing importance in the field of CPD.


  1. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-645. doi:10.3109/0142159X.2010.501190
  2. Norcini JJ, Holmboe ES, Hawkins RE. Evaluation challenges in the era of outcomes-based education. In: Holmboe ES, Hawkins RE, eds. Practical Guide to the Evaluation of Clinical Competence. Philadelphia, PA: Mosby/Elsevier; 2008:1-9.
  3. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR; For The International CBME Collaborators. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676-682. doi:10.3109/0142159X.2010.500704
  4. Lockyer J, Carraccio C, Chan M-K, et al; ICBME Collaborators. Core principles of assessment in competency-based medical education. Med Teach. 2017;39(6):609-616. doi:10.1080/0142159X.2017.1315082
  5. Ross S, Poth CN, Donoff M, et al. Competency-based achievement system: using formative feedback to teach and assess family medicine residents’ skills. Can Fam Physician. 2011;57(9):e323-e330.
  6. Eva KW, Bordage G, Campbell C, et al. Towards a program of assessment for health professionals: from training into practice. Adv Health Sci Educ Theory Pract. 2016;21(4):897-913. doi:10.1007/s10459-015-9653-6
  7. Campbell C, Silver I, Sherbino J, Cate OT, Holmboe ES. Competency-based continuing professional development. Med Teach. 2010;32(8):657-662. doi:10.3109/0142159X.2010.500708
  8. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32(8):631-637. doi:10.3109/0142159X.2010.500898
  9. Kruse J. Social accountability across the continuum of medical education: a call for common missions for professional, accreditation, certification, and licensure organizations. Fam Med. 2013;45(3):208-211.
  10. Iobst WF, Holmboe ES. Building the continuum of competency-based medical education. Perspect Med Educ. 2015;4(4):165-167. doi:10.1007/s40037-015-0191-y
  11. Harris P, Bhanji F, Topps M, et al; ICBME Collaborators. Evolving concepts of assessment in a competency-based world. Med Teach. 2017;39(6):603-608. doi:10.1080/0142159X.2017.1315071
  12. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648-654. doi:10.1080/01421590701392903
  13. Whitehead CR, Martin D, Fernandez N, et al. Integration of CanMEDS expectations and outcomes. Members of the FMEC PG consortium. 2011. https://www.afmc.ca/pdf/fmec/15_Whitehead_CanMEDS.pdf. Accessed September 17, 2018.
  14. Sherbino J, Kulasegaram K, Worster A, Norman GR. The reliability of encounter cards to assess the CanMEDS roles. Adv Health Sci Educ Theory Pract. 2013;18(5):987-996. doi:10.1007/s10459-012-9440-6
  15. Hawkins RE, Welcher CM, Holmboe ES, et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ. 2015;49(11):1086-1102. doi:10.1111/medu.12831
  16. Tannenbaum D, Kerr J, Konkin J, et al. Triple C competency-based curriculum. Report of the Working Group on postgraduate curriculum review – Part 1. Mississauga, ON: College of Family Physicians of Canada; 2011. www.cfpc.ca/uploadedFiles/Education/_PDFs/WGCR_TripleC_Report_English_Final_18Mar11.pdf. Accessed August 24, 2018.
  17. Iglar K, Whitehead C, Takahashi SG. Competency-based education in family medicine. Med Teach. 2013;35(2):115-119. doi:10.3109/0142159X.2012.733837
  18. McEwen LA, Griffiths J, Schultz K. Developing and successfully implementing a competency-based portfolio assessment system in a postgraduate family medicine residency program. Acad Med. 2015;90(11):1515-1526. doi:10.1097/ACM.0000000000000754
  19. Schultz K, Griffiths J. Implementing competency-based medical education in a postgraduate family medicine residency training program: a stepwise approach, facilitating factors, and processes or steps that would have been helpful. Acad Med. 2016;91(5):685-689. doi:10.1097/ACM.0000000000001066
  20. Edwards FD, Frey KA. The future of residency education: implementing a competency-based educational model. Fam Med. 2007;39(2):116-125.
  21. Tudiver F, Rose D, Banks B, Pfortmiller D. Reliability and validity testing of an evidence-based medicine OSCE station. Fam Med. 2009;41(2):89-91.
  22. Thomas RE, Kreptul D. Systematic review of evidence-based medicine tests for family physician residents. Fam Med. 2015;47(2):101-117.
  23. Jansen KL, Rosenbaum ME. The state of communication education in family medicine residencies. Fam Med. 2016;48(6):445-451.
  24. Campbell C, Hendry P, Delva D, Danilovich N, Kitto S. Implementing competency-based medical education in family medicine: A scoping review on residency programs and family practices in Canada and the USA. Fam Med. 2020;52(4):246-254. doi:10.22454/FamMed.2020.594402
  25. Boucher A, Frank JR, Van Melle E, Oandasan I, Touchie C. Competency-based medical education in Canada: a white paper commissioned by the AFMC Board of Directors; 2017.
  26. Ellaway RH, Palacios Mackay M, Lee S, et al. The impact of a national competency-based medical education initiative in family medicine. Acad Med. 2018;93(12):1850-1857. doi:10.1097/ACM.0000000000002387
  27. Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-214. doi:10.3109/0142159X.2011.551559
  28. Humphrey-Murto S, Wood TJ, Ross S, et al. Assessment pearls for competency-based medical education. J Grad Med Educ. 2017;9(6):688-691. doi:10.4300/JGME-D-17-00365.1
  29. Curran VR, Deacon D, Schulz H, et al. Evaluation of the characteristics of a workplace assessment form to assess entrustable professional activities (EPAs) in an undergraduate surgery core clerkship. J Surg Educ. 2018;75(5):1211-1222. doi:10.1016/j.jsurg.2018.02.013
  30. Gruppen LD, Ten Cate O, Lingard LA, Teunissen PW, Kogan JR. Enhanced requirements for assessment in a competency-based, time-variable medical education system. Acad Med. 2018;93(3S Competency-Based, Time-Variable Education in the Health Professions):S17-S21. doi:10.1097/ACM.0000000000002066
  31. Lockyer J, Bursey F, Richardson D, Frank JR, Snell L, Campbell C; ICBME Collaborators. Competency-based medical education and continuing professional development: A conceptualization for change. Med Teach. 2017;39(6):617-622. doi:10.1080/0142159X.2017.1315064
  32. Nousiainen MT, Caverzagie KJ, Ferguson PC, Frank JR; ICBME Collaborators. Implementing competency-based medical education: what changes in curricular structure and processes are needed? Med Teach. 2017;39(6):594-598. doi:10.1080/0142159X.2017.1315077
  33. Kitto S, Danilovich N, Delva D, et al. Uncharted territory: knowledge translation of competency-based continuing professional development in family medicine. Can Fam Physician. 2018;64(4):250-253.
  34. Dixon-Woods M, Agarwal S, Jones D, Young B, Sutton A. Synthesising qualitative and quantitative evidence: a review of possible methods. J Health Serv Res Policy. 2005;10(1):45-53. doi:10.1177/135581960501000110
  35. Green BN, Johnson CD, Adams A. Writing narrative literature reviews for peer-reviewed journals: secrets of the trade. J Chiropr Med. 2006;5(3):101-117. doi:10.1016/S0899-3467(07)60142-6
  36. Shaw L, Campbell H, Jacobs K, Prodinger B. Twenty years of assessment in WORK: a narrative review. Work. 2010;35(3):257-267. doi:10.3233/WOR-2010-0989
  37. Ferrari R. Writing narrative style literature reviews. Med Writ. 2015;24(4):230-235. doi:10.1179/2047480615Z.000000000329
  38. Greenhalgh T, Raftery J, Hanney S, Glover M. Research impact: a narrative review. BMC Med. 2016;14(1):78. doi:10.1186/s12916-016-0620-8
  39. Greenhalgh T, Thorne S, Malterud K. Time to challenge the spurious hierarchy of systematic over narrative reviews? Eur J Clin Invest. 2018;48(6):e12931. doi:10.1111/eci.12931
  40. Baethge C, Goldbeck-Wood S, Mertens S. SANRA-a scale for the quality assessment of narrative review articles. Res Integr Peer Rev. 2019;4(1):5. doi:10.1186/s41073-019-0064-8
  41. Slavin RE. Best evidence synthesis: an intelligent alternative to meta-analysis. J Clin Epidemiol. 1995;48(1):9-18. doi:10.1016/0895-4356(94)00097-A
  42. Gasparyan AY, Ayvazyan L, Blackmore H, Kitas GD. Writing a narrative biomedical review: considerations for authors, peer reviewers, and editors. Rheumatol Int. 2011;31(11):1409-1417. doi:10.1007/s00296-011-1999-3
  43. Schaepe C, Bergjan M. Educational interventions in peritoneal dialysis: a narrative review of the literature. Int J Nurs Stud. 2015;52(4):882-898. doi:10.1016/j.ijnurstu.2014.12.009
  44. Hayes J, Ford T, Rafeeque H, Russell G. Clinical practice guidelines for diagnosis of autism spectrum disorder in adults and children in the UK: a narrative review. BMC Psychiatry. 2018;18(1):222. doi:10.1186/s12888-018-1800-1
  45. Popay J, Roberts H, Sowden A, et al. Guidance on the Conduct of Narrative Synthesis in Systematic Reviews: A Product from the ESRC Methods Programme. 2006. https://www.researchgate.net/profile/Lisa_Arai/publication/242311393_Guidance_on_the_conduct_of_narrative_synthesis_in_systematic_reviews_a_comparison_of_guidance-led_narrative_synthesis_versus_meta-analysis/links/5532159f0cf2f2a588ad67fd.pdf. Accessed June 20, 2019.
  46. Polgar S, Thomas SA. Introduction to research in the health sciences. 3rd ed. Melbourne: Churchill Livingstone; 1995:343-355.
  47. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097. doi:10.1371/journal.pmed.1000097
  48. US National Library of Medicine. Publication characteristics (publication types) with scope notes: 2016 MeSH edition. Bethesda, MD: National Library of Medicine; 2016.
  49. Barr H, Freeth D, Hammick M, Koppel I, Reeves S. Evaluations of interprofessional education: a United Kingdom review of health and social care. Fareham, UK: Centre for the Advancement of Interprofessional Education; 2000. https://www.caipe.org/resources/publications/barr-h-freethd-hammick-m-koppel-i-reeves-s-2000-evaluations-of-interprofessional-education. Accessed May 17, 2019.
  50. Moore DE Jr, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1-15. doi:10.1002/chp.20001
  51. Watson A. Strategies for the assessment of competence. The Vocational Aspect of Education. 1994;46(2):155-165. doi:10.1080/0305787940460205
  52. Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ. 2010;341:c5064. doi:10.1136/bmj.c5064
  53. Gilligan C, James EL, Snow P, et al. Interventions for improving medical students’ interpersonal communication in medical consultations (Protocol). Cochrane Database Syst Rev. 2016;11:CD012418. doi:10.1002/14651858.CD012418
  54. Tricco AC, Ashoor HM, Cardoso R, et al. Sustainability of knowledge translation interventions in healthcare decision-making: a scoping review. Implement Sci. 2016;11(1):55. doi:10.1186/s13012-016-0421-7
  55. Page C, Reid A, Coe CL, et al. Piloting the Mobile Medical Milestones Application (M3App©): A Multi-Institution Evaluation. Fam Med. 2017;49(1):35-41.
  56. Accreditation Council for Graduate Medical Education. Outcome Project: General competencies. 1999. http://www.acgme.org/outcome/comp/compFull.asp. Accessed September 20, 2018.
  57. Accreditation Council for Graduate Medical Education, American Board of Medical Specialties. Toolbox of assessment methods. 2000. http://njms.rutgers.edu/culweb/medical/documents/ToolboxofAssessmentMethod.pdf. Accessed September 20, 2018.
  58. Frank JR, ed. The CanMEDS 2005 physician competency framework. Better standards. Better physicians. Better care. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2005. http://rcpsc.medical.org/canmeds/index.php. Accessed January 12, 2019.
  59. Tannenbaum D, Konkin J, Parsons I, et al. CanMEDS - Family Medicine. Mississauga, ON: College of Family Physicians of Canada; 2009. https://www.cfpc.ca/uploadedFiles/Education/CanMeds%20FM%20Eng.pdf. Accessed January 15, 2019.
  60. Laughlin T, Wetmore S, Allen T, et al. Defining competency-based evaluation objectives in family medicine: communication skills. Can Fam Physician. 2012;58(4):e217-e224.
  61. Wendling AL. Assessing resident competency in an outpatient setting. Fam Med. 2004;36(3):178-184.
  62. Chesser A, Reyes J, Woods NK, Williams K, Kraft R. Reliability in patient-centered observations of family physicians. Fam Med. 2013;45(6):428-432.
  63. Ross V, Mauksch L, Huntington J, Beard JM. Interdisciplinary direct observation: impact on precepting, residents, and faculty. Fam Med. 2012;44(5):318-324.
  64. Donoff MG. Field notes: assisting achievement and documenting competence. Can Fam Physician. 2009;55:1260-1262, e100.
  65. Reddy ST, Endo J, Gupta S, Tekian A, Park YS. A case for caution: chart-stimulated recall. J Grad Med Educ. 2015;7(4):531-535. doi:10.4300/JGME-D-15-00011.1
  66. Leung F-H, Herold J, Iglar K. Family medicine mandatory assessment of progress: results of a pilot administration of a family medicine competency-based in-training examination. Can Fam Physician. 2016;62(5):e263-e7.
  67. Curran VR, Butler R, Duke P, et al. Effectiveness of a simulated clinical examination in the assessment of the clinical competencies of entry-level trainees in a family medicine residency programme. Assess Eval High Educ. 2012;37(1):99-112. doi:10.1080/02602938.2010.515009
  68. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542-547. doi:10.1097/ACM.0b013e31805559c7
  69. Ross S, Poth C, Donoff M, Humphries P. Monitoring, adapting, and evaluating a competency-based assessment framework in medical education through participatory action research. AJER. 2009;55(4):549-552.
  70. Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME Guide No 12. Med Teach. 2009;31(4):299-318. doi:10.1080/01421590902883056
  71. Holmboe ES, Davis MH, Carraccio C. Portfolios. In: Holmboe ES, Hawkins RE, eds. Practical Guide to the Evaluation of Clinical Competence. Philadelphia: Mosby; 2008.
  72. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361-367. doi:10.1097/00001888-200205000-00003
  73. Price DW, Biernacki H, Nora LM. Can maintenance of certification work? Associations of MOC and improvements in physicians’ knowledge and practice. Acad Med. 2018;93(12):1872-1881. doi:10.1097/ACM.0000000000002338
  74. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9)(suppl):S63-S67. doi:10.1097/00001888-199009000-00045
  75. Kirkpatrick DL, Kirkpatrick JD. Transferring learning to behavior: Using the four levels to improve performance. San Francisco, CA: Berrett-Koehler Publishers; 2005.
  76. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217-226. doi:10.1097/MLR.0b013e3182408812
  77. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139-150. doi:10.1186/1748-5908-8-139
  78. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: a synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). 2005. http://ctndisseminationlibrary.org/PDF/nirnmonograph.pdf. Accessed April 3, 2019.
  79. Tremblay K, Lalancette D, Roseveare D. Assessment of higher education learning outcomes. Feasibility study report. Volume 1 – Design and implementation. Paris: Organisation for Economic Co-operation and Development; 2012. http://www.oecd.org/education/skills-beyond-school/AHELOFSReportVolume1.pdf. Accessed April 3, 2019.
  80. Fixsen DL, Naoom SF, Blase KA, Wallace F. Implementation: the missing link between research and practice. APSAC Adv. 2007;19(1 & 2):4-11.
  81. Blase KA, Fixsen DL, Duda MA, Metz AJ, Naoom SF, Van Dyke MK. Implementation Challenges and Successes: Some Big Ideas. Presentation at Blueprints for Violence Prevention Conference. San Antonio, TX; 2010. https://www.blueprintsconference.com/2010/presentations/t1a_kb.pdf. Accessed April 5, 2019.
  82. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65-76. doi:10.1007/s10488-010-0319-7
  83. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77-101. doi:10.1191/1478088706qp063oa
  84. Jones K. Mission drift in qualitative research, or moving toward a systematic review of qualitative studies, moving back to a more systematic narrative review. Qual Rep. 2004;9(1):95-112.
  85. Freeman M. History, narrative, and life-span developmental knowledge. Hum Development. 1984;27(1):1-19. doi:10.1159/000272899
  86. Nowell LS, Norris JM, White DE, Moules NJ. Thematic analysis: striving to meet the trustworthiness criteria. Int J Qual Methods. 2017;16(1):1-13. doi:10.1177/1609406917733847
  87. Oandasan I, Saucier D, eds. Triple C Competency-based Curriculum Report – Part 2: Advancing Implementation. Mississauga, ON: College of Family Physicians of Canada; 2013. https://www.cfpc.ca/uploadedFiles/Education/_PDFs/TripleC_Report_pt2.pdf. Accessed January 25, 2019.
  88. Torbeck L, Wrightson AS. A method for defining competency-based promotion criteria for family medicine residents. Acad Med. 2005;80(9):832-839. doi:10.1097/00001888-200509000-00010
  89. Schipper S, Ross S. Structured teaching and assessment: a new chart-stimulated recall worksheet for family medicine residents. Can Fam Physician. 2010;56(9):958-959, e352-e354.
  90. Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program. Acad Med. 2015;90(7):888-897. doi:10.1097/ACM.0000000000000671
  91. Laughlin T, Brennan A, Brailovsky C. Effect of field notes on confidence and perceived competence: survey of faculty and residents. Can Fam Physician. 2012;58(6):e352-e356.
  92. Baglia J, Foster E, Dostal J, Keister D, Biery N, Larson D. Generating developmentally appropriate competency assessment at a family medicine residency. Fam Med. 2011;43(2):90-98.
  93. Kligler B, Koithan M, Maizes V, et al. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study. BMC Med Educ. 2007;7(1):7. doi:10.1186/1472-6920-7-7
  94. Singh R, Naughton B, Taylor JS, et al. A comprehensive collaborative patient safety residency curriculum to address the ACGME core competencies. Med Educ. 2005;39(12):1195-1204. doi:10.1111/j.1365-2929.2005.02333.x
  95. Kolva DE, Barzee KA, Morley CP. Practice management residency curricula: a systematic literature review. Fam Med. 2009;41(6):411-419.
  96. Lacasse M, Douville F, Desrosiers É, Côté L, Turcotte S, Légaré F. Using field notes to evaluate competencies in family medicine training: a study of predictors of intention. Can Med Educ J. 2013;4(1):e16-e25. doi:10.36834/cmej.36600
  97. Shaughnessy AF, Sparks J, Cohen-Osher M, Goodell KH, Sawin GL, Gravel J Jr. Entrustable professional activities in family medicine. J Grad Med Educ. 2013;5(1):112-118. doi:10.4300/JGME-D-12-00034.1
  98. Wetmore S, Laughlin T, Lawrence K, et al. Defining competency-based evaluation objectives in family medicine: procedure skills. Can Fam Physician. 2012;58(7):775-780.
  99. Page C, Reid A, Brown MM, Baker HM, Coe C, Myerholtz L. Content analysis of family medicine resident peer observations. Fam Med. 2020;52(1):43-47. doi:10.22454/FamMed.2020.855292
  100. Holmboe ES, Yamazaki K, Nasca TJ, Hamstra SJ. Using longitudinal milestones data and learning analytics to facilitate the professional development of residents: early lessons from three specialties. Acad Med. 2020;95(1):97-103. doi:10.1097/ACM.0000000000002899
  101. Ross S, Binczyk NM, Hamza DM, et al. Association of a competency-based assessment system with identification of and support for medical residents in difficulty. JAMA Netw Open. 2018;1(7):e184581. doi:10.1001/jamanetworkopen.2018.4581
  102. Mainous AG III, Fang B, Peterson LE. Competency assessment in family medicine residency: Observations, knowledge-based examinations, and advancement. J Grad Med Educ. 2017;9(6):730-734. doi:10.4300/JGME-D-17-00212.1
  103. Waller E, Eiff MP, Dexter E, et al. Impact of residency training redesign on residents’ clinical knowledge. Fam Med. 2017;49(9):693-698.
  104. Loeppky C, Babenko O, Ross S. Examining gender bias in the feedback shared with family medicine residents. Educ Prim Care. 2017;28(6):319-324. doi:10.1080/14739879.2017.1362665
  105. Barlow PB, Thoma KDC, Ferguson KJ. The impact of using mean versus mode when assessing resident competency. J Grad Med Educ. 2017;9(3):302-309. doi:10.4300/JGME-D-16-00571.1
  106. Lerner BS, Kalish V, Ledford CJW. Exploring residents’ skills in diagnosing dementia: the unexpected dissonance between ability and confidence. Fam Med. 2017;49(6):460-463.
  107. Magee SR, Eidson-Ton WS, Leeman L, et al. Family medicine maternity care call to action: moving toward national standards for training and competency assessment. Fam Med. 2017;49(3):211-217.
  108. Puffer JC, O’Neill TR, Stelter K. Looking for trouble. Fam Med. 2016;48(9):743-744.
  109. Jarrett JB, Antoun J, Hasnain M. Entrustable professional activity utilization: A CERA study of family medicine residency program directors. Fam Med. 2019;51(6):471-476. doi:10.22454/FamMed.2019.876961
  110. Ross S, Poth C, Donoff M, et al. Involving users in the refinement of the competency-based achievement system (CBAS), an innovative approach to competency-based assessment. Med Teach. 2012;34(2):e413-e7. doi:10.3109/0142159X.2012.644828
  111. Saultz J. Looking for trouble. Fam Med. 2016;48(6):425-426.
  112. Glasgow NJ, Wells R, Butler J, Gear A. The effectiveness of competency-based education in equipping primary health care workers to manage chronic disease in Australian general practice settings. Med J Aust. 2008;188(S8)(suppl):S92-S96. doi:10.5694/j.1326-5377.2008.tb01755.x
  113. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002-1009. doi:10.1097/ACM.0000000000002743
  114. Fernandez N, Dory V, Ste-Marie LG, Chaput M, Charlin B, Boucher A. Varying conceptions of competence: an analysis of how health sciences educators define competence. Med Educ. 2012;46(4):357-365. doi:10.1111/j.1365-2923.2011.04183.x
  115. Lurie SJ. History and practice of competency-based assessment. Med Educ. 2012;46(1):49-57. doi:10.1111/j.1365-2923.2011.04142.x
  116. Lochnan H, Kitto S, Danilovich N, et al. Conceptualization of competency-based medical education terminology in family medicine postgraduate medical education and continuing professional development: a scoping review. Acad Med. 2020;95(7):1106-1119. doi:10.1097/ACM.0000000000003178
  117. Wallace S, May SA. Assessing and enhancing quality through outcomes-based continuing professional development (CPD): a review of current practice. Vet Rec. 2016;179(20):515-520. doi:10.1136/vr.103862
  118. Pelgrim EAM, Kramer AWM, Mokkink HGA, Van der Vleuten CPM. Quality of written narrative feedback and reflection in a modified mini-clinical evaluation exercise: an observational study. BMC Med Educ. 2012;12(1):97. doi:10.1186/1472-6920-12-97
  119. Lucey CR, Thibault GE, Ten Cate O. Competency-based, time-variable education in the health professions: crossroads. Acad Med. 2018;93(3S Competency-Based, Time-Variable Education in the Health Professions):S1-S5. doi:10.1097/ACM.0000000000002080
  120. Eiff MP, Waller E, Dostal J, et al. Faculty development needs in residency re-design: A report from the Preparing Personal Physicians for Practice (P4). Fam Med. 2012;44:387-395.
  121. Carney PA, Eiff MP, Green LA, et al. Transforming primary care residency training: a collaborative faculty development initiative among family medicine, internal medicine, and pediatric residencies. Acad Med. 2015;90(8):1054-1060. doi:10.1097/ACM.0000000000000701
  122. Malik MU, Diaz Voss Varela DA, Stewart CM, et al. Barriers to implementing the ACGME Outcome Project: a systematic review of program director surveys. J Grad Med Educ. 2012;4(4):425-433. doi:10.4300/JGME-D-11-00222.1
  123. Bok HG, Teunissen PW, Spruijt A, et al. Clarifying students’ feedback-seeking behaviour in clinical clerkships. Med Educ. 2013;47(3):282-291. doi:10.1111/medu.12054
  124. Carraccio CL, Englander R. From Flexner to competencies: reflections on a decade and the journey ahead. Acad Med. 2013;88(8):1067-1073. doi:10.1097/ACM.0b013e318299396f
  125. Könings KD, Brand-Gruwel S, van Merrienboer JJG. Teachers’ perspectives on innovations: implications for educational design. Teach Teach Educ. 2007;23(6):985-997. doi:10.1016/j.tate.2006.06.004
  126. Bok HGJ, Teunissen PW, Favier RP, et al. Programmatic assessment of competency-based workplace learning: when theory meets practice. BMC Med Educ. 2013;13(1):123. doi:10.1186/1472-6920-13-123
  127. Könings KD, Seidel T, van Merrienboer JJG. Participatory design of learning environments: integrating perspectives of students, teachers and designers. Instr Sci. 2014;42(1):1-9. doi:10.1007/s11251-013-9305-2
  128. Cate OT, Carraccio C. Envisioning a true continuum of competency-based medical education, training, and practice. Acad Med. 2019;94(9):1283-1288. doi:10.1097/ACM.0000000000002687
  129. Whitehurst KE, Carraway M, Riddick A, Basnight LL, Garrison HG. Making the learning continuum a reality: the critical role of a graduate medical education–continuing medical education partnership. J Contin Educ Health Prof. 2019;39(4):279-284. doi:10.1097/CEH.0000000000000271
  130. Commission on the Vision for the Future of Continuing Certification. Continuing Board Certification: Vision for the Future Commission. Final Report. 2019. https://visioninitiative.org/wpcontent/uploads/2019/02/Commission_Final_Report_20190212.pdf. Accessed October 14, 2019.
  131. The Future of Medical Education in Canada. Supporting learning and continuous practice improvement for physicians in Canada: A new way forward. 2019. https://www.fmec-cpd.ca/wp-content/uploads/2019/08/FMEC-CPD_Synthesized_EN_WEB.pdf. Accessed October 14, 2019.
  132. Sargeant J, Armson H, Chesluk B, et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 2010;85(7):1212-1220. doi:10.1097/ACM.0b013e3181d85a4e
  133. Mann K, van der Vleuten C, Eva K, et al. Tensions in informed self-assessment: how the desire for feedback and reticence to collect and use it can conflict. Acad Med. 2011;86(9):1120-1127. doi:10.1097/ACM.0b013e318226abdd
  134. Sargeant J, Lockyer J, Mann K, et al. Facilitated reflective performance feedback: developing an evidence- and theory-based model that builds relationship, explores reactions and content, and coaches for performance change (R2C2). Acad Med. 2015;90(12):1698-1706. doi:10.1097/ACM.0000000000000809
  135. Sargeant J. Future research in feedback: how to use feedback and coaching conversations in a way that supports development of the individual as a self-directed learner and resilient professional. Acad Med. 2019;94(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 58th Annual Research in Medical Education Sessions):S9-S10. doi:10.1097/ACM.0000000000002911
  136. Price D. Maintenance of Certification, Continuing Professional Development, and Performance Improvement. In: Rayburn W, Turco M, Davis DA, eds. Continuing Professional Development in Medicine and Health Care: Better Education, Better Patient Outcomes. Philadelphia, PA: Wolters Kluwer; 2017.
  137. O’Neill TR, Peabody MR, Puffer JC. Differences in Canadian and US medical student preparation for family medicine. Fam Med. 2016;48(10):770-774.

Lead Author

Natalia Danilovich, PhD, MD, MSc

Affiliations: Office of Continuing Professional Development and Department of Innovation in Medical Education, University of Ottawa, Ottawa, Ontario, Canada


Simon Kitto, PhD - Office of Continuing Professional Development and Department of Innovation in Medical Education, University of Ottawa, Ottawa, Ontario, Canada

David W. Price, MD - University of Colorado Anschutz School of Medicine, Aurora, CO

Craig Campbell, MD - Office of Specialty Education at the Royal College of Physicians and Surgeons of Canada, University of Ottawa, Ottawa, ON, Canada

Amanda Hodgson, MLIS - University of Ottawa, Ottawa, Ontario, Canada

Paul Hendry, MD, MSc - Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada

Corresponding Author

Paul Hendry, MD, MSc

Correspondence: 725 Parkdale Avenue, Loeb Building, Room WM 158f, Ottawa, ON, K1Y 4E9, Canada. 613-798-5555, ext. 17628.

Email: hendry@ottawaheart.ca
