
Redesigning Residency Training: Summary Findings From the Preparing the Personal Physician for Practice (P4) Project

Patricia A. Carney, PhD | M. Patrice Eiff, MD | Elaine Waller | Samuel M. Jones, MD | Larry A. Green, MD

Fam Med. 2018;50(7):503-517.

DOI: 10.22454/FamMed.2018.829131


Abstract

Background and Objectives: The Preparing the Personal Physician for Practice (P4) project (2007 to 2014) was a comparative case study of experiments conducted by 14 selected family medicine residency programs to evaluate new models of residency education aligned with the patient-centered medical home (PCMH). Changes in the length, structure, content, and location of training were studied.

Methods: We conducted both a critical review of published P4 Evaluation Center and site-specific papers and a qualitative narrative analysis of progress reports compiled throughout the project. We mapped key findings from P4 to results obtained from a survey of program directors on their top 10 “need to know” areas in family medicine education.

Results: Collectively, 830 unique residents took part in P4, which explored 80 hypotheses regarding 44 innovations. To date, 39 papers have resulted from P4 work, with the P4 Evaluation Center producing 17 manuscripts and faculty at individual sites producing 22 manuscripts. P4 investigators delivered 21 presentations, and faculty from participating P4 programs delivered 133 presentations at national meetings. For brevity, we present findings from these analyses according to the following categories: (1) how residency training aligned with the PCMH; (2) educational redesign and assessment; (3) methods of financing new residency experiences; (4) length of training; (5) scope of practice; and (6) setting standards for conducting multisite educational research.

Conclusions: The P4 project was a successful model for multisite graduate medical education research. Insights gained from the P4 project could help family medicine educators with future residency program redesign.


In 2004, the Future of Family Medicine Report called for changes in family medicine training and practice aimed at improving the health of the American public.1 The Preparing the Personal Physician for Practice (P4) initiative was subsequently undertaken to catalyze innovation in family medicine residency training.2 Green et al published an overview article about the project in 2007 that identified 10 “need to know” categories derived from a survey of members of the Association of Family Medicine Residency Directors (AFMRD).2 These included: (1) how the residency experience can better align with the Patient-Centered Medical Home (PCMH); (2) how residents can learn to work effectively in teams; (3) how residents can learn to use technology to measure and improve health care quality and patient outcomes; (4) what evidence can support particular experiences determined to be effective in producing skilled personal physicians; (5) what teaching methods appear most effective; (6) what educational outcome measures are meaningful; (7) how best to assess and ensure competency; (8) how to incorporate evidence-based medicine (EBM) into daily clinical practice; (9) how to finance new residency experiences; and (10) how to help graduates adapt to emerging health care system changes. In addition, as part of the P4 implementation and evaluation processes, other areas of study emerged, especially related to the challenges of implementation, which could help residency faculty with their redesign efforts.

The purpose of this study is to outline key findings from P4, derived both from published papers that address the program directors’ 10 “need to know” categories and from the evaluation team’s longitudinal observations. These findings could guide other residency educators who are leading change in graduate medical education.

Methods

Background

The 14 P4 programs (Table 1) were selected from 84 initial applications submitted in response to a call for proposals sent to all family medicine residencies; the internal review committee invited 44 full applications, which then underwent blinded peer review. Selection criteria included the novelty of proposed innovations, the ability to transform training and practices in alignment with PCMH features, evaluation capacity, and the sustainability and financial viability of proposed innovations. The programs chosen were diverse in size, location, setting (urban and rural), and affiliation (community-based and university-based). The P4 programs formed a comparative case study of experiments involving innovations in residency education that aligned with the PCMH, including changes in the length, structure, content, and location of training and expanded measurements of competency.

The P4 project, which ran from 2007 to 2014, was overseen by a steering committee of educators and practicing family physicians. Five in-person learning collaboratives occurred during the project, and site visits that included tailored faculty development sessions were conducted at all sites at the beginning of the project. All participating programs were required to undertake core data collection activities, which included annual surveys completed by residents, program directors, medical directors, and/or clinic staff at continuity clinics. In addition, residency graduates were surveyed 18 months after graduation. All programs and the Evaluation Center, located at Oregon Health and Science University (OHSU), underwent IRB review and were granted exemptions or approvals based on specific evaluation activities.

Analytic Methods

We conducted a critical review3 of published P4 papers produced by the Evaluation Center and by participating P4 residency programs. We mapped key findings from P4 to results obtained from a survey of program directors on what they believed were the top 10 “need to know” areas in family medicine education.2

To date, 39 papers have resulted from P4. Investigators at the P4 Evaluation Center produced 17 publications,4-20 and faculty at P4 sites produced 22.21-42 The Evaluation Center investigators delivered 21 presentations, and faculty from participating programs delivered 133 presentations at national meetings, though we did not include these in our analyses. We also excluded commentaries and letters to the editor from analyses and selected only papers that provided specific measurable evaluations that were either process- or outcomes-based. With these exclusions, we present key findings from 15 (88%) of the core evaluation papers and 20 (91%) of the site-specific papers in this critical review. A detailed summary of findings from these publications according to all 10 “need to know” areas is provided in Table 2 for Evaluation Center papers, and in Table 3 for program-specific papers.

To supplement what was published, we conducted a qualitative narrative analysis43 of progress reports compiled throughout the project, which involved two members of the evaluation team (MPE and EW) independently reviewing reports for emergent themes using narrative analysis techniques.44 These reports provided information from site visits and subsequent program communication, such as telephone conferences and email exchanges, which allowed us to capture the ongoing progress of each innovation’s implementation and evaluation beyond what was reported in published papers. The documents were organized using a standard report template to ensure consistency and were updated regularly to record process status and challenges. Consensus meetings were used to identify emergent themes and exemplars derived from the progress reports.

For brevity and to reduce overlap, we present key P4 findings according to six categories that the critical review and the narrative analyses could best address: (1) how residency training aligned with PCMH, (2) educational redesign and assessment, (3) methods of financing new residency experiences, (4) length of training, (5) scope of practice, and (6) setting standards for conducting multisite educational research.

Results

How Residency Training Aligned With the PCMH

Key Points:

  • Simultaneously improving both continuity clinic and resident education operations to incorporate PCMH elements is disruptive to the traditional family medicine clinic model and takes time.
  • Team-based care plays a pivotal role in family medicine residency training.

The P4 programs used two approaches to redesign residency training toward the PCMH: (1) focusing on changing the practice and then immersing residents in the new environment, or (2) designing specific educational methods and experiences that aligned with the new environment (eg, training residents to lead group visits). Both approaches had successes and challenges.

Some PCMH features, such as the electronic health record (EHR), secure remote access, chronic disease registries, and open-access scheduling, were already in place in 2007.5 Areas with low implementation included email communication with patients, population-based quality improvement (QI), preventive service registries, and practice-based research using the EHR. Few differences in PCMH features were noted between community-based and university-based programs.5 By 2011, P4 programs had been largely successful in implementing EHR-based PCMH features in their continuity clinics.17 Notable increases occurred in email communication with patients (33% to 67%), preventive services registries (63% to 82%), and population-based QI (46% to 76%). Team-based care was the only process-of-care feature associated with a significant upward trend, with a near doubling of this feature over the 5 years of P4 data collection.

Individual P4 programs provided more in-depth information on various aspects of PCMH implementation. A high-tech, high-touch, low-overhead approach to patient care was a promising model implemented in one rural community.27 One program conducted focus groups to examine the impact of PCMH redesign on residents, faculty, and clinic staff.26 Factors associated with success included involving staff in developing solutions, enhancing responsibility, and increasing team cohesion over time. Residents involved in the redesign effort felt that it enhanced their practice experience. Challenges included obtaining sufficient buy-in, ineffective communication from practice leadership, and insufficient staff training. In another P4 program, duty hour restrictions led to decreases in visit numbers, hours spent in clinic, and the number of visits per hour for residents in all years of training.40 This 2011 ACGME regulatory change negated the substantial gains the program had made in the first 3 years of P4 toward increasing visit numbers and clinic hours for interns.

Innovations around team-based care were a focus for several P4 programs.4 Teamwork and leading teams were identified as core skills for personal physicians in a PCMH.16 Increased attention to team-based care in one P4 program led to a better understanding of the faculty’s role in modeling appropriate behaviors and coaching residents about the importance of communication in team huddles.25 Ensuring resident attendance during huddles and expecting residents to be part of the team were crucial steps. Another program studied drop-in group medical visits provided by an interdisciplinary team and found that emergency department visits dropped by more than half, with a corresponding decline in hospital charges for complex patients who had been high utilizers.35 Finally, an in-depth study of team-based care found that creating a supportive, safe learning environment for this type of training involves using a different model of professional socialization and tools for building culture.24 Overall, we found that more than 82% of P4 graduates reported being adequately trained in team-based care.18 Graduates were more than five times more likely to join practices with team-based care in place if well trained in this aspect of clinical care.18
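
A finding such as “more than five times more likely” is most naturally read as an odds or risk ratio. As a minimal illustration of that arithmetic only, the sketch below computes an odds ratio from a hypothetical 2x2 table; the counts are invented, not P4 data, and the published analysis18 may have used a different statistic (eg, an adjusted odds ratio).

```python
# Hypothetical illustration only: the counts below are invented and are not
# data from the P4 graduate survey (reference 18).

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a/b = well trained: joined / did not join a team-based practice;
    c/d = not well trained: joined / did not join a team-based practice."""
    return (a / b) / (c / d)

# Invented counts: 170 of 200 well-trained graduates joined practices with
# team-based care in place, versus 30 of 60 graduates reporting inadequate training.
print(f"Odds ratio: {odds_ratio(170, 30, 30, 30):.1f}")  # prints "Odds ratio: 5.7"
```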

We found mixed results in the study of resident ratings of the importance of PCMH features based on exposure during training,17 and resident satisfaction with training was also variably associated with exposure to PCMH features.15 Residents in programs actively changing their practice may have struggled with the disruptive forces of innovation, but they still understood and embraced their roles as personal physicians. In reflections submitted through an online diary system, the majority of resident respondents reported finding meaning in the humanistic and interpersonal aspects of medicine and noted that being a personal physician in a PCMH meant being the “go-to person for patients’ healthcare needs.”16 Interestingly, information technology and registries that facilitate care were not identified as features of personal doctoring.

Educational Redesign and Assessment

Key Points:

  • Leadership skills, faculty and staff engagement, and a “learning together” approach were all needed in practices that successfully transformed.
  • Innovation attracts more US students to a residency.
  • Innovation and regulation can coexist.
  • Developing versatile and valid competency assessment measures is time consuming and complex.

P4 helped create a cadre of faculty who catalyzed change to improve training for the future primary care workforce. In our qualitative analysis of progress reports, we identified enablers of and barriers to implementing innovations; these are summarized in Table 4. Our assessment of faculty development needs for implementing the PCMH indicated that early in P4, faculty needed skills in using and teaching how to use EHR features, as well as in change management, curriculum design, evaluation, individualized learning plans, career coaching, competency-based assessment, and leadership.10 As the project progressed, a “learning together” approach with residents in transformed practices emerged. Given the pace of change and the evolving nature of the PCMH, faculty partnering with residents to gain new skills was warranted. Additionally, leadership actions important for improving the clinical learning environment while simultaneously transforming resident education included: (1) managing change, (2) developing financial acumen, (3) adapting best-evidence educational strategies to the local environment, (4) creating and sustaining a vision that engages stakeholders, and (5) demonstrating courage and resilience.14

Early in P4, concerns were raised that disruptive changes might harm student interest and Match performance. In fact, we found the opposite to be true in an analysis of the effect of curricular innovations on residency applications and Match performance.7 The mean number of US MD senior applicants per program increased from 53 before P4 to 81 after P4 implementation, and the mean percent of positions filled in the Match increased from 73% before P4 to 87% after P4. Programs that implemented individualized training significantly improved the percent of positions filled in the Match compared to those that did not (90% vs 83%). An additional concern was the impact of residency training redesign on residents’ clinical knowledge. An analysis by the Evaluation Center revealed that the in-training examination (ITE) scores of P4 residents were higher compared to national scores in each year, and there was no harm to resident clinical knowledge as a result of curricular changes.19

We also examined the extent to which attempting to innovate within the Accreditation Council for Graduate Medical Education (ACGME) regulatory environment influenced the accreditation standing of P4 programs.12 The P4 programs navigated the accreditation process and were able to innovate within the rules without putting their programs at risk. The mean accreditation cycle length for all P4 programs was 4.0 years before P4 (2007) and did not change; the average number of citations per program was 6.2 before P4 and 6.8 during P4, with both averages similar to national norms.

While implementing curricular innovations, the individual P4 programs also took on the task of studying new measures and methods of accomplishing competency assessment. The site-specific work of one program resulted in (1) a detailed description of a comprehensive, developmentally appropriate competency assessment system31 which translated data from checklists of observed behaviors into a “radar graph” useful for both formative and summative assessments37; (2) an innovative method of integrating advising and assessment functions to include the voices of resident learners36; and (3) new insights about the importance of prompting learners to triangulate feedback from multiple sources when performing self-assessment.41 Another program conducted a Delphi process with experts to develop an extensive list of entrustable professional activities (EPAs) for ambulatory practice that provides a roadmap for other programs embarking on competency-based assessment.39
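
As a rough sketch of how checklist observations can be rolled up for a radar-style display of competency domains, the code below averages ratings per domain. The domains, rating scale, and values are hypothetical and do not reproduce the instruments described in the cited papers.31,37

```python
# Hypothetical sketch: summarizing checklist observations into per-domain
# averages that could set spoke lengths on a radar graph. The domains and
# the 1-5 rating scale are invented for illustration, not taken from P4.
from collections import defaultdict
from statistics import mean

# Each observation: (competency domain, rating on an invented 1-5 scale).
observations = [
    ("Patient care", 4), ("Patient care", 3), ("Patient care", 5),
    ("Medical knowledge", 4), ("Medical knowledge", 4),
    ("Communication", 3), ("Communication", 5),
    ("Systems-based practice", 2), ("Systems-based practice", 4),
]

ratings_by_domain = defaultdict(list)
for domain, rating in observations:
    ratings_by_domain[domain].append(rating)

# One spoke per domain; the mean rating would determine the spoke length.
for domain, ratings in ratings_by_domain.items():
    print(f"{domain}: mean {mean(ratings):.1f} (n={len(ratings)})")
```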

Methods of Financing New Residency Experiences

Key Point:

  • Programs committed to making change can accomplish it, regardless of their size or resources.

P4 residency programs received funding only to participate in the collaborative meetings; they did not receive funding from P4 for their innovations. The lack of programmatic funding was both a strength and a weakness. It assured careful decision-making and promoted sustainable strategies under real-world conditions. However, we learned that a clinical operation and a residency cannot be both run and innovated without some financial help. The P4 programs were successful in their pursuit of additional funds from a variety of sources. Collectively, the six university-based programs received $5,240,516 over the study period, compared with $4,718,943 received by the eight community-based programs.13 Most funding came from grants (58% and 87%, respectively). Training redesign was estimated to add 3% to budgets for university-based programs and about 2% to budgets for community-based programs.13

Length of Training

Key Point:

  • The optimal length of training in family medicine is still unknown and more study is needed.

P4’s evaluation design was a comparative case series, which was not rigorous enough to fully study the impact of length of training on study variables. However, several papers provide new insights about length of training. A report of early outcomes in three P4 programs that assessed the impact of the longer training required for learner-directed diversification found that residents in these programs chose a variety of areas of concentration, and 40% chose to extend their residency training to 4 years.32 An evaluation of one program that integrated the fourth year of medical school with the first year of residency found that the integrated residents performed significantly better than traditional residents on the ITE during each year of residency training, though no differences were found in patient continuity or panel size.22 Additional reports describe curriculum implementation and early outcomes of various 4-year training models, including optional advanced training leading to a degree (eg, MPH),33 the nation’s first comprehensive required 4-year residency,34 and one program with innovative flexible longitudinal tracks.42 Overall, the ITE scores of residents in programs that experimented with length of training were similar to those in programs that did not.19 Graduates exposed to lengthened training, compared with standard training length, were more likely to include adult hospital care (58% vs 39%), adult ICU care (31% vs 19%), and newborn resuscitation (26% vs 14%) in their practice, and they performed 19 of 30 procedures at higher rates.20 More study of the optimal length of training in family medicine is needed and is currently underway.45

Scope of Practice

Key Point:

  • Residencies redesigning for our future primary care workforce are still training comprehensive family physicians.

The effect of curricular innovations on the scope of practice of P4 program graduates was an important outcome measurement in the project. Compared to national data, P4 graduates reported higher rates of vaginal deliveries (19% vs 9%), adult inpatient care (49% vs 34%), and nursing home care (25% vs 12%) in practice.20 However, this analysis also revealed that P4 innovations did not significantly change graduate practice scope between the pre- and post-P4 periods.20 Thus, the P4 programs represented a subset of residencies that have historically trained to a broader scope of practice.

Overall, graduates of programs with individualized training innovations reported no significant differences in scope compared to graduates without this innovation.20 One P4 program assessed practice scope of residents who undertook innovative flexible longitudinal tracks, and found that residents who completed a flexible maternal child health track (n=15) compared to all other P4 graduates (n=332) were more likely to deliver babies (87% vs 15%), perform C-sections as primary surgeon (80% vs 5%), care for hospitalized adults (87% vs 44%), and care for hospitalized children (87% vs 34%).42

Setting Standards for Conducting Multisite Educational Research

Key Points:

  • Studying educational effectiveness requires rigorous measures and data collection coupled with educators and researchers working shoulder-to-shoulder to get it right.
  • Collaboratives provide support to foster innovation and an evidence-based approach to educational redesign.
  • Evaluation expertise is lacking in most residencies.

Establishing meaningful measures at the individual resident, program, clinic, and graduate levels is crucial and requires rigorous instrument testing and data collection standards that are both feasible and high.9 Program, continuity clinic, resident, applicant, and Match surveys were collected annually with near 100% data capture, resulting in a comprehensive relational database of 830 unique residents. The graduate survey response rate was more than 70% overall across all years (2006 through 2012).20 An innovative web-based data viewing portal allowed programs to view their data compared with aggregate data from all 14 programs, with annual trend graphs for all measures. The Evaluation Center and the programs partnered in meaningful ways to accomplish this level of data sophistication.
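
As an illustration of the kind of program-versus-aggregate comparison such a portal can support, the sketch below computes one program’s annual values for a single measure alongside the all-program mean. The data structure, measure, and values are hypothetical; the actual P4 database schema and portal are not described in this paper.

```python
# Hypothetical sketch of a program-vs-aggregate annual trend comparison,
# similar in spirit to the P4 data viewing portal. The rows and the measure
# are invented; the real P4 data model is not described in this paper.
from collections import defaultdict
from statistics import mean

# (program_id, year, value) rows for one measure, eg, percent of residents
# reporting adequate team-based care training. Values are invented.
rows = [
    ("A", 2008, 40), ("A", 2009, 55), ("A", 2010, 70),
    ("B", 2008, 35), ("B", 2009, 50), ("B", 2010, 65),
    ("C", 2008, 30), ("C", 2009, 45), ("C", 2010, 80),
]

values_by_year = defaultdict(list)
for program, year, value in rows:
    values_by_year[year].append(value)

program_of_interest = "A"
for year in sorted(values_by_year):
    own = next(v for p, y, v in rows if p == program_of_interest and y == year)
    print(f"{year}: program {program_of_interest} = {own}, "
          f"all-program mean = {mean(values_by_year[year]):.1f}")
```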

In addition to the development and testing of survey measures used by the P4’s Evaluation Center, we also tested and validated a new instrument to measure attributes of family medicine identity in residencies.11 This new measure, sensitive enough to detect developmental changes between residents and experienced family medicine faculty, can be used by researchers to study how family medicine identity might develop differently based on various training models or curricular innovations.

All 14 P4 programs participated throughout the 7-year project and met expectations for reporting and collaboration. They proved to be an imaginative and courageous group, as well as resilient and persistent innovators. The P4 programs formed a collaborative that matured throughout the project and came to be a force in and of itself. Being a part of this learning community provided additional motivation and structure for getting things done.46 Programs supported one another, shared ideas and expertise, collaborated on similar curricular elements, and published and presented together.

Rigorous evaluation in a multiyear, multisite study like P4 is resource intensive, and programs had little bandwidth and lacked the skills to accomplish this on their own. Despite the lack of educational research experience at the start, with guidance from the P4 evaluation team, all programs were able to develop and refine hypotheses for their innovations and map them to relevant measures.4 In total, 38 of the 44 (86%) innovations undertaken by the 14 programs were partially or fully implemented,46 and programs accomplished partial or full data collection for 63 of the 80 (79%) hypotheses they proposed. Many site-specific reports demonstrate that the programs used sophisticated measurement and evaluation designs in their innovation experiments, helping to set high standards for collaborative research.

Discussion

P4 was successful in many ways. The participating programs all completed the project without being hindered by accreditation or funding issues. P4’s achievements refute claims that family medicine residencies lack the time or inclination to take the risks needed to innovate and improve; they do indeed have both. We also suspect that many of the 30 programs that submitted full applications but were not selected for P4 undertook similar innovations that were not measured or tracked by P4, some of which likely resulted in publications. Importantly, P4 has produced an extensive body of work with many findings showing improvements in training, results that might not have been possible without the extraordinary data capture we were able to achieve, a feat that is often challenging in educational research.

Now more than ever, training must adapt to emerging advanced practice models that are essential to an efficient and effective health care delivery system.47 The lessons learned in P4 about how best to align residency experiences with the PCMH, and about the importance of emphasizing team-based care, will help educators produce residents capable of adapting to, and even leading, the changes that will undoubtedly occur in the future. To advance such an ambitious agenda, every residency program should become part of a learning collaborative that works together to test and advance educational improvements. Such an approach would enhance residency faculty’s abilities to evaluate residents and innovations in training and to achieve goals in scholarly work. It would also contribute to the development of new tools, such as better competency-based medical education assessments.48 For example, the graduate survey developed for P4 contributed to the survey now used by the American Board of Family Medicine (ABFM)49 to assess all family medicine graduates every 3 years, yielding important national outcomes for residencies. Federal GME financing should include economic models to support the development of such “collaboratories,” enabled by the ABFM and the Association of Family Medicine Residency Directors.

Inadequate evaluation resources in most residencies and limited funding for educational research have led to curricular changes often being made with little, if any, evidence to support them. Forming a learning collaborative like the one that occurred in P4 can help drive the graduate medical education research agenda of the future. Such collaboratives engage programs around research questions and appropriate measures, prospectively study how key features change over time, and give programs ongoing access to their own data. The power of P4’s educational big data to answer questions about educational effectiveness should not be overlooked.

The innovations tested by P4 programs were expected to guide future revisions of family medicine residency requirements. We found evidence that sustaining core skills while flexibly customizing training to meet the needs of individual learners did not harm student interest, resident clinical knowledge, or scope of practice, and this work contributed to a better understanding of how regulation and innovation can coexist without conflict. The regulatory environment under the current ACGME rules now allows programs that demonstrate high-quality outcomes to gain flexibility to innovate.50

In summary, the P4 project represents a successful change effort for the discipline of family medicine and is an example of a useful model for multisite educational research. The insights gained from the project should help other educators embarking on a path to redesign all family medicine residencies for the future.

Acknowledgments

This work was supported by the Preparing the Personal Physician for Practice (P4) Project, which was jointly sponsored by the American Board of Family Medicine Foundation, the Association of Family Medicine Residency Directors, and the Family Medicine Research Program at Oregon Health and Science University, Portland, OR.

References

  1. Martin JC, Avant RF, Bowman MA, et al; Future of Family Medicine Project Leadership Committee. The Future of Family Medicine: a collaborative project of the family medicine community. Ann Fam Med. 2004;2(suppl 1):S3-S32. https://doi.org/10.1370/afm.130.
  2. Green LA, Jones SM, Fetter G Jr, Pugno PA. Preparing the personal physician for practice: changing family medicine residency training to enable new model practice. Acad Med. 2007;82(12):1220-1227. https://doi.org/10.1097/ACM.0b013e318159d070.
  3. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J. 2009;26(2):91-108. https://doi.org/10.1111/j.1471-1842.2009.00848.x.
  4. Carney PA, Eiff MP, Green LA, et al. Preparing the personal physician for practice (P4): site-specific innovations, hypotheses, and measures at baseline. Fam Med. 2011;43(7):464-471.
  5. Carney PA, Eiff MP, Saultz JW, et al. Aspects of the patient-centered medical home currently in place: initial findings from preparing the personal physician for practice. Fam Med. 2009;41(9):632-639.
  6. Eiff P, Garvin R, Fogarty CT, et al. A model for a standardized national family medicine graduate survey. Fam Med. 2009;41(5):337-341.
  7. Garvin RD, Eiff MP, Pugno P, et al. A P4 report: effect of curriculum innovation on residency applications and Match performance. Fam Med. 2011;43(7):472-479.
  8. Carney PA, Green LA. An emerging epidemic of innovation in family medicine residencies. Fam Med. 2011;43(7):461-463.
  9. Carney PA, Eiff MP, Saultz JW, et al. Assessing the impact of innovative training of family physicians for the patient-centered medical home. J Grad Med Educ. 2012;4(1):16-22. https://doi.org/10.4300/JGME-D-11-00035.1.
  10. Eiff MP, Waller E, Fogarty CT, et al. Faculty development needs in residency redesign for practice in patient-centered medical homes: a P4 report. Fam Med. 2012;44(6):387-395.
  11. Carney PA, Waller E, Eiff MP, et al. Measuring family physician identity: the development of a new instrument. Fam Med. 2013;45(10):708-718.
  12. Eiff MP, Garvin R, Green LA, et al. Innovating within the ACGME regulatory environment is not an oxymoron. Fam Med. 2014;46(4):282-287.
  13. Carney PA, Waller E, Green LA, et al. Financing residency training redesign. J Grad Med Educ. 2014;6(4):686-693. https://doi.org/10.4300/JGME-D-14-00002.1.
  14. Kozakowski SM, Eiff MP, Green LA, et al. Five key leadership actions needed to redesign family medicine residencies. J Grad Med Educ. 2015;7(2):187-191. https://doi.org/10.4300/JGME-D-14-00214.1.
  15. Carney PA, Waller E, Dexter E, et al. Association between patient- centered medical home features and satisfaction with family medicine residency training in the US. Fam Med. 2016;48(10):784-794.
  16. Carney PA, Jacob-Files E, Rosenkranz SJ, Cohen DJ. Perceptions of becoming personal physicians within a patient-centered medical home. J Health Edu Res Dev. 2016;4(3):179-184. https://doi.org/10.4172/2380-5439.1000179.
  17. Eiff MP, Green LA, Jones G, et al. Varied rates of implementation of patient centered medical home features and residents’ perceptions of their importance based on practice experience. Fam Med. 2017;49(3):183-192.
  18. Carney PA, Waller E, Dexter E, et al. Team training in family medicine residency programs and its impact on team-based practice post-graduation. Fam Med. 2017;49(5):346-352.
  19. Waller E, Eiff MP, Dexter E, et al. Impact of residency training redesign on residents’ clinical knowledge. Fam Med. 2017;49(9):693-698.
  20. Eiff MP, Hollander-Rodriguez J, Skariah J, et al. Scope of practice among recent family medicine residency graduates. Fam Med. 2017;49(8):607-617.
  21. Webb AR, Young RA, Casey DF, Baumer JG. Matching up with P4. Fam Med. 2008;40(10):692.
  22. Ringdahl E, Kruse RL, Lindbloom EJ, Zweig SC. The University of Missouri integrated residency: evaluating a 4-year curriculum. Fam Med. 2009;41(7):476-480.
  23. Crane S. The role of health information technology in creating networks of medical homes in rural North Carolina. N C Med J. 2009;70(3):256-259.
  24. Miller WL, Cohen-Katz J. Creating collaborative learning environments for transforming primary care practices now. Fam Syst Health. 2010;28(4):334-347. https://doi.org/10.1037/a0022001.
  25. Fogarty CT, Schultz S. Team huddles: the role of the primary care educator. Clin Teach. 2010;7(3):157-160. https://doi.org/10.1111/j.1743-498X.2010.00369.x.
  26. Jones GL, Lima E. The effects of residency practice redesign on providers and staff. Fam Med. 2011;43(7):522-525.
  27. Crane S. Redesigning the rural health center: high tech, high touch, and low overhead. N C Med J. 2011;72(3):212-215.
  28. Dysinger WS, King V, Foster TC, Geffken D. Incorporating population medicine into primary care residency training. Fam Med. 2011;43(7):480-486.
  29. Erlich DR, Shaughnessy AF. Sleeping at home: a new model for a hospital teaching service. J Grad Med Educ. 2011;3(2):243-245. https://doi.org/10.4300/JGME-D-10-00098.1.
  30. Lurie SJ, Schultz SH, Lamanna G. Assessing teamwork: a reliable five-question survey. Fam Med. 2011;43(10):731-734.
  31. Baglia J, Foster E, Dostal J, Keister D, Biery N, Larson D. Generating developmentally appropriate competency assessment at a family medicine residency. Fam Med. 2011;43(2):90-98.
  32. LoPresti L, Young R, Douglass A. Learner-directed intentional diversification: the experience of three P4 programs. Fam Med. 2011;43(2):114-116.
  33. Mazzone M, Krasovich S, Fay D, et al. Implementing radical curriculum change in a family medicine residency: the majors and masteries program. Fam Med. 2011;43(7):514-521.
  34. Douglass AB, Rosener SE, Stehney MA. Implementation and preliminary outcomes of the nation’s first comprehensive 4-year residency in family medicine. Fam Med. 2011;43(7):510-513.
  35. Crane S, Collins L, Hall J, Rochester D, Patch S. Reducing utilization by uninsured frequent users of the emergency department: combining case management and drop-in group medical appointments. J Am Board Fam Med. 2012;25(2):184-191. https://doi.org/10.3122/jabfm.2012.02.110156.
  36. Foster E, Biery N, Dostal J, Larson D. RAFT (Resident Assessment Facilitation Team): supporting resident well-being through an integrated advising and assessment process. Fam Med. 2012;44(10):731-734.
  37. Keister DM, Larson D, Dostal J, Baglia J. The radar graph: the development of an educational tool to demonstrate resident competency. J Grad Med Educ. 2012;4(2):220-226. https://doi.org/10.4300/JGME-D-11-00163.1.
  38. Shaughnessy AF, Gupta PS, Erlich DR, Slawson DC. Ability of an information mastery curriculum to improve residents’ skills and attitudes. Fam Med. 2012;44(4):259-264.
  39. Shaughnessy AF, Sparks J, Cohen-Osher M, Goodell KH, Sawin GL, Gravel J Jr. Entrustable professional activities in family medicine. J Grad Med Educ. 2013;5(1):112-118. https://doi.org/10.4300/JGME-D-12-00034.1.
  40. Lindbloom EJ, Ringdahl E. Resident duty hour changes: impact in the patient-centered medical home. Fam Med. 2014;46(6):463-466.
  41. Keister DM, Hansen SE, Dostal J. Teaching resident self-assessment through triangulation of faculty and patient feedback. Teach Learn Med. 2017;29(1):25-30. https://doi.org/10.1080/10401334.2016.1246249.
  42. Young RA, Casey D, Singer D, Waller E, Carney PA. Early career outcomes of family medicine residency graduates exposed to innovative flexible longitudinal tracks. Fam Med. 2017;49(5):353-360.
  43. Camic PM, Rhodes JE, Yardley L. Qualitative Research in Psychology: Expanding Perspectives in Methodology and Design. Washington, DC: American Psychological Association; 2003. https://doi.org/10.1037/10595-000.
  44. Riessman CK. Narrative Methods for the Human Sciences. Los Angeles, Calif: SAGE Publications; 2008.
  45. Carek PJ. The length of training pilot: does anyone really know what time it takes? Fam Med. 2013;45(3):171-172.
  46. Carney PA, Eiff MP, Waller E, Jones SM, Green LA. Redesign of Residency Training: Final Results from P4. Presentation at 2013 Annual STFM Meeting. http://www.stfm.org/Conferences/AnnualSpringConference/PastConferences/PastAbstracts,Brochures,Handouts,Videos/2013STFMAnnualConference?m=6&s=679. Accessed July 18, 2017.
  47. Gupta R, Dubé K, Bodenheimer T. The road to excellence for primary care resident teaching clinics. Acad Med. 2016;91(4):458-461. https://doi.org/10.1097/ACM.0000000000001100.
  48. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676-682. https://doi.org/10.3109/0142159X.2010.500704.
  49. Mitchell KB, Maxwell L, Miller T. The national graduate survey for family medicine. Ann Fam Med. 2015;13(6):595-596. https://doi.org/10.1370/afm.1874.
  50. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system--rationale and benefits. N Engl J Med. 2012;366(11):1051-1056. https://doi.org/10.1056/NEJMsr1200117.

Lead Author

Patricia A. Carney, PhD

Affiliations: Oregon Health and Science University, Portland, OR

Co-Authors

M. Patrice Eiff, MD - Oregon Health and Science University, Portland, OR

Elaine Waller - Oregon Health and Science University, Portland, OR

Samuel M. Jones, MD - Virginia Commonwealth University-Fairfax Residency Program, Fairfax, VA

Larry A. Green, MD - University of Colorado

Corresponding Author

Patricia A. Carney, PhD

Correspondence: Oregon Health and Science University, 3181 SW Sam Jackson Park Rd MC: FM, Portland, OR 97239. 503-494-9049. Fax: 503-494-2746.

Email: carneyp@ohsu.edu
