ORIGINAL ARTICLES

Applicant Evaluation of Residency Programs in a Virtual Format: A Mixed-Methods Study

Wala Tout, MD | Sonia Oyola, MD, ABOIM | Zakaria Sharif, BS | Emily White VanGompel, MD, MPH

Fam Med. 2022;54(10):804-813.

DOI: 10.22454/FamMed.2022.148473


Abstract

Background and Objectives: The first all-virtual residency application cycle took place in 2021. Virtual programming can reduce cost, time, and travel burden; these may be especially beneficial to applicants with fewer resources and those from underrepresented backgrounds (URM). Little is known about how applicants evaluate key ranking factors, especially in a virtual format. This study aimed to assess how applicants evaluated programs in the virtual cycle.

Methods: We surveyed 271 fourth-year students at three Chicago medical schools after rank-list submission and prior to receiving match results in March 2021. The survey included questions on online content and importance of different ranking factors as well as open-ended questions on how participants evaluated their most important factors. We analyzed quantitative data using descriptive statistics and χ2 tests. We analyzed qualitative data using thematic content analysis.

Results: Applicants cited goodness of fit, geographic location, program reputation, fellowship opportunities, and work/life balance as the top-five most important factors. URM applicants were more likely to prioritize diversity at institution or location (P<.0001). Interactions with residents and faculty and opportunities to observe interprogram dynamics were key to assessing fit but were often limited by the virtual format. Additional emergent themes provided recommendations for future cycles. Program websites and videos were rated as the most important online content types.

Conclusions: This study provides information about how applicants evaluated the factors they deemed most important in assessing and ranking programs, which can help residency programs improve their recruitment efforts.


In 2021, the Coalition for Physician Accountability, a cross-organizational group of major US medical societies, recommended that all programs commit to exclusively virtual interviews in response to the COVID-19 pandemic.1 Though this decision was unprecedented, some programs had experimented with virtual interviews in the past, and many may wish to continue virtual interviews in the future because they offer advantages for both programs and applicants. Chief among these are cost savings, fewer missed education days for students, and an increased ability to apply to distant and/or rural programs.2 These advantages may particularly benefit students from disadvantaged socioeconomic backgrounds and those with increased family or work obligations.

Therefore, understanding how applicants evaluated and ranked programs in the virtual cycle will help programs prepare for future virtual interviews. Existing information on how applicants have evaluated programs in the past is available, in part, from the National Resident Matching Program (NRMP), a private nonprofit organization that assigns applicants to US residency programs based on a matching algorithm. The NRMP quantitatively surveys all applicants every 2 years to evaluate the factors used for program selection and ranking.3 In 2019, the most important factors for program evaluation were (a) goodness of fit, (b) interview day experience, (c) desired geographic location, and (d) quality of residents. House staff morale and work/life balance were also ranked highly. While only 31% and 30% of applicants cited cultural/racial/ethnic diversity at the geographic location and at the institution, respectively, these figures were not broken down by the racial or ethnic background of respondents.3

For primary care programs, which see a higher percentage of female and minority resident applicants, a deeper understanding of applicants’ differential evaluation processes is needed. The relative importance of evaluation factors may vary significantly by race or gender. For example, underrepresented minorities (URM) and female applicants may be more likely to rank faculty and resident diversity as important in their ranking of programs.4-6

Finally, social media and other online content are likely to play a prominent role in the evaluation of programs in an all-virtual match process. Prior to the 2021 virtual cycle, one study reported that residency-based social media accounts impacted program evaluation about half of the time.7 In another study, 63% of applicants used program websites to prepare for interviews but just 24% rated them as important for finalizing a rank order list.8 It is reasonable to expect that these virtual elements may play an even larger role in an exclusively virtual cycle. As a result, some have recommended that residency programs invest in creating or updating program websites, videos, and social media accounts to increase their digital presence;9,10 however, it is not clear to what extent these time-intensive resources are used by applicants to make ranking decisions. This study aimed to explain and quantify how applicants evaluate residency programs for ranking purposes in the virtual cycle, with additional focus on underrepresented minority applicants, so that programs can improve their recruitment strategies in the future.

Methods

We developed an online survey that consisted of closed- and open-ended questions based on existing research.3,7,11-13 In addition to basic demographic questions, participants were asked if they self-identified as an underrepresented minority. They were then asked to identify, in order, the top three factors they found most important in ranking residency programs. Participants could select from a list of 45 factors taken from the NRMP’s biannual national survey.3 For each of these three factors, an open-ended question asked how they evaluated that factor.

We included Likert-style questions about the importance of different types of content in applicants’ evaluation of programs; these were adapted from similar research with anesthesia applicants.7,12 Finally, participants were asked several questions about ranking and interview statistics adapted from a study of orthopedic surgery applicants.13 The survey was pilot tested with a small group of residents and medical students, with revisions made to clarify understanding. This study was reviewed by the NorthShore Institutional Review Board and deemed exempt from further review on February 2, 2021.

The survey was distributed via email to fourth-year medical students at three Chicago-area allopathic medical schools using the REDCap online survey platform. Only those applying through the 2021 NRMP match cycle were eligible for the study. Students applying to subspecialties that had already matched at the time of the survey were excluded. The survey links were sent out on March 4, 2021, after the deadline for rank list submission. The survey closed just prior to Match day on March 18, 2021. Emails were sent by medical student or faculty liaisons at each institution to maximize recruitment. Students had the option to submit their email address upon completion of the survey to receive a $15 gift card after the survey closed.

Quantitative Analysis

We generated descriptive statistics for demographic variables and quantitative questions using STATA version 15.1.14 For the ranking factors, we generated a table of frequencies and chose the top five most frequently selected factors for qualitative analysis of the open-ended responses. We used χ2 tests to compare the distribution of female and URM participants between those who selected cultural/racial/ethnic diversity at geographic location or institution as an important factor and those who did not. We then ran χ2 tests to analyze differences in ranking factor selection by gender and URM status as well as family medicine versus other applicants; factors selected by fewer than five participants were excluded from these analyses. We considered P values less than .05 significant. We performed statistical analyses using SAS version 9.4.15
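For readers unfamiliar with the method, the χ2 test of independence described above can be sketched in a few lines. This is a minimal illustration only: the counts below are hypothetical (they are not the study's data), and the study itself used SAS rather than Python. The sketch computes the Pearson χ2 statistic by hand for a 2×2 table (URM status by factor selection) and compares it against the df=1 critical value at α=.05.

```python
# Minimal sketch of a chi-square test of independence on a 2x2 table.
# All counts below are hypothetical, for illustration only.

def chi2_2x2(table):
    """Pearson chi-square statistic (no continuity correction) for a 2x2 table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = URM / non-URM participants,
# columns = selected the diversity factor / did not select it.
table = [
    [30, 40],
    [20, 181],
]

stat = chi2_2x2(table)
# Critical value for df=1 at alpha=.05 is 3.841; a larger statistic means P < .05.
print(f"chi2 = {stat:.2f}; significant at .05: {stat > 3.841}")
```

In practice a statistics package (SAS, Stata, or `scipy.stats.chi2_contingency`) would also return an exact P value and apply a continuity correction where appropriate; the hand computation above shows only the core expected-versus-observed comparison.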

Qualitative Analysis

The first set of ranking factors was analyzed by all four authors, who generated codes inductively using a line-by-line process. We independently analyzed comments and applied codes, generating new codes when necessary, and met regularly to iteratively resolve coding discrepancies. Subsequently, the remaining factors were analyzed by two of the authors (W.T. and Z.S. or W.T. and E.W.V.). All four authors met to organize codes into themes and subthemes. To ensure credibility of findings, we used several best practices for qualitative analyses. All members of the research team were engaged with the content area of medical education and the NRMP match process; specifically, as medical school faculty and applicant mentor (S.O.), residency faculty (E.W.V.), current resident and recruitment chair (W.T.), and current medical student and future applicant (Z.S.). Using, at minimum, two data coders for every transcript minimized individual bias. We used Dedoose 9.0.46 software16 to track codes and the movement from codes to themes to ensure consistency and document decisions.

Results

Characteristics of Participants

A total of 271 responses were received, representing a 56.5% response rate. The sample included 133 males and 138 females (49.1% and 50.9%, respectively). The majority of participants were aged 25-29 years (n=224, 82.7%). One-quarter identified as underrepresented minorities in medicine (n=70, 25.9%). The five most popular specialties among our sample were, in order from most to least common: internal medicine, pediatrics, emergency medicine, family medicine, and general surgery. A summary of sample characteristics is shown in Table 1.

Ranking Factors

Overall, goodness of fit was the most frequently cited ranking factor, with 67.5% (n=183) of participants identifying this as one of their top-three most important factors used to evaluate and rank programs. Desired geographic location was also in the top three for more than half of our sample (n=149, 55.0%). The next three most cited factors were, in order: reputation of the program (n=44, 16.2%), future fellowship training opportunities (n=39, 14.4%), and work/life balance (n=39, 14.4%). The top 18 factors are presented in Table 2.

Similar to the overall sample, a majority of applicants to family medicine selected goodness of fit (n=23, 85.2%) and geographic location (n=14, 51.9%) as important ranking factors. Work/life balance was the third most cited factor, selected by 22.2% (n=6) of these applicants. Additionally, applicants to family medicine were more likely to select community-based setting as an important factor, compared with all other applicants (P=.001); none selected fellowship training opportunities as an important factor, although this difference was not statistically significant (P=.05). Otherwise, they did not differ significantly from the study sample.

Participants identifying as URM were more likely to select cultural/racial/ethnic diversity at geographic location or institution as an important factor (P<.0001). They were less likely to select academic medical center program as important (P=.019). Similarly, those identifying as female were more likely to select diversity at the institution, though not at geographic location, as an important factor (P=.003). They were less likely than males to select balance between supervision and responsibility as an important factor (P=.003). There were no other significant differences in rank factor selection.

For all factors analyzed, between 75% and 89.7% of respondents provided open-ended comments on how they evaluated each factor. We have detailed the findings of the thematic analysis for the top five ranking factors below. These are summarized along with exemplary quotes in Table 3.

Goodness of Fit (GOF)

Although many elements of the program and interview day were cited among these responses, three key themes emerged.

GOF Theme 1: Interactions With Residents and Faculty. Applicants described relying heavily on their own interactions with residents and faculty at interviews to determine how well they might fit in at a program. More casual or informal discussions were felt to be particularly helpful in offering a more accurate or genuine view of individual personalities or overall culture. This was seen as difficult to assess remotely.

GOF Theme 2: Observing Resident Cohesion/Dynamics With Other Residents and Faculty. Applicants appreciated opportunities to observe residents and faculty interacting with each other as a way to assess program culture. For example, they valued watching residents interact with each other during virtual interview dinners or socials and tried to assess how well they seemed to know and like each other.

GOF Theme 3: Applicant’s Affective State or “Vibe”/“Gut Feeling.” Applicants described relying on a gut feeling or “vibe” to judge their fit with programs rather than an overall score or overview of individual elements. They tried to assess their affective state after interviews or program social events and to visualize themselves working in the program environment.

Geographic Location

About half of applicants designating this factor explicitly cited proximity to home, family, or friends in their explanation of how they evaluated geographic location. Two key evaluation themes emerged. First, applicants primarily relied on their own research into or preexisting knowledge about a location rather than information from the program. Second, many sought a program in a large city or urban environment, which they saw as offering more opportunities for leisure and greater diversity, with increased chances of finding members of one's own affinity groups.

Program Reputation

Applicants described relying heavily on explicit program rankings such as those published by Doximity17 or US News and World Report18 to gauge program reputation. They also relied on the opinions of attending and resident physicians at their home institution.

Future Fellowship Training Opportunities

Applicants described looking at the number or proportion of a program’s residents and alumni who had matched to fellowships. They found it helpful to have this information easily accessible, for example on the program website. A few also cited availability of their fellowship of interest at the residency institution.

Work/Life Balance

Two key themes emerged from this analysis. First, total work hours and distribution were very important. Applicants gathered this information by asking direct questions about the typical residency schedule, including duty hours and call schedule, on the interview day. Second, they tried to gauge resident happiness and satisfaction through both group and one-on-one interactions. Some applicants described checking to see how many existing interns showed up for program interview events and how tired they appeared on camera.

Diversity at Institution and Location

Cultural/racial/ethnic diversity at institution and cultural/racial/ethnic diversity at geographic location were the 10th and 12th most frequently cited factors, respectively. In evaluating diversity at institution, applicants described attempts to quantify the number of residents and faculty “of color” at the institution by observing those present at interview days and social events as well as reviewing online rosters. In some cases, they asked specific questions about support for URM trainees or paid specific attention to whether residents and faculty of color were represented in positions of leadership or power. To assess diversity at location, applicants described paying attention to the diversity of the local population and specifically that of the patient population served by program trainees.

Additional Themes

Ninety-one applicants provided additional comments in response to an optional open-ended question at the end of the survey and we analyzed these for additional themes.

Perspective on Virtual Interactions. Many applicants reflected on their opinion of the virtual process, including both interviews and other programming such as social events. A majority commented that they found the process challenging, describing it as “awkward” and limited in opportunity to assess interpersonal dynamics or fit. Some tried to arrange one-on-one phone calls with residents to compensate. Among those who described the process as positive, they cited advantages including time and cost savings, and felt the virtual format was sufficient to evaluate programs. Other applicants described mixed feelings on the virtual process. Several respondents described concern that the virtual process exacerbated unequal interview distribution or hoarding among applicants.

Suggestions for Improvement. Many applicants provided suggestions for improvement. Applicants appreciated program websites with accurate and complete information they could reference before and after interviews. Technological errors during interview day were often seen as reflecting poorly on the program’s organization and quality. There was wide variation in perceived value of virtual social events, such as dinners with residents. Applicants preferred when these events were well organized, interactive, and had a higher number of residents present. Many applicants suggested that programs could or should continue virtual interviews in the future but provide optional in-person social or second-look events.

Quantitative Survey Questions: Online Content

A majority of participants used the various types of online content provided by programs. For example, nearly all participants (n=261, 99.6%) used program videos to evaluate programs, with 46.7% (n=122) rating them as very important or extremely important in evaluating a program. Similarly, over 90% of participants used residency program websites and handouts as well as information found on other medical sites and forums (Table 4). In comparison, 85.9% (n=225) of applicants relied on official program social media accounts, with just 16.4% (n=37) rating these as important. Only 66.4% (n=174) of participants used social media accounts of individual residents or faculty and a minority (6.3%, n=11) felt they were important.

Among all resources, residency program websites were the most valuable, with 57.9% (n=150) of applicants rating these as either very important or extremely important in evaluating a program.

Discussion

In this study, we surveyed applicants of the 2021 NRMP Match to determine how they evaluated programs in the first-ever completely virtual residency application cycle. The top five most frequently cited ranking factors in our study were: goodness of fit, geographic location, program reputation, fellowship training opportunities, and work/life balance. This list is similar to the national sample of US allopathic medical school (MD) seniors as published in the 2021 NRMP report.19 Surprisingly, it is also similar to the lists published in the 2019 NRMP report and in several other studies, all conducted prior to the implementation of virtual application cycles.3-6 Although applicants could not observe or evaluate programs the same way in a virtual cycle, they still prioritized similar factors in their decision-making. Specifically, goodness of fit remained the top-ranking factor, deemed more important than more tangible or easily defined variables. Applicants likely evaluated this factor differently in a virtual cycle; however, direct comparisons cannot be drawn because our study went further than past literature by asking applicants to reflect on how they evaluated these key ranking factors.

Applicants valued opportunities to interact with many residents and to watch residents and faculty interact with each other so they could assess program culture and fit. These findings emphasize the importance of providing opportunities for applicants to observe such interactions in small groups. It is challenging to make a definite conclusion regarding virtual social events, as these were particularly polarizing. Some felt they could accurately evaluate programs and appreciated the convenience of a remote process. However, many found it challenging to assess interpersonal dynamics and cited frustration with low attendance, technical errors, and poor organization. Several applicants suggested that in the future, programs could offer virtual interviews with the option for in-person visits or second-look events for interested applicants.

The preference for diversity among female and URM applicants demonstrates that programs need to critically evaluate their existing diversity and strive to recruit and retain URM faculty and residents. Intentional efforts have been shown to be successful even in an all-virtual format.20,21 Websites should be updated with accurate faculty and resident rosters that do not over- or underrepresent diversity. The preference for location diversity was echoed in descriptions of geographic location evaluation in general, with a preference for larger metropolitan areas that could be relied upon to have greater diversity. It should be noted that our sample was recruited from Chicago-area medical schools and thus, this preference may not be shared by students from suburban or rural schools. Applicants also prioritized the diversity of their prospective patients; thus, programs should provide accurate information on patient population and demographics.

Virtual interviews may mitigate resource disparities, but may also exacerbate them. Several applicants noted that virtual interviews reduced costs and travel burden and leveled the playing field for those with fewer resources. However, there were also concerns that eliminating these barriers increased interview hoarding by some applicants who applied to and interviewed at more programs. The number of applications submitted each year has been steadily rising in most specialties, including a modest increase in family medicine in 2021.22 Efforts to reduce the number of applications and limit interview hoarding are challenging and require specialty-wide coordination, yet they have the potential to significantly decrease the anxiety, inefficiency, and cost of the process.23 Also, while rural programs theoretically benefit from lower costs and expanded reach, the virtual process may also limit their success in recruiting candidates. In the absence of on-the-ground experiences and in-person interactions, applicants may view urban, well-known, or familiar settings as more reliable or safe choices.

Our quantitative survey findings indicate that program websites are among the most important sources of information for applicants; however, applicants’ information needs may not be met by existing websites,24-26 which was echoed in our qualitative findings. We recommend that programs invest in improvements to websites to ensure that they offer clear and accurate information on curriculum, faculty and resident rosters, typical resident schedule, and the fellowship and job matches of recent alumni. Many residency programs, including those in family medicine, created new social media accounts after the start of the pandemic in 2020.27-29 However, though many of our participants accessed this content, few found it important in evaluating programs.

Our study has limitations. It consisted exclusively of US MD seniors, who made up 52.4% of all filled PGY-1 positions and 36.1% of all filled family medicine slots in 202130,31; thus, these findings may be less applicable to a non-US, non-MD applicant pool. Our participants were recruited from three Chicago-based medical schools, and may not be representative of MD seniors in other parts of the country. Additionally, the impact of the pandemic itself on the experience of the application process cannot be isolated from our findings.

Our study surveyed graduating US MD seniors in the first virtual recruitment cycle and found that applicants prioritized many of the same ranking factors as applicants in prior cycles. Our qualitative findings provide more insight into applicant decision-making by revealing common themes in how they assessed these ranking factors. Programs should consider continuing to offer virtual interviews in the future as they provide cost and time savings to both programs and applicants. However, they may need to hone existing strategies, including updating websites with accurate information to support applicant decision-making, offering in-person dinners or second-look events when possible, prioritizing opportunities to observe resident-resident and resident-faculty interactions, and critically evaluating their presentation of diversity. Future research should survey applicants from diverse educational and geographic backgrounds, and could also evaluate satisfaction and regret among current residents who matched through the virtual process.

Acknowledgments

The authors acknowledge Lavisha Singh, MPH, at the NorthShore University HealthSystem Research Institute for her assistance with statistical analysis.

Financial Support: This study was supported by a grant from the NorthShore University HealthSystem Outcomes Research Network (ORN) Student/Resident/Fellow (SRF) Small Project Grant Fund.

Presentations: This study was presented as a work in progress at the virtual annual meetings of the North American Primary Care Research Group in November 2020 and 2021. It was also presented at the Society of Teachers of Family Medicine Annual Spring Conference in Indianapolis, Indiana in May 2022.

References

  1. The Coalition for Physician Accountability’s Work Group on Medical Students in the Class of 2021 Moving Across Institutions for Post Graduate Training. Final Report and Recommendations for Medical Education Institutions of LCME-Accredited, US Osteopathic and Non-US Medical School Applicants. Accessed November 29, 2020. https://physicianaccountability.org/wp-content/uploads/2020/05/Workgroup-C-2020.05.06-Final-Recommendations_Final.pdf
  2. Edje L, Miller C, Kiefer J, Oram D. Using Skype as an alternative for residency selection interviews. J Grad Med Educ. 2013;5(3):503-505. doi:10.4300/JGME-D-12-00152.1
  3. National Resident Matching Program, Data Release and Research Committee. Results of the 2019 NRMP Applicant Survey by Preferred Specialty and Applicant Type. National Resident Matching Program; 2019.
  4. Phitayakorn R, Macklin EA, Goldsmith J, Weinstein DF. Applicants’ self-reported priorities in selecting a residency program. J Grad Med Educ. 2015;7(1):21-26. doi:10.4300/JGME-D-14-00142.1
  5. Kroin E, Garbarski D, Shimomura A, Romano J, Schiff A, Wu K. Gender differences in program factors important to applicants when evaluating orthopaedic surgery residency programs. J Grad Med Educ. 2019;11(5):565-569. doi:10.4300/JGME-D-18-01078.1
  6. Huntington WP, Haines N, Patt JC. What factors influence applicants’ rankings of orthopaedic surgery residency programs in the National Resident Matching Program? Clin Orthop Relat Res. 2014;472(9):2859-2866. doi:10.1007/s11999-014-3692-9
  7. Renew JR, Ladlie B, Gorlin A, Long T. The impact of social media on anesthesia resident recruitment. J Educ Perioper Med. 2019;21(1):E632.
  8. Chu LF, Young CA, Zamora AK, et al. Self-reported information needs of anesthesia residency applicants and analysis of applicant-related web sites resources at 131 United States training programs. Anesth Analg. 2011;112(2):430-439. doi:10.1213/ANE.0b013e3182027a94
  9. Bhayani RK, Fick L, Dillman D, Jardine DA, Oxentenko AS, O’Glasser A. Twelve tips for utilizing residency program social media accounts for modified residency recruitment. MedEdPublish. 2020;9(1):1-21.
  10. Ashrafzadeh S, Nambudiri VE. Fostering certainty in an uncertain era of virtual residency interviews. J Grad Med Educ. 2020;12(5):561-565. doi:10.4300/JGME-D-20-00503.1
  11. Jena AB, Arora VM, Hauer KE, et al. The prevalence and nature of postinterview communications between residency programs and applicants during the match. Acad Med. 2012;87(10):1434-1442. doi:10.1097/ACM.0b013e31826772a6
  12. Chu LF, Young CA, Zamora AK, et al. Self-reported information needs of anesthesia residency applicants and analysis of applicant-related web sites resources at 131 United States training programs. Anesth Analg. 2011;112(2):430-439. doi:10.1213/ANE.0b013e3182027a94
  13. Ramkumar PN, Navarro SM, Chughtai M, Haeberle HS, Taylor SA, Mont MA. The orthopaedic surgery residency application process: an analysis of the applicant experience. J Am Acad Orthop Surg. 2018;26(15):537-544. doi:10.5435/JAAOS-D-16-00835
  14. StataCorp. Stata Statistical Software: Release 15. StataCorp LLC; 2017.
  15. SAS Institute Inc. SAS/ACCESS® 9.4 Interface to ADABAS: Reference. SAS Institute Inc; 2013.
  16. Dedoose Version 9.0.46, web application for managing, analyzing, and presenting qualitative and mixed method research data. SocioCultural Research Consultants, LLC; 2021. www.dedoose.com
  17. Doximity: The Medical Network. Accessed January 10, 2022. https://www.doximity.com/
  18. Find the Best Medical Schools. US News and World Report. Accessed January 10, 2022. https://www.usnews.com/.
  19. National Resident Matching Program, Data Release and Research Committee. Results of the 2021 NRMP Applicant Survey by Preferred Specialty and Applicant Type. National Resident Matching Program; 2021.
  20. Hoff ML, Liao NN, Mosquera CA, et al. An initiative to increase residency program diversity. Pediatrics. 2022;149(1):e2021050964. doi:10.1542/peds.2021-050964
  21. Stoesser K, Frame KA, Sanyer O, et al. Increasing URiM family medicine residents at University of Utah Health. PRiMER Peer-Rev Rep Med Educ Res. 2021;5:42. doi:10.22454/PRiMER.2021.279738
  22. ERAS Statistics. AAMC. Accessed April 1, 2022. https://www.aamc.org/data-reports/interactive-data/eras-statistics-data
  23. Hammoud MM, Winkel AF, Strand EA, et al. Stakeholder perspectives on standardizing the residency application and interview processes. J Surg Educ. 2021;78(4):1103-1110. doi:10.1016/j.jsurg.2020.11.002
  24. Ashack KA, Burton KA, Soh JM, et al. Evaluating dermatology residency program websites. Dermatol Online J. 2016;22(3):13030/qt7rx3j2dn. doi:10.5070/D3223030367
  25. Patel SJ, Abdullah MS, Yeh PC, Abdullah Z, Jayaram P. Content evaluation of physical medicine and rehabilitation residency websites. PM R. 2020;12(10):1003-1008. doi:10.1002/pmrj.12303
  26. Stoeger SM, Freeman H, Bitter B, Helmer SD, Reyes J, Vincent KB. Evaluation of general surgery residency program websites. Am J Surg. 2019;217(4):794-799. doi:10.1016/j.amjsurg.2018.12.060
  27. Pasala MS, Anabtawi NM, Farris RL, et al. Family medicine residency virtual adaptations for applicants during COVID-19. Fam Med. 2021;53(8):684-688. doi:10.22454/FamMed.2021.735717
  28. Clay Pruett J, Deneen K, Turner H, et al. Social media changes in pediatric residency programs during COVID-19 pandemic. Acad Pediatr. 2021;21(7):1104-1107. doi:10.1016/j.acap.2021.06.004
  29. Bram JT, Jia L, Huffman W, Ahn J. Orthopaedic surgery residency program social media presence during the COVID-19 pandemic. JB JS Open Access. 2021;6(4):e21.00073.
  30. National Resident Matching Program. Results and Data: 2021 Main Residency Match®. National Resident Matching Program; 2021.
  31. American Academy of Family Physicians. 2022 Match Results for Family Medicine. Accessed March 30, 2022. https://www.aafp.org/dam/AAFP/documents/medical_education_residency/the_match/AAFP-2022-Match-Results-for-Family-Medicine.pdf

Lead Author

Wala Tout, MD

Affiliations: University of Chicago NorthShore Family Medicine Residency Program, Glenview, IL

Co-Authors

Sonia Oyola, MD, ABOIM - University of Chicago Pritzker School of Medicine, Chicago, IL

Zakaria Sharif, BS - University of Illinois College of Medicine, Chicago, IL

Emily White VanGompel, MD, MPH - University of Chicago NorthShore Family Medicine Residency Program, Glenview, IL | University of Chicago Pritzker School of Medicine, Chicago, IL | and NorthShore University HealthSystem Research Institute, Evanston, IL

Corresponding Author

Wala Tout, MD

Correspondence: 1001 University Pl, Evanston, IL 60201. 224-364-7303. Fax: 224-364-7319.

Email: toutwala@gmail.com
