Background and Objectives: Family medicine implemented program signals and geographic and setting preferences in the 2023–2024 residency application cycle. We performed a qualitative study with the following aims: (a) describe residency program experiences with implementation of signaling and preferences; and (b) identify opportunities for applicants, advisors, residency leadership, and policymakers to optimize these two programs.
Methods: This qualitative study used the RE-AIM framework (reach, effectiveness, adoption, implementation, and maintenance) to guide interviews of family medicine program faculty from the Midwest United States between January and April 2024. We analyzed data using thematic analysis.
Results: We interviewed 21 faculty members. About half of respondents somewhat or strongly agreed that program signals (10, 48%) and geographic and setting preferences (11, 52%) added value to the current system. We identified four themes: (1) Faculty adopted signals and preferences strategically to complement their existing application review strategies; (2) Signals were perceived as reducing application volume and burden; (3) Signals did not impact diversity and equity, but geographic preferences may benefit community health; (4) Modifications to signals and preferences are recommended to optimize use in family medicine.
Conclusions: Program faculty implemented signals and preferences into holistic review to reduce application review burden. Signals and preferences should support the unique experiences of family medicine residencies and needs for primary care physician workforce development. Future research should focus on refining signals and preferences and their impact on match outcomes and Supplemental Offer and Acceptance Program participation rates.
In the context of significant recent increases in application volume and burden on residency programs, the Association of American Medical Colleges (AAMC) introduced program signals to allow residency applicants to express genuine interest in training programs.1-3 Geographic and setting preferences allow applicants to share up to three of nine U.S. census divisions and their preferred rural, suburban, and urban settings, with free-text narrative explanations of their selections.4 Program signaling was first used by otolaryngology during the 2020–2021 residency application cycle2 and expanded to a total of 22 specialties in the 2022–2023 cycle. Studies evaluating the impact of program signaling in other specialties have identified associations with decreased application volume and increased interview offers,5,6 as well as largely positive experiences among applicants and residency programs incorporating signaling into holistic review to identify residents who may be a good fit for a program.7
Family medicine adopted program signaling in the most recent 2023–2024 application cycle, allowing applicants five signals, similar to pediatrics (5 signals) and internal medicine (7 signals in 2023–2024). Family medicine is the largest and most widely distributed primary care specialty in the United States, with 92% of residency program graduates practicing primary care.8 While family medicine residency programs and training slots are expanding to address the shortage of primary care physicians, programs are challenged by increasing numbers of unfilled residency positions compared with higher-paying medical specialties.9 Because the majority of family physicians work within 100 miles of their residency training site10 and serve the diverse needs of underserved populations and communities, the impact of program signaling and geographic preferences on family medicine training outcomes may differ from that in other specialties. A preimplementation study of family medicine program directors indicated that program signals were perceived as a positive factor in holistic review and that geographic ties were also important in selecting interview candidates.11 While signals and preferences are promising evidence-based interventions to assist programs in selecting appropriate residency candidates, little is known about family medicine program experiences, specifically how and why programs chose their implementation strategies and what impact these tools have on other outcomes of residency training, such as diversity and community health.
To address this gap, we performed a qualitative study to understand challenges and identify and share best practices. Our primary aim was to describe program experiences with implementation of signaling and geographic and setting preferences. Our secondary aim was to identify opportunities for applicants, advisors, programs, and policymakers to optimize the use of signals and preferences.
Methods
Study Design and Theoretical Framework
This was a qualitative study of program directors and other faculty responsible for recruitment at Midwest U.S. family medicine residency programs. The study was informed by the RE-AIM (reach, effectiveness, adoption, implementation, and maintenance) framework. We conceptualized program signaling and geographic and setting preferences as evidence-based innovations from other specialties now being implemented in family medicine. We selected questions from the RE-AIM domains relevant to our research question: reach, to understand the target population signaling might help; effectiveness, to understand perceived outcomes of implementation; adoption, to understand how programs incorporated signaling and preferences; and maintenance, to derive recommendations for modifications. This study was deemed not regulated by the institutional review boards of the University of Michigan (HUM00243040) and Rush University (Non-Human Subjects 118). The description of this study followed the Consolidated Criteria for Reporting Qualitative Research (COREQ).12
Setting and Participant Recruitment
The study originated within the Family Medicine Midwest Foundation (FMMF) Scholarly Activity Collaborative. The aim of FMMF is to build a strong family medicine workforce to provide high-quality, comprehensive care for the people of our region. We recruited family medicine residency program directors and faculty from the existing FMMF database of program directors in this region, which aligns geographically with the AAMC North Central area.4 We emailed the 117 of 118 program directors with available email addresses up to three times between January and March 2024.
A total of 24 respondents indicated interest in participating in a qualitative interview and completed a screening questionnaire with residency program and respondent demographic data via a Qualtrics survey. We added screening questions to inform the direction of the qualitative interview guide (Appendix A). We then used a maximum variation purposeful sampling technique13 to ensure that subjects represented characteristics of interest (state, rurality, community, program size, and academic status). A total of 21 respondents completed an interview (88%). Three respondents were unable to schedule an interview because of time constraints or declined to participate.
Qualitative Data Collection and Analysis
We integrated key constructs from the RE-AIM framework,14 guided by answers to the screening questionnaire, into a 15-item semistructured qualitative interview guide (Appendix B). Two team members (L.O., L.A.) conducted 21 interviews between January and April 2024. Following verbatim transcription, team members (L.O., L.A., L.H., J.P., S.W.) open-coded the initial interviews using Dedoose software. Two team members (L.H., L.O.) created a codebook informed by RE-AIM constructs. After 10 interviews, no additional codes were identified; additional interviews were conducted to achieve maximum diversity of participant characteristics. We analyzed the data using thematic analysis because of its flexibility in supporting inductive and deductive coding and its systematic engagement with the data to iteratively construct latent themes from the coded data.15 The team convened regularly and used discussion to refine the final themes.16
Research Team and Reflexivity
Our study team included family medicine educators with experience in academic and community settings and with primary degrees in both medicine and health professions education. Both interviewers (L.O., L.A.) have extensive experience in qualitative research and interviewing techniques. When possible, we assigned an interviewer who had no existing relationship with the interviewee.
Results
Of the 21 participants, the greatest percentage of respondents were from Illinois (7, 33%), identified their role as program director (15, 71%), and represented a community-based, university-affiliated program (13, 62%). Many programs served medically underserved populations (16, 76%), and more than half were located in urban (12, 57%) or suburban (11, 52%) areas. On average, respondents allocated 26% of interview spots to candidates who sent a preference signal, with a range from 5% to 60%. About half of respondents somewhat or strongly agreed that program signals (10, 48%) and geographic and setting preferences (11, 52%) add value to the current system. Few strongly or somewhat agreed that program signals (3, 14%) and geographic and setting preferences (1, 5%) will have a positive impact on equity, diversity, and inclusion. Additional respondent characteristics are shown in Table 1.
Table 1: Respondent and Program Characteristics (N=21)

| Demographics | n (%) |
|---|---|
| State | 21 (100) |
| Illinois | 7 (33) |
| Michigan | 4 (19) |
| Wisconsin | 5 (24) |
| Minnesota | 2 (10) |
| Kansas | 2 (10) |
| Missouri | 1 (5) |
| Role | |
| Program director | 15 (71) |
| Associate program director | 3 (14) |
| Other | 3 (14) |
| Program type | |
| Community-based, university affiliated | 13 (62) |
| Community-based, nonaffiliated | 3 (14) |
| University-based | 5 (24) |
| Program characteristics | |
| Serves medically underserved population | 16 (76) |
| Serves medically underserved area | 11 (52) |
| Serves primary care health professional shortage areas | 5 (24) |
| Teaching health center program | 3 (14) |
| Program community* | |
| Located in an urban area | 12 (57) |
| Located in a suburban area | 11 (52) |
| Located in a rural area | 3 (14) |
| Signaling and preference data | Average (SD) or n (%) |
| Percent of interview spots allocated to candidates who sent a preference signal | 27 (14.3) |
| Program (preference) signaling is a positive change to the current system (strongly or somewhat agree) | 11 (52) |
| Program (preference) signaling added value to our current recruitment process (strongly or somewhat agree) | 10 (48) |
| Program (preference) signaling has a positive impact on equity, diversity, and inclusion (strongly or somewhat agree) | 3 (14) |
| Geographic and setting preferences are a positive change to the current system (strongly or somewhat agree) | 11 (52) |
| Geographic and setting preferences added value to our current recruitment process (strongly or somewhat agree) | 11 (52) |
| Geographic and setting preferences have a positive impact on equity, diversity, and inclusion (strongly or somewhat agree) | 1 (5) |
Qualitative Themes
We identified four themes with examples in Table 2: (1) Program faculty adopted the signaling programs strategically to complement their existing application review strategies; (2) Respondents perceived that program signals were implemented to reduce the volume of applications and lessen the associated burden; (3) Program signals did not impact diversity and equity, but geographic preferences may benefit community health; and (4) Modifications to the signaling program are recommended to optimize use in primary care specialties.
Table 2: Themes, Subthemes, and Representative Quotes

Theme 1: Program directors used the signaling programs strategically to complement their existing application review strategies (Domain: Implementation)
- Limited use: “Because it was the first year that this was allowed, we didn’t really know what to do with it. So, we didn’t want to place a lot of emphasis on it. And then, also, it was to give equal opportunity to, like, everybody we were interviewing.” (P04) “Yeah, I think because this is the first year, I don’t think we placed a huge level of importance on those signals. We didn’t use them, like, as a cutoff or anything like that.” (P15)
- Part of holistic review: “I had a pretty decent idea how we were going to incorporate signaling into our process. We do holistic review and utilize a scoring scheme or rubric to go through the ERAS applications. This was just one additional element out of many, many, many that got considered, and I was happy to have it.” (P19)
- Identify genuine interest: “[It] has made it easier for us to figure out which applicants are genuinely interested in our program. We were definitely highly motivated to talk to the people that we thought were the most interested in our program.” (P01) “Oh, I thought it was great that they added that. I’ve always felt like I could use a lot more information to separate one person from another. And I think the other thing we always struggle with is trying to figure out who actually is interested in my program. . . . Signals are going right in there at the front. You’ve only got five of them, and you use one on me. That means something to me. And, so, it didn’t get you points later, but it got your foot in the door where I was going to at least review your application.” (P18)
- Stratify middle, lower tier, and wait list: “I would describe how we went about using the signals as a way to slightly further inform our decisions of who to offer interviews to in the lower tier of applicants when it got to the point of the folks that were not clearly ones we were absolutely going to want to invite for interviews, but in that big sort of like middle of the road, other tier of applicants, where everybody looks almost essentially the same on our scoring rubric. And you’ve got some that maybe signaled us and some that didn’t, and trying to prioritize who is really likely to accept an interview invitation. Probably somebody that signaled us. So, as long as the scores were within a similar range, we prioritize invitations in that way.” (P20) “And so I think we were being a little bit generous with some of them to say, ‘Oh, you know they are not the best, but they signaled us, so let’s give him a chance.’ . . . But I think we just have to be a little bit more objective in terms of, like, what’s our interview criteria? And maybe it has the potential to work. But I felt like we ended up dinging a lot of people that signaled us.” (P03)

Theme 2: Signals were perceived as reducing application volume and burden (Domain: Effectiveness)
- Application volume decreased: “Hey! Let’s really, you know, apply to where you really feel like we see yourself going. And, so, I think that that’s reflected in the numbers decreasing of applications that come through.” (P14)
- Applicant quality unchanged: “It wasn’t a significant change, but we had perhaps a slightly decreased number of applicants this year, and despite that the quality of the top applicants was exactly the same.” (P19)
- Burden on staff reduced: “My impression is that the problem folks were trying to solve was trying to identify those applicants that had true interest in a program, and as opposed to being just part of the application bloat and the wide net that applicants are casting.” (P19) “I think the interview process is so manpower intensive. It’s a huge number of hours during interview season. It’s always very stressful for our faculty. So, I think if this is developed more it could eventually be used to sort of reduce the number of hours that we’re spending interviewing and recruiting people who are really not interested in coming to us.” (P15)

Theme 3: Program signals did not impact diversity and equity, but geographic preferences may benefit community health (Domain: Effectiveness)
- Identifying fit with program mission: “So sometimes you could sense a value of community engagement, of serving the underserved, of some of those pieces that align well with our programs, mission, and vision. And so there’s a lot of them you can’t read anything into. But sometimes you get those kind of cues that somebody has values that align.” (P11) “We did not want to alter substantially the type of applicants that we were interviewing, so we didn’t want to place preference on signaling and then end up interviewing a lot of individuals who maybe don’t really fit with the mission and vision of the program. So, we used our application screening process to identify individuals who maybe were more in line with the mission and vision of the program. And then went from there. And you know, obviously, you’re going to have, I mean, we end up with scores from 2 to whatever, right, so some of the people as we go down the list, then, are not as in line with the mission and values of the program, which is when we took the signaling into consideration.” (P09)
- Prioritizing diversity and URM applicants: “The problem that we are trying to solve is diversity and getting underrepresented folks into our residency. So how can we do that? Signaling is not gonna solve that problem for us. But that’s the problem that at least I was trying to solve.” (P05)
- Geographic ties to community: “I think the geographic is kind of helpful, because you often wonder, like, why is this person from Canada, who went to medical school in Florida, wanting to come to the Midwest, so that can sometimes allow them to, like, say, like, ‘Oh, my partner’s family is here,’ or things. Sharing that without asking questions that sometimes we don’t want to ask, because we don’t want to get into restricted question territory.” (P17) “I thought I actually gained a little bit more insight into some candidates from the geographic setting preferences based on their free text input that they could include about what some of their values are and how that did or didn’t align with our program, or people who put no signal and would expound upon that.” (P11)

Theme 4: Adaptations to the signaling program are recommended to optimize use in primary care specialties (Domain: Effectiveness)
- Optimal number of family medicine signals: “I think five signals is quite a bit. I think it would be more useful to us with a fewer number of signals, and we could know that it’s really meaningful. But I think when you look at the average candidate and family medicine, most of them match into one of their top three to four programs.” (P15) “One of the things that I would love to see is that there is a maximum number of signals, regardless of specialty. So, I mean, that’s the hard part is when you are applying to diagnostic radiology. I don’t even know if they’re participating, but they get their five signals with family medicine. That’s unfair to the people who really want family medicine, you know.” (P09)
- Total multispecialty caps on signals: “I do think five is probably a good number. One of the things that I would love to see is that there is a maximum number of signals, regardless of specialty.” (P09) “I think the biggest drawback to the program signals was that you get the full allotment of signals for every specialty you apply to when you apply to multiple specialties. So, there were at least a few where this really felt like they were applying to OB or dermatology or something else. . . . I would say you need to designate a primary specialty of interest and you only get the signals for that specialty. I’m fine if you want to apply to family medicine as a backup, but you shouldn’t get the same five signals that somebody who’s only applying to family medicine gets.” (P08)
- Eliminate multispecialty signaling: “Maybe folks should only be able to signal one specialty. You know, from a program director standpoint, boy, that sounds great; I understand from a student standpoint sometimes people really are just still trying to figure it out, and I understand that. But I am a little worried about the gamesmanship that students are going to play given this new tool, and I really worried about the fact that somehow we missed in their application the fact that we used a lot of effort trying to identify red flags and not waste that really valuable interview time resource on folks that really aren’t committed to family medicine.” (P19) “And I don’t know how much I want to actually use it. . . . I think the one issue with signaling is they get a number of signals per program that they’re applying to. So, it doesn’t help us if someone is really trying to apply to, like, radiation oncology, and this is their backup. Because we have no idea if that’s what they’re doing.” (P12)
Theme 1. Program faculty adopted the signaling programs strategically to complement their existing application review strategies. Some respondents expressed doubt about the benefits of program signals and chose not to change their interview selection strategy for this first year of implementation, noting “we didn’t use them, like, as a cut off or anything like that” (P15). Other respondents described integrating program signals into holistic review. As one respondent noted, “I think our emphasis was really trying to think holistically, and these were just additional puzzle pieces that helped us figure out how to do that” (P02), explaining that the signal was weighted alongside other elements in a scoring rubric for holistic review. Other respondents described using program signals to prioritize applicants who might be a good fit with their program, noting “[It] has made it easier for us to figure out which applicants are genuinely interested in our program” (P01). Respondents used this data to adjust the priority of application review or interview offer: “I would describe how we went about using the signals as a way to slightly further inform our decisions of who to offer interviews to in the lower tier of applicants” (P20).
Some respondents used signals and preferences to allocate interviews to applicants in the middle or lower tiers of candidates after holistic review, as one respondent explained, “Because we had many more people than we could interview in that group is when we started prioritizing program signaling and geographic signaling over those who did neither” (P20). One respondent described this strategy as a successful way to invite applicants off the wait list: “I matched two residents who came off of my wait list for interviews because they signaled me, and they matched in our program after being ranked fairly high” (P19). One respondent described an unsuccessful strategy: “We were being a little bit generous with some of them to say, ‘Oh, you know they are not the best, but they signaled us, so let’s give him a chance’ . . . but I think we just have to be a little bit more objective in terms of, like, what’s our interview criteria?” (P03), explaining that some candidates who sent a program signal were ultimately not ranked by the program.
Theme 2. Signals were perceived as reducing application volume and burden. As one respondent replied, “The problem folks were trying to solve was trying to identify . . . true interest . . . as opposed to being just part of the application bloat and the wide net that applicants are casting” (P19). Most respondents noted stable or decreasing application numbers, with one expressing relief about applicant quality: “We had perhaps a slightly decreased number of applicants this year, and despite that the quality of the top applicants was exactly the same” (P19). As a result, respondents were hopeful that signals would not only reduce application numbers but decrease the burden of application review, as one respondent described: “I think the interview process is so manpower intensive. It’s a huge number of hours during interview season. It’s always very stressful for our faculty. So, I think if this is developed more it could eventually be used to sort of reduce the number of hours that we’re spending interviewing and recruiting people who are really not interested in coming to us” (P15).
Theme 3. Program signals did not impact diversity and equity, but geographic preferences may benefit community health. Among applicants who sent a program signal, respondents described searching for those who had a clear match with the mission of the program: “And so there’s a lot of them you can’t read anything into. But sometimes you get those kind of cues that somebody has values that align” (P11). In contrast, respondents more frequently appreciated the use of geographic and setting preferences to offer interviews to applicants with local ties, especially when a free-text response explained the applicant’s reasoning in more detail. As one respondent noted, “When people have roots to the area, they just tend to excel better in residency because they have a support system, or they have family nearby, or their partners with them, and they can cope through, like, the harder situations” (P13). Finally, respondents were skeptical that program signals would address larger workforce concerns, adding, “The problem that we are trying to solve is diversity and getting underrepresented folks into our residency. . . . Signaling is not gonna solve that problem for us. But that’s the problem that at least I was trying to solve” (P05).
Theme 4. Modifications to the signals and preferences are recommended to optimize use in primary care specialties. Respondents largely favored a lower number of signals to identify genuine interest, noting, “You’ve only got five of them, and you use one on me. That means something to me” (P18). Some program directors noted that program signaling might increase the burden on primary care programs when excellent candidates apply to family medicine as a backup specialty, consume scarce interview slots, and then match into another specialty, requiring programs to fill positions through the Supplemental Offer and Acceptance Program (SOAP): “I really worried about the fact that somehow we missed in their application the fact that we used a lot of effort trying to identify red flags and not waste that really valuable interview time resource on folks that really aren’t committed to family medicine” (P19). Suggestions for improvement included eliminating multispecialty signaling (“I would say you need to designate a primary specialty of interest, and you only get the signals for that specialty” [P08]) and capping total signals across specialties (“One of the things that I would love to see is that there is a maximum number of signals, regardless of specialty” [P09]).
We asked program faculty to reflect on their implementation strategies and experiences with the signaling and preference programs and to offer key recommendations for stakeholders (Table 3). Respondents generally agreed that most US medical school candidates will match into a top choice. Some respondents suggested that advisors counsel applicants to use signals for reach programs rather than for programs already familiar with the candidate through local ties: “Try to use it if you need to establish some foot in the door or somewhere out there” (P18). Respondents desired dissemination of best practices within the specialty, including consensus guidance on whether applicants should signal their home programs. Finally, respondents called upon the AAMC to adjust the signaling programs for all specialties. Respondents requested a free-text box for applicants to explain why they were signaling a program and consideration of interventions to reduce the negative impacts of multispecialty signaling on the primary care workforce.
Table 3: Respondent Recommendations for Stakeholders

- Applicant and advisor: “In family medicine, most students, unless they have real pink or red flags, are going to get their top three choices, so it didn’t distinguish that many for us.” (P02) “So I would tell them to honestly just use the signals to the places that they think are their best options, and be honest with their geographic and urban/rural thing—actually reflect where they want to go.” (P17) “I think, to have everybody on the same page, whatever page it is, is the most important thing. But I also think just intuitively, of course, you are likely interested in your home program and if you go to the effort of scheduling a rotation somewhere, you are obviously interested there. So, to me, the most intuitive thing is to not signal those places, but to consider those things essentially equivalent to signals.” (P20) “Spend your five signals on the five that you really care about. If you really wanted to play the game, don’t spend those signals on the ones right next to you that you’re gonna get an interview with anyway. . . . Try to use it if you need to establish some foot in the door or somewhere out there.” (P18)
- AAMC: “I would like to see . . . the AAMC coming out with recommendations because I think that would maybe have a little bit more power. They, of course, couldn’t dictate that that’s how they needed to be used, but perhaps recommending a common strategy amongst all specialties in that regard, I think, would be helpful, so that there’s less confusion.” (P20) “You mentioned that adding a little comment after, why you picked it. That’d be super helpful. I feel like those little spots where they’re able to add a little detail as to why they pick something or, like, in their impact statement, I think, was a nice thing, too. Just to try to [add] another differentiating factor between people. I think that would be super helpful.” (P18)
- Family medicine programs: “It’s hard, because I think anytime we make a change, like, it’s always gonna feel abrupt and like a lot. And it’s a dynamic process. Maybe just having more clarity in messaging or resources for how they intend the programs to use it. Because I think it’s hard if the students have one idea about what this means or like, ‘Oh, should I signal my program? Or is that a waste of a signal?’ All these things that add stress that I think could be clarified. There could have been some, maybe more blanketed messaging about things so that everyone’s more on the same page.” (P18)
|
Discussion
This qualitative study of family medicine program faculty in the Midwest United States demonstrated largely positive experiences with both program signals and geographic and setting preferences during the first year of their implementation and identified four themes that explain those experiences. Program faculty implemented signals and preferences strategically to complement their existing application review strategies. They perceived program signals as promising for reducing application volume and burden. Perhaps uniquely for primary care, program faculty prioritized fit with program mission and community health, and they expressed skepticism that program signals would help with these critical primary care workforce challenges. However, they identified free-text comments on geographic and setting preferences as an underutilized tool that could also be applied to program signals, and they challenged the AAMC to address negative unintended consequences of multispecialty signaling on primary care.
Our findings align with prior studies showing general program director satisfaction with program signaling among early-adopter specialties such as otolaryngology and orthopedic surgery.17,18 Our respondents also had largely positive views of geographic and setting preferences, aligning with a previous study showing a greater likelihood of interview offers for applicants who signaled and also had a geographic connection.19 Given the dearth of qualitative studies on this topic, our results extend what is known by describing specific strategies used to implement these programs. Best practices identified by respondents included integrating signals and preferences into existing holistic review rubrics and using signals to stratify lower-tier and wait-listed applicants for interview selection.
Respondents perceived program signals as reducing application volume and burden, but more data are needed to quantify the impact of program signaling and to determine whether additional strategies are needed to combat application bloat in primary care specialties. While some specialties have embraced larger numbers of signals or tiered signals as an alternative to application caps, such strategies may not be perceived as necessary for family medicine.20 Preliminary data from the first year of signal implementation in family medicine (2023–2024) demonstrated a 19% average decrease in applicants per program and a 14% decrease in the average number of programs to which each applicant applied (from 55 to 47 programs).21 Further innovation in artificial intelligence technology and tools to filter applications via the AAMC collaboration with Thalamus may assist program directors in further identifying candidates who best match their program mission.22
Our results identified concerns with multispecialty signaling and offered suggestions for improvement. While program signals are intended to identify genuine applicant interest, interviewees noted that it was nearly impossible to discern whether applicants preferred family medicine, and they expressed surprise when applicants matched into another specialty. To strengthen the primary care physician workforce, our results support further investigation into the impact of multispecialty signaling and modeling of possible changes, such as a total cap on signals regardless of specialty or limiting signals to a single specialty. This process will become more complex as obstetrics and gynecology replaces its participation in the AAMC Electronic Residency Application Service (ERAS) with an independent application platform, the Residency Centralized Application Service (ResidencyCAS).23
One strength of this study is that our qualitative data provide in-depth insights that are missing from the existing published evaluation data from the AAMC.21 We limited this study to Midwest states (aligned with the AAMC North Central geographic area) to provide data supporting the Family Medicine Midwest community of practice in building a regional family medicine workforce. This limited the total number of potential participants. We did not ask reasons for nonparticipation among those who declined an interview. Future studies may consider broader inclusion criteria, expanding to family medicine residency programs in other geographic regions and exploring program director rationale for nonparticipation. We enhanced the rigor of the study through purposeful sampling to obtain a wide variety of residency program demographics, but our results may not represent the views of all program faculty in other geographic areas. To rapidly disseminate findings, we conducted most interviews before the 2024 match, limiting our ability to correlate our data with match results. Further quantitative and qualitative evaluation of these programs and their impact on the specialty of family medicine after subsequent interview cycles will add insights about best practices in implementation and modification. Finally, while we limited our focus to faculty, the experiences and perspectives of student applicants, advisors, and program coordinators are crucial considerations for policy suggestions.
Conclusions
Program faculty strategically implemented preference signals into holistic review to reduce application review burden. Further modification of the program signal and geographic and setting preference programs should support the unique experiences of primary care residency programs and the need for primary care physician workforce development. Future research should focus on refining the signaling programs and on their impact on match outcomes and SOAP participation rates.
Presentations
This work was presented at the Family Medicine Midwest Conference, Naperville, Illinois, September 29 to October 1, 2023; and the 2024 Society of Teachers of Family Medicine Spring Conference, Los Angeles, California, May 3–8, 2024.
Conflict Disclosure
Dr Oshman reports stock holdings in Procter and Gamble, Johnson and Johnson, Merck, AbbVie, Abbott, and DuPont, outside the submitted work.
Acknowledgments
The authors thank the Family Medicine Midwest Scholarly Activity Collaborative. Funding and support were provided by an American Board of Family Medicine Foundation Family Medicine Residency Learning Community Planning Grant.
References
1. Carmody JB, Rosman IS, Carlson JC. Application fever: reviewing the causes, costs, and cures for residency application inflation. Cureus. 2021;13(3):e13804. doi:10.7759/cureus.13804
5. Benjamin WJ, Lenze NR, Bohm LA, et al. Evaluating the impact of the novel geographic preferences section on interview rate and residency match outcomes. J Gen Intern Med. 2024;39(3):359-365. doi:10.1007/s11606-023-08342-w
6. Mun F, Suresh KV, Li TP, Aiyer AA, LaPorte DM. Preference signaling for orthopaedic surgery applicants: a survey of residency program directors. J Am Acad Orthop Surg. 2022;30(23):1140-1145. doi:10.5435/JAAOS-D-22-00478
10. Fagan EB, Finnegan SC, Bazemore AW, Gibbons CB, Petterson SM. Migration after family medicine residency: 56% of graduates practice within 100 miles of training. Am Fam Physician. 2013;88(10):704.
11. Irwin G, Nilsen K, Rohrberg T, Nilsen K, Moore MA. Use of signaling in family medicine residency interviewing. Fam Med. 2024;56(6):381-386. doi:10.22454/FamMed.2024.678799
12. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349-357. doi:10.1093/intqhc/mzm042
13. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm Policy Ment Health. 2015;42(5):533-544. doi:10.1007/s10488-013-0528-y
14. Glasgow RE, Harden SM, Gaglio B, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64. doi:10.3389/fpubh.2019.00064
16. Patton MQ. Qualitative Evaluation and Research Methods. 2nd ed. Sage; 1990.
17. Guthrie ST, Dagher T, Essey-Stapleton J, Balach T. Preference signaling in the orthopaedic surgery match: applicant and residency program attitudes, behaviors, and outcomes. JBJS Open Access. 2024;9(2):e23.00146. doi:10.2106/JBJS.OA.23.00146
18. Pletcher SD, Chang CWD, Thorne MC, Malekzadeh S. The otolaryngology residency program preference signaling experience. Acad Med. 2022;97(5):664-668. doi:10.1097/ACM.0000000000004441
19. Benjamin WJ, Lenze NR, Bohm LA, et al. Impact of applicants’ characteristics and geographic connections to residency programs on preference signaling outcomes in the match. Acad Med. 2024;99(4):437-444. doi:10.1097/ACM.0000000000005551
20. Catalanotti JS, Abraham R, Choe JH, et al. Rethinking the internal medicine residency application process to prioritize the public good: a consensus statement of the Alliance for Academic Internal Medicine. Am J Med. 2024;137(3):284-289. doi:10.1016/j.amjmed.2023.11.021