Does America have enough physicians for its population? Are current physicians trained in the specialties and practicing in the locations that Americans need? The Association of American Medical Colleges (AAMC) believes that there are too few physicians and advocates increasing the number of residency positions in the United States.1 The AAMC notes the increased number of physicians graduating from US medical schools, a result of the creation of new schools and larger class sizes at existing schools. It correctly observes that if the number of residency slots is not also increased, we will not actually get more practicing doctors, because these graduates will simply fill residency positions currently filled by international medical graduates (IMGs). However, the AAMC does not take a position on what specialties those new physicians should enter or where they should practice.
In an important 2012 study, researchers at the Robert Graham Center (RGC) of the American Academy of Family Physicians (AAFP) estimated that by 2025 we would need an additional 52,000 primary care physicians, based on three factors: population growth, population aging, and increased insurance coverage (mainly via the Affordable Care Act [ACA]).2 Additional data available in the RGC’s extensive Chartbook on Primary Care in the United States clearly demonstrate that current physicians are not appropriately distributed by specialty or geography.3 We need more primary care doctors everywhere, but especially in rural areas, where about 20% of the US population lives4 but fewer than 10% of doctors practice.5
Why do the specialties chosen by US doctors not match community needs? Primary care physicians, especially family physicians, provide care for all members of a community, including smaller communities, whereas subspecialists require a larger population to support their practices and often require specialized equipment. Thus, subspecialists generally practice in larger metropolitan areas. On its face, this makes sense; a rational distribution of physicians would have enough family doctors to care for everyone, with the various subspecialists clustered in larger towns and cities, serving a referred population.
In the United States, however, the medical community is about two-thirds subspecialist and one-third primary care, an inversion of the ratio in most other developed countries, and practice location is overwhelmingly in large cities. Why? First, medical students mostly come from upper-middle-class families in the suburbs of major metropolitan areas, where schools offer more opportunities (because the districts have more money), so these students do better on standardized tests for admission to college and medical school. They are likely to want to live and practice in a major metropolitan area similar to the one where they grew up.
Second, most medical schools are located in major metropolitan areas, so even students from smaller towns get used to living in the city. In the academic referral centers where the students train, most of their mentors are subspecialists. Faculty also may overtly or subtly disparage primary care. Moreover, the percentage of medical school graduates entering primary care is artificially inflated by counting all those entering internal medicine residencies as “primary care,” although most (75% or more) enter subspecialties within 2-4 years after residency completion.6
The most important factor, however, is probably money. Subspecialists earn more than family doctors. This becomes even more important as students look to the future in the context of their educational debt. This income differential is the direct result of policy decisions to reimburse specialty work at higher rates, particularly for procedures. The primary care/subspecialist pay differential is much smaller in most other developed countries (eg, in Denmark, family physicians earn more than other specialists).7
Subspecialty procedures are lucrative for hospitals, which creates an incentive for those hospitals, the main sponsors of residency education, to invest in specialty residencies. Thus, students choose to enter specialties where they can make more money, and hospitals create residency positions in fields that financially benefit the hospital.8 In America, the health care system is not designed or staffed to produce a healthy population. It is designed to extract resources from the rest of the economy. The resulting collapse of the mental health and public health systems is on full display in the news every day.
In this issue of Family Medicine, Rittenhouse, Ament, and Grumbach demonstrate that “Sponsoring Institution Interests, Not National Needs, Shape Physician Workforce in the United States.”9 That is, given the opportunity to decide in which specialties to create new residency positions, hospitals (the main sponsors of residencies) choose those which benefit the hospital itself rather than those which create the new physicians that the broader community needs. The organizations that we entrust to produce physicians are operating in their own interest rather than those of society. And they do this with resources coming directly from taxpayers. Should we expect them to act differently?
In fact, the US health care system provides many motivations for them to act just this way. For example, in a rational health system one might expect that there are enough beds for the people who need them for any reason, and that hospitals would not unnecessarily duplicate services (eg, hospital A does great orthopedic and pediatric care, hospital B has cardiac and psychiatric excellence). In the United States, however, hospitals compete with one another. But they do not choose to compete in all service areas; they compete most aggressively for lucrative services. Every hospital wants to have cancer, heart, orthopedic, and neonatal intensive care patients (with insurance) because they are well reimbursed. There is little competition for poorly reimbursed services like primary care, psychiatry, or trauma.
One of the administrators interviewed by Dr Rittenhouse et al stated:
Yeah, in family medicine we need more residents, but I’m not going to pay to train residents for [competitors]. I mean, if they’re not going to stay [at our institution], in reality, I would just as soon shrink the program by half.
But if the new residents can make more money for the hospital, that is a different story:
And so the value of the orthopedic program is that it’s lucrative for the hospital… it’s just the derivative benefit of all that hospital surgical care.
But those “competitors” are the rest of the local community, and other communities in the region that do not have training programs and count on academic centers to produce the doctors that they need. To some degree, we have blurred the boundary between teaching and nonteaching hospitals and essentially privatized the production of the physician workforce. So if a hospital does not think it is recruiting the physicians it wants, it opens a training program to address its own needs, even if this duplicates programs in the same city. But, short of changing the entire health system, what might motivate these teaching hospitals to change their practices to produce more of the kinds of physicians America needs?