2018-05-21

David F. Labaree

By every major performance indicator, America's universities lead the world. Unfortunately for those who seek to emulate the US system, the American model reflects unique historical, political, and financial factors.

In the second half of the twentieth century, American universities and colleges emerged as dominant players in the global ecology of higher education, a dominance that continues to this day. In terms of the number of Nobel laureates produced, eight of the world’s top ten universities are in the United States. Forty-two of the world’s 50 largest university endowments are in America. And, when ranked by research output, 15 of the top 20 institutions are based in the US.

Given these metrics, few can dispute that the American model of higher education is the world’s most successful. The question is why, and whether the US approach can be exported.

While America’s oldest universities date to the seventeenth and eighteenth centuries, the American system of higher education took shape in the early nineteenth century, under conditions in which the market was strong, the state was weak, and the church was divided. The “university” concept first arose in medieval Europe, with the strong support of monarchs and the Catholic Church. But in the US, with the exception of American military academies, the federal government never succeeded in establishing a system of higher education, and states were too poor to provide much support for colleges within their borders.

In these circumstances, early US colleges were nonprofit corporations that had state charters but little government money. Instead, they relied on student tuition, as well as donations from local elites, most of whom were more interested in how a college would increase the value of their adjoining property than in supporting education.

As a result, most US colleges were built on the frontier rather than in cities; the institutions were used to attract settlers to buy land. In this way, the first college towns were the equivalent of today’s golf-course developments – verdant enclaves that promised a better quality of life. At the same time, religious denominations competed to sponsor colleges in order to plant their own flags in new territories.

What this competition produced was a series of small, rural, and underfunded colleges led by administrators who had to learn to survive in a highly competitive environment in which supply long preceded demand. As a result, schools were positioned to capitalize on the modest advantages they did have. Most were highly accessible (there was one in nearly every town), inexpensive (competition kept a lid on tuition), and geographically specific (colleges often became avatars for towns whose names they took). By 1880, there were five times as many colleges and universities in the US as in all of Europe.

The unintended consequence of this early saturation was a radically decentralized system of higher education that fostered a high degree of autonomy. The college president, though usually a clergyman, was in effect the CEO of a struggling enterprise that needed to attract and retain students and donors. Although university presidents often begged for, and occasionally received, state money, government funding was neither sizeable nor reliable.


In the absence of financial security, these educational CEOs had to hustle. They were good at building long-term relationships with local notables and tuition-paying students. Once states began opening public colleges in the mid-nineteenth century, the new institutions adapted to the existing system. State funding was still insufficient, so leaders of public colleges needed to attract tuition from students and donations from graduates.

By the start of the twentieth century, when enrollments began to climb in response to a growing demand for white-collar workers, the mixed public-private system was set to expand. Local autonomy gave institutions the freedom to establish a brand in the marketplace, and in the absence of strong state control, university leaders positioned their institutions to pursue opportunities and adapt to changing conditions. As funding for research grew after World War II, college administrators started competing vigorously for these new sources of support.

By the middle of the twentieth century, the US system of higher education had reached maturity, as colleges capitalized on decentralized and autonomous governance structures to take advantage of the opportunities for growth that arose during the Cold War. Colleges were able to leverage the public support they had developed during the long lean years, when a university degree was highly accessible and cheap. With the exception of the oldest New England colleges – the “Ivies” – American universities never developed the elitist aura of Old World institutions like Oxford and Cambridge. Instead, they retained a populist ethos – embodied in football, fraternities, and flexible academic standards – that continues to serve them well politically.

So, can other systems of higher learning adapt the US model of educational excellence to local conditions? The answer is straightforward: no.

In the twenty-first century, it is not possible for colleges to emerge with the same degree of autonomy that American colleges enjoyed some 200 years ago, before the development of a strong nation-state. Today, most non-American institutions are wholly owned subsidiaries of the state; governments set priorities, and administrators pursue them in a top-down manner. By contrast, American universities have retained a spirit of independence, and faculty are often given latitude to channel entrepreneurial ideas into new programs, institutes, schools, and research. This bottom-up structure makes the US system of higher education costly, consumer-driven, and deeply stratified. But it is also what gives it its global edge.

* David F. Labaree is Professor of Education at Stanford University.
