
History of American Higher Education

Pursuing the College Degree

In the United States, college degrees come from many sources, each with a different perspective on the nature and function of the degree. According to the Random House Dictionary, a college degree is generally “an academic title conferred by universities and colleges as an indication of the completion of a course of study, or as an honorary recognition of achievement.” The concept of post-secondary education is broader than the two- or four-year institutions typically thought of as “college.” Higher education includes all the trade, vocational, and career institutes, as well as the academic college and university programs offered by thousands of institutions nationwide. But it is the college degree, that ennobled document conferred by America’s degree-granting post-secondary institutions, that retains certain airs of rights and expectations, especially as the student proceeds through the hierarchy of bachelor’s, master’s, and doctoral degrees.

As a result of all the “rights and responsibilities” associated with these higher levels of education, and because of the American ideal of upward economic mobility, the college degree continues to be sought in ever-increasing numbers across every demographic. According to the National Center for Education Statistics, 17.5 million people enrolled in degree-granting institutions in 2005, and 20.6 million are projected to enroll in 2016. The total number of degrees conferred in 2006 amounted to more than 2.9 million; by 2017 that number is expected to reach nearly 3.5 million (National Center for Education Statistics [NCES] 2008). But access to higher education has not always been so easy, nor was it originally sought for the same reasons it is today.

The Early Years

Harvard University is the oldest institution of higher education in the United States. It was founded in 1636, a mere sixteen years after the Mayflower landed at Cape Cod in present-day Massachusetts (Archibald 2002). By the time of the Revolutionary War, nine chartered degree-granting colleges had been established in the colonies, which is remarkable considering that the vastly wealthier and more populous mother country, England, had only Cambridge and Oxford to educate her students (Trow 1988). The colonial colleges—Harvard, William and Mary, the Collegiate School (which became Yale), the Academy of Philadelphia (University of Pennsylvania), the College of New Jersey (Princeton), King’s College (Columbia), the College of Rhode Island (Brown), Queen’s College (Rutgers), and Dartmouth—were, however, modeled upon Cambridge and Oxford and, like their English models, in many cases required religious affiliation. Transplanted Puritan, Presbyterian, and Baptist sects variously exercised control over specific schools, while William and Mary and King’s College were primarily under the auspices of the Church of England.

The mission and administration of these colleges directed students toward spiritual studies “in line with the spirit of [the] religious tradition” that accompanied colonial America’s early years (Brickman 1972). At the time, however, a college education was fairly exclusive: without financing from England, the costs of operating a college made the price of an education prohibitive for most people. America was nevertheless already providing an array of options to a specific demographic of wealthy white men, most of whom were interested in becoming members of the clergy (Archibald 2002). Consequently, enrollment at these colleges remained small through the Revolutionary War. But those men who were educated there were among the preeminent leaders in both religious and political arenas, as well as among the new generation of educators in America.

Jefferson’s Vision and the Morrill Land-Grant Act

Thomas Jefferson was among the earliest proponents of state education in America (from primary school through college) based on scientific exploration as a pursuit wholly distinct from religious teachings and indoctrination. Though many of his ideas would not take hold until after the Civil War, the American branch of the Enlightenment gained ground in the eighteenth century and, along with the early movement toward a state university system, laid the foundation for secularism and expanded human rights in America, though mostly for white males (Brickman 1972). Beyond the practical, scientific method suggested in Jefferson’s ideal system of higher education, he also championed “the lecture method, the elective system,” and freedom from religious affiliation, all of which would be adopted by the emerging network of colleges across the expanding United States. At the center of his philosophy was the belief “that education should reinforce republican politics by teaching citizens and leaders their rights and responsibilities” (Addis 2003).

Jefferson even advocated a centralized university at the top of a pyramid-shaped educational system that would promote a natural aristocracy. His ideas first took shape in his Bill for the More General Diffusion of Knowledge in 1779 and cropped up again in later incarnations. The chartering of his University of Virginia in 1819 and its eventual opening in 1825 realized some of these ideas, but Virginia, a largely disenfranchised state in which roughly forty percent of the population was enslaved, was hardly an ideal place for that vision to reach a broad population, and sectionalism and religious opposition dominated the political arena. A key bill passed during the Civil War proved the critical factor in the reemergence of Jeffersonian ideals during Reconstruction (Addis 2003).

Mark R. Nemec identifies a number of forces that combined to support the “emergence of the American university [and] the expansion of the American national state,” but the origins lie in the Morrill Land-Grant Act of 1862 (Nemec 2006). The vision of Vermont Representative Justin Morrill to establish agricultural colleges needed the secession of the southern states and the help of Ohio Senator Benjamin F. Wade to finally pass and be signed into law by President Lincoln. With frequent extensions to the law, 69 colleges were eventually established, a few with private support and many not strictly for agricultural purposes (Archibald 2002). The Morrill Act incited “the coordination and entrepreneurship that would be essential for the formation of research universities,” and it laid the foundations for the rapid growth of American higher education. Existing institutions expanded their programs, often into areas of science and technology, building new colleges and disciplines into already preeminent universities. Beyond the agricultural colleges and expansions, revenue generated by the land grants combined with existing federal revenue and private endowments, such as that of philanthropist Ezra Cornell, both to establish flagship state universities and to support private universities (Nemec 2006).

A weak economy at the turn of the twentieth century and through World War I was financially crippling for some colleges and universities, and the period saw a decline in the number of institutions. The modern research university nevertheless took shape during these years, in large part due to the work of men such as the outspoken Ralph Waldo Emerson and his friend Charles W. Eliot. Eliot, president of Harvard for forty years beginning in 1869, was a “towering liberal humanist” who followed in Jefferson’s footsteps by helping foster an elective system in education and by grounding the aristocracy in merit and the “competitive excellence” of earning a degree through higher education. Democracy, in other words, fostered an aristocracy based on talent and merit: the Jeffersonian meritocracy. Following that idea, Eliot pushed for college entrance examinations as a basis for admissions (Newfield 2003). Such thinking made high school a prerequisite for higher education, and the college degree in turn became a credential increasingly sought by employers (Lazerson 1998). Post-war prosperity and a fresh perspective on higher education caused college attendance to nearly double between 1920 and 1930 (Archibald 2002), while degrees conferred grew at an even faster rate, from 53,000 to nearly 140,000 (NCES 2008). The next great leap in college enrollment and degrees conferred came when members of the armed forces returned home from World War II.

The GI Bill and America’s Golden Age of Higher Education

Taking advantage of the education provisions of the Servicemen’s Readjustment Act of 1944 (the GI Bill), a large number of World War II veterans enrolled in colleges and earned degrees with deferred compensation benefits from the United States government. The GI Bill provided a grant to cover the total cost of a full-time education for as many as three years, and some 4.4 million of the 15 million participating veterans went to college (Archibald 2002). Large numbers of people who would not previously have considered higher education entered college as it became the “licensing agency for [middle class] Americans who wanted to enter the professions” (Lazerson 1998).

The Journal of Blacks in Higher Education has pointed out that the GI Bill unfortunately widened the racial gap, since opportunities were far fewer for blacks in a post-World War II United States still governed by segregation laws, particularly in the South (2003). More recently, though, the Journal reports that with America’s commitment to diversifying the workforce, obtaining college degrees has helped blacks narrow the economic gap with whites (2004). Women, however, remain disproportionately lower paid, even as the number of women earning college degrees has surpassed or is projected to surpass the number of men at every degree level (NCES 2008).

It was the surging optimism about higher education in the first half of the twentieth century that gave the college degree its association with upward mobility, an association that persists to the present day. Moreover, a number of important federal programs spurred enrollment through grants and student loans. One of the most notable, the National Defense Education Act of 1958, grew out of the government’s belief that the country needed more Americans with degrees in science and engineering, a fear that took hold when the Soviet Union launched Sputnik the year before (Archibald 2002).

In spite of early funding and optimism, the economic return measured against the cost of college peaked in the 1970s and 1980s, and in the last two decades inflation has carried the cost of higher education to seemingly prohibitive levels. Yet parents and students continue to sacrifice to enroll in college, often incurring unwieldy debt through complex student loans. Demand has kept the industry growing: “between 1950 and 1990, the number of colleges and universities almost doubled, from 1,851 to 3,535,” and state and federal spending on higher education has soared. Higher education is, simply put, perhaps the single “most successful industr[y] of postwar America” (Lazerson 1998). Status, students, and optimism fueled the drive to expand resources and facilities and build stronger institutions during this “golden age” of American colleges and universities.

The Twenty-First Century Campus

While distance education is not a new idea, the Internet has helped revolutionize the industry and has corrected some of the shortcomings of the correspondence programs that were popular early in the twentieth century. By collaborating with private contractors and commercial information technology businesses, such as America Online and Onlinelearning.net, universities are attempting to develop a quality online distance-education experience, though David F. Noble suggests that the lack of true interpersonal (not just interactive) communication has already produced high dropout rates (2002). Even so, enrollment at the nation’s single largest campus, the online campus of the University of Phoenix, is more than twice that of the next largest schools, such as Miami-Dade College in Florida and Arizona State University in Tempe. And while complete degree programs are available online, programs and professors also use the Internet extensively as an additional instructional tool outside the classroom.

The idea has its roots in the University of California, Los Angeles (UCLA) launch of its “Instructional Enhancement Initiative” in 1997, which required a number of programs to maintain Web sites. This was the first instance of mandatory “computer telecommunications technology in the delivery of higher education” (Noble 2002). Efforts to digitize the world’s vast archives have been spearheaded and supported by universities, and the result is a seemingly all-inclusive online network of the world’s knowledge. That network of libraries and universities has made information technology a central player in the classroom, to the extent that professors find themselves outlining research expectations in their syllabi, given the temptation of the “Google search” method of acquiring information. In the meantime, degrees have become available from an ever-increasing number of sources, both physical and virtual.

Concern over the “automation of higher education” may have been somewhat tempered by improvements in the technology used in online courses, as well as by the commitment of educators and administrators to continue pursuing high-quality research and teaching on their campuses. An early advocate of keeping public funding close to the campus in the new digital age was former University of Utah president J. Bernard Machen. In his 1998 inaugural address, he stated that it is on the university campus that students gain “the broader, more interactive” experience in which “[s]pontaneous debate, discussion, and exchange of ideas… are essential in developing the mind” (Noble 2002). To be sure, the nation’s campuses continue to lure students seeking college degrees, and, taken all together, the number of students relying upon online sources for their degrees pales in comparison.

Today, America’s colleges and universities are defined by their multiplicity and diversity, and an insistence on equal opportunity (whatever its successes) abounds. The reason is that the American system has something to offer virtually everyone, to some extent now “without having to show evidence of academic talent or qualifications” (Trow 1988). This accessibility has colonial roots, despite the extreme exclusivity of the early years of American higher education. That early evolution laid the foundations for a system that gradually emerged and distinguished itself from virtually every other system in the world. Indeed, when Shanghai Jiao Tong University’s Institute of Higher Education releases its annual Academic Ranking of World Universities, as many as 60 American universities appear in the top 100.

-- Posted April 29, 2008

References

"Academic Ranking of World Universities." 2007. Shanghai Jiao Tong University?s Institute of Higher Education. Accessed: April 1, 2008.

Addis, Cameron. 2003. Jefferson’s Vision for Education, 1760-1845. New York, NY: Peter Lang Publishing, Inc.

Archibald, Robert B. 2002. Redesigning the Financial Aid System: Why Colleges and Universities Should Switch Roles with the Federal Government. Baltimore, MD: The Johns Hopkins University Press.

Brickman, William W. "American Higher Education in Historical Perspective." Annals of the American Academy of Political and Social Science, Vol. 404, American Higher Education: Prospects and Choices. (November 1972): 31-43.

Webster’s New World College Dictionary, 4th ed., s.v. "degree."

"How the GI Bill Widened the Racial Higher Education Gap." The Journal of Blacks in Higher Education, No. 41. (Autumn 2003): 36-37.

Lazerson, Marvin. "The Disappointments of Success: Higher Education after World War II." Annals of the American Academy of Political and Social Science, Vol. 559, The Changing Educational Quality of the Workforce. (September 1998): 64-76.

National Center for Education Statistics (NCES). 2008. Digest of Education Statistics. Washington, DC: Department of Education. Accessed: March 30 and April 2, 2008.

Nemec, Mark R. 2006. Ivory Towers and Nationalist Minds: Universities, Leadership, and the Development of the American State. Ann Arbor, MI: The University of Michigan Press.

Newfield, Christopher. 2003. Ivy and Industry: Business and the Making of the American University, 1890-1980. Durham, NC: Duke University Press.

Noble, David F. 2002. Digital Diploma Mills: The Automation of Higher Education. New York, NY: Monthly Review Press.

"Possession of a Four-Year College Degree Brings Blacks Close to Economic Parity with Whites." The Journal of Blacks in Higher Education, No. 44. (Summer 2004): 24-26.

Trow, Martin. "American Higher Education: Past, Present, and Future." Educational Researcher, Vol. 17, No. 3, (April 1988): 13-23.