Introduction

In this chapter I provide a critical summary of the evolution of higher education in Europe, the US and Canada. The account, which begins with the medieval university as the foundation of Western systems of higher learning, is necessarily incomplete, given the expanse of time covered as well as the fact that Canada’s educational history is currently being “researched, revisited, and retold” (Craft, 2015, p. 190). My purpose is to argue that the academy has paradoxically been both a dominating and a liberating force since its inception, imposing conceptions of morality and truth that have shifted over time and itself behaving in unethical ways, while elevating its largely privileged graduates to positions of influence within society and advancing national aims.

Despite credos of truth telling and missions of character development, the integrity of Western higher education has been questionable since its founding. From medieval times through the seventeenth and eighteenth centuries, it was arguably an instrument of imperial ambition and colonial domination (by Church, monarchy and state). In the 1900s, science became the new religion, and its practice was pursued by some “at all costs”, with callous disregard for human rights and suffering.

Preceding my discussion of the development of higher education in Canada, I include a brief account of residential schools. This history sheds light on colonial attitudes towards the role of education in the 1800s. Consistent with recent calls for action resulting from Canada’s Truth and Reconciliation Commission (TRC), it is time for Canadian academics (beyond post-colonial theorists and Indigenous scholars, who have long laboured at this pursuit) to critically confront this history, examine higher education’s ethical shortcomings, and catalyze needed change.

In the section that follows, I consider the roots of Western society’s higher education institutions and identify ways in which integrity was understood, advanced and compromised.

The Medieval University

In The Rise of Universities, historian Charles Homer Haskins (1923) provided a fascinating account of Europe’s earliest institutions of higher learning, from the early Middle Ages, when only a very basic treatment of the “seven liberal arts” was provided: “grammar, rhetoric, logic” (the trivium), and “arithmetic, astronomy, geometry, and music” (the quadrivium) (p. 4). These early days—the Dark Ages—were followed in the twelfth century by “a great revival of learning” (p. 4)—an early Renaissance—“chiefly through the Arab scholars of Spain—the works of Aristotle, Euclid, Ptolemy, and the Greek physicians, the new arithmetic, and those texts of Roman law which had lain hidden…” (p. 5). The dissemination of these early works was catalyzed by the translation of texts originally written in Ancient Greek and Arabic into Latin.

The work of Aristotle (384–322 BC), it is said, “reopened the question of the relation between faith and reason” (McInerny & O’Callaghan, 2018). Aristotle saw “ethical virtues (justice, courage, temperance and so on) as complex rational, emotional and social skills” (Kraut, 2018). Building on “Plato’s central insight that moral thinking must be integrated with our emotions and appetites”, Aristotle proposed that ethical virtue was a state of being, developed during childhood and reinforced through law and threat of punishment, in order to keep destructive psychological forces and vices at bay (Kraut, 2018).

Haskins (1923) argued that this knowledge “burst the bonds of the cathedral and monastery schools” where religious instruction was provided by monks and nuns, and created instead the “learned professions” along with the academic guilds of Paris and Bologna, “our first and our best definition of a university, a society of masters and scholars” (p. 5). These early institutions were primarily for the sons of the wealthy, preparing them for key roles in society as part of the ruling elite, whether in education, medicine, law or the Church.

The Bologna Studium, created in Northern Italy around 1088, is credited as being the “oldest university in the Western world” (Bologna, n.d.). Founded through “the spontaneous and informal initiative of a few students” (Bologna, n.d.), Bologna became a “rich and powerful medieval metropolis, capable of attracting and accommodating hundreds of wealthy young people, who brought with them not only books and servants but also a substantial amount of money” (Bologna, n.d.). Studies focused largely on civil (Roman) law and also the arts, which later influenced the development of universities in Spain and Southern France (Haskins, 1923).

In Northern Europe, church control of higher education was more common, with cathedral schools (such as Notre-Dame) giving rise to the University of Paris, formed in about 1200. By 1231, the University of Paris had four faculties: arts, canon law (i.e., law laid down by the Christian Church), medicine and theology. While theology was considered “the supreme subject of medieval study” (Haskins, 1923, p. 19), few students reportedly elected to take it, as it was considered more difficult and the books more costly than other pursuits.

Haskins (1923) described the 1200s as “a bookish age, with great reverence for standard authorities, and its instruction followed closely the written word” (p. 28). This included “intensive study of the scriptures, Old and New Testament, and of the summary of Christian doctrine” (McInerny & O’Callaghan, 2018). Philosophy also had a place of significance. So revered was Aristotle’s work on logic that “it pervaded every subject…” (Haskins, 1923, p. 30), including his contributions on Ethics and Metaphysics: “the character traits that human beings need in order to live life at its best” (Kraut, 2018).

This was also the time of St. Thomas Aquinas (1225–1274), a Catholic priest who studied at the University of Paris before becoming a Master and occupying “one of the Dominican chairs in the Faculty of Theology” (McInerny & O’Callaghan, 2018). Aquinas interpreted and extended Aristotle’s contributions. As both a philosopher and theologian, Aquinas distinguished between earthly truths (common truths known to all, discovered through human reason, argumentative structure and practical and theoretical science) and spiritual ones (faith-based truths, revealed by God and Sacred Scripture). Aquinas defined the four cardinal virtues (prudence, justice, courage, and temperance) as well as the three theological virtues (faith, hope and charity).

Following Aquinas’ death, some of Aristotle’s and Aquinas’ ideas came to be viewed as heretical. In 1277, “the Bishop of Paris, Stephen Tempier, prohibited the teaching of 219 theological and philosophical theses that were being discussed and disputed in the faculty of arts under his jurisdiction” (Thijssen, 2018). With the advent of the “Tempier Condemnation”, anyone “teaching or listening to the listed errors would be excommunicated, unless they turned themselves in to the bishop or the chancellor within seven days, in which case the bishop would inflict proportionate penalties” (Thijssen, 2018). This was just one of “approximately sixteen lists of censured theses that were issued at the University of Paris during the thirteenth and fourteenth centuries” (Thijssen, 2018).

In contrast to the hierarchical governing structure imposed by the Church in Northern Europe, in Italy the earliest universities involved self-organizing student and faculty guilds. Typically away from home for the first time, students were in need of accommodation, food and other supplies. As there were initially no university buildings for classes or residences, they regularly dealt with locals who charged exorbitant rates. Working together, and through the threat of boycotting particular communities, the students were able to “fix the prices of lodgings and books” (Haskins, 1923, p. 9). They similarly threatened to boycott the faculty, should they not begin and end their lectures on time, cover the full curriculum, and refrain from absenteeism. Faculty were expected to leave a deposit if they planned on leaving town, to ensure their return. The guilds also levied fines on professors who failed to be interesting enough to secure an audience of five students per lecture (p. 10).

Haskins’ (1923) research into the sermons of the faculty and other documents (including letters home) provided further insight into life as a student in the thirteenth century:

Some who care only for the name of scholar and the income which they receive while attending the university, go to class but once or twice a week, choosing by preference the [afternoon] lectures on canon law, which leave them plenty of time for sleep in the morning. Many eat cakes when they ought to be at study, or go to sleep in the classrooms, spending the rest of their time drinking in taverns (Haskins, 1923, p. 64).

While the students may have been learning about prudence, temperance and charity from the lectures they attended, accounts of this time report on their general “debauchery”, including drunkenness, gambling, frequenting prostitutes and fighting in pubs and on the streets. Following several student deaths in “a town and gown altercation” (Haskins, 1923, p. 15), the Pope confirmed the authority of the Chancellor to oversee the conduct of students, privileging the wealthy students and exempting them from local laws.

Also in Italy, the professors (or masters) similarly formed guilds and later colleges, which provided a “license to teach (licentia docendi)” to graduates who were deemed to have successfully completed their examinations or “disputations”, a comprehensive final oral defense overseen by the chancellor (Haskins, 1923, p. 11). In contrast, in Paris, the Chancellor “alone had authority to license teaching in the diocese and thus kept his control over the granting of university degrees” (p. 14).

Accounts of these examinations provide evidence of some of the earliest instances of academic misconduct, implicating students, faculty and the chancellor alike! According to Haskins (1923), if a student were to fail, “he may be re-examined after a year, or it may be that, through the intercession of friends or by suitable gifts or services to the chancellor’s relatives or other examiners, the chancellor can be induced to change his decision” (p. 46). Students were also encouraged to “write home for more money and give a great feast for his professors; if he treats them well, he need not fear the outcome” (p. 70). Mid-course oral examinations were also subject to misconduct. Haskins observed that during the master’s quiz, “the shaky scholar falls back on his only hope, a place near one who promises to prompt him” (p. 74).

Professors experienced considerable pressure to conform to the “truths” of the day, whether through authoritative doctrine, to which they were expected to be faithful, or the “tyranny of colleagues” (Haskins, 1923, p. 50). According to Del Soldato (2020), “the nature of medieval universities was such that teaching was heavily controlled by authorities, and both metaphysics and theology exercised a strong influence, limiting the number of directions in which scientific theorization could advance.” Yet Haskins (1923) suggested that few faculty would have opposed such expectations: “Accepting the principle of authority as their starting point, men did not feel its limitations as we should feel them now…He is free who feels himself free” (pp. 55–56).

Within church-run institutions, the expectation of compliance was further reinforced with the advent of the inquisitions, the first of which began in 1231 (the Papal Inquisition). By Papal decree, those accused of heresy could be prosecuted and confessions extracted by torture. By the fourteenth century, even in Bologna, Church control was assured:

As time went by, students lost their autonomy, not only in their management bodies but also in city councils, suffering greater influence from local and papal authorities. Even the teachers, who in the meantime had formed the College of Doctors, had to accept the disciplinary measures imposed from above, and were subject to them even more from the following century, when they became public employees, who were paid with income from trade tariffs. (Bologna, n.d. b)

The Spanish Inquisition (1478–1834) was particularly torturous. This was followed by the Portuguese Inquisition (1536) and finally the Roman Inquisition (1540s), which established a permanent body—the Congregation of the Holy Office/Congregation for the Doctrine of the Faith—to oversee inquisitions throughout the world. For much of this time, Muslims, Jews and Protestant reformers were particular targets.

About one hundred years after the first inquisition, the Renaissance (1348–1648) got underway. This was an exciting time for academics, as the teachings of many more ancient philosophers and mathematicians were translated, scientific invention produced the telescope and microscope, and in 1450 the Gutenberg printing press provided an alternative to the lecture, allowing for the broad dissemination of scholarly ideas. Wealthy patrons supported—and thereby influenced—the work of those working in the arts and sciences, particularly when they perceived a national interest: “Renaissance lords and patrons often had a particular interest in scientific works and treatises, especially those devoted to subjects of military value” (Del Soldato, 2020).

Some questioned corruption within the Church and its universities, including financial and spiritual abuses by the clergy. This gave rise to the Protestant Reformation when, in 1517, Martin Luther, a professor of moral theology at the University of Wittenberg, Germany, is said to have posted to the door of All Saints’ Church a disputation—perhaps the first academic poster presentation—the Ninety-five Theses (n.d.), which he hoped would stir debate.

Luther challenged the practice of priests selling indulgence certificates (offering forgiveness of sins and reduced time in purgatory in exchange for money). The money was for a capital campaign—the building of St. Peter’s Basilica in Rome. Luther was essentially an early academic “whistleblower”, challenging the unethical way in which money was being raised. Underscoring the strength of the bonds between the university, Church and state, Luther was tried for heresy and excommunicated in 1521. Further, the Holy Roman Emperor declared him an outlaw, “making it a crime for anyone in Germany to give Luther food or shelter, and permitting anyone to kill Luther without any legal consequence” (Ninety-five Theses, n.d.).

The civil wars of France (1562–1598) were a direct result of growing conflict between Roman Catholics and the ever-increasing number of Protestant Reformers. Ultimately, the Edict of Nantes (1598) allowed religious toleration and freedom of conscience for France’s Protestants, the Huguenots (revoked in 1685). Other countries followed suit, with Roman Catholicism having to accept its shared standing. Yet, as the inquisitions continued to unfold, the Church’s scholars were called upon as expert witnesses against the accused, including at the noteworthy trial of Galileo Galilei (1564–1642). Peers judged Galileo’s academic assertions to be “foolish and absurd in philosophy”, “formally heretical” and “erroneous in faith” (Van Helden & Burr, n.d.).

As a result, Galileo, who today is recognized as the “hero of modern science”, was ordered by the Roman Catholic Church not to “teach or defend” the Copernican theory (Machamer, 2017). Galileo had challenged “Aristotelian categories” and offered in their place “a set of mechanical concepts”, which ultimately gave rise to the “scientific revolution” (Machamer, 2017). It took the Church almost 400 years to apologize for its treatment of Galileo (Resnik, n.d.).

Threatened by these growing challenges, the Roman Catholic Church underwent a renewal, a Counter Reformation, to reestablish its authority. Improving the quality of the priesthood became a primary strategic concern, as did increasing the number of worshippers. The Church thus embraced a new strategy of building seminaries (religious colleges) and sending trained missionaries around the globe. As part of this plan, it called on the Society of Jesus (Jesuits), a Catholic religious order which had been established at the University of Paris in 1534 by St. Ignatius of Loyola. According to the Jesuits of Canada (n.d.), Loyola had written, “It is according to our divine calling…to travel to various places and to live in any part of the world where there is hope of God’s greater service and the help of souls.” The first Jesuits arrived in what is now Canada in the early 1600s, with the aim of converting the Indigenous peoples.

England’s Early Universities: The Influence of the Church, Monarchy and Slave Trade

The earliest universities in England were similarly influenced by the Church, but also by the monarchy and the country’s wealthy merchants, many of whom made their fortunes in the transatlantic slave trade. Here, once again, integrity was both selectively advanced and compromised.

Oxford, the oldest university in the English-speaking world, was initially affiliated with the Roman Catholic Church, but this changed in 1535 when King Henry VIII broke from Papal authority and declared himself head of the Church of England. Instruction began at Oxford in 1096 and “developed rapidly from 1167, when Henry II banned English students from attending the University of Paris” (Oxford, n.d.).

Cambridge was created in the 1200s by faculty and students who were dissatisfied at Oxford. The University’s website describes a scholar’s life in Cambridge at the time:

In 1209, scholars taking refuge from hostile townsmen in Oxford migrated to Cambridge and settled there…King Henry III took the scholars under his protection as early as 1231 and arranged for them to be sheltered from exploitation by their landlords. At the same time he tried to ensure that they had a monopoly of teaching, by an order that only those enrolled under the tuition of a recognised master were to be allowed to remain in the town. (Cambridge, n.d. a)

Cambridge’s religious affiliation was central to the purpose of the institution: “Most of the scholars of the University were at first clerks or clergymen, in holy orders of some sort, and expecting careers in the Church or in the Civil Service (as diplomats, judges or officers of the royal household)” (Cambridge, n.d. b). Classes at Cambridge began in parish churches, but eventually the University acquired its own property and buildings, the first being for the Divinity School. The first endowed university teaching post, the Lady Margaret Professorship of Divinity, was funded by the mother of King Henry VII in 1502. Another professorship was similarly established at Oxford.

King Henry VIII had significant influence at both schools, issuing a series of injunctions in the mid-1500s that suppressed canon law and scholastic philosophy, favouring instead Greek and Latin classics, mathematics and Biblical studies. These subjects were supported by “Regius” professorships (via Royal patronage) in Civil Law, Divinity, Hebrew, Greek, and Physic (medicine).

Following King Henry’s reign, in 1555, under Catholic Queen Mary I, Roman Catholicism briefly reemerged. Three Anglican bishops were tried for heresy at St. Mary the Virgin, the Church of the University of Oxford, and were burnt at the stake on university grounds for failing to renounce their beliefs. Under Queen Elizabeth I, who reigned from 1558 to 1603, the Anglican Church was reestablished. By 1571, the Church of England’s key doctrine was finalized in the “Thirty-nine Articles” and incorporated into the Book of Common Prayer, to which all undergraduate students were expected to commit. Oxford’s motto remains Dominus Illuminatio Mea: ‘the Lord is my light’.

The Renaissance paved the way for the establishment of learned societies. The Royal Society, established in 1660, took as its motto Nullius in verba—take nobody's word for it. The motto expresses “the determination of [Royal Society] Fellows to withstand the domination of authority and to verify all statements by an appeal to facts determined by experiment” (Royal Society, n.d.). The Society held scholarly meetings and supported the publication of important work, including Philosophical Transactions (beginning in 1665), “the oldest continuously-published science journal in the world”, establishing “the important concepts of scientific priority and peer review” (Royal Society, n.d.).

It was also at this time that wealthy benefactors of British universities began to exert considerable influence, supporting student scholarships and capital projects. Tobias Rustat (1608–1694), an alumnus of Jesus College Cambridge, was noted by The Economist (2020) for his “generosity three centuries ago [which] allowed generations of orphans to go to Cambridge and be ordained as Church of England clergymen”. He was also a significant investor in the Royal African Company (RAC), chartered in 1672, which shipped “close to 150,000 enslaved Africans, mostly to the Caribbean” (The Economist, 2020). In this way, the university benefited financially from the enslavement of tens of thousands of Blacks, while helping elevate British orphans out of poverty and into the ministry.

In 2019, Cambridge acknowledged this history, announcing that it would be conducting a “two-year academic study of how much it benefited from the Atlantic slave trade and whether its scholars reinforced race-based thinking during Britain’s colonial era” (Reuters, 2019).

At Oxford, Christopher Codrington’s (1668–1710) endowment of the library of All Souls College has similarly come under scrutiny. Codrington’s wealth came from one of the largest “sugar plantations worked by slaves in Antigua and Barbados” (Race, 2020). In January 2021, Codrington’s name was removed from the library (but not his statue) and a plaque was installed, commemorating the slaves who had worked on the Codrington plantations (Shaw, 2021). The official statement by the College explained:

rather than seek to remove [the statue] the College will investigate further forms of memorialisation and contextualisation within the library, which will draw attention to the presence of enslaved people on the Codrington plantations, and will express the College’s abhorrence of slavery. (Shaw, 2021)

In May 2021, despite significant public pressure, Oxford’s Oriel College similarly decided not to remove the statue of Cecil Rhodes. A spokesperson stated, “We should learn from our past, rather than censoring history, and continue focusing on reducing inequality” (Race, 2021). In 1902 Rhodes became a benefactor to the College, endowing the prestigious international Rhodes Scholarship program. Rhodes is “considered one of the founders of South African racial segregation who made his fortune from exploiting African mines worked by slaves” (Reuters, 2019).

A Brief History of Higher Education in the United States

What happens in the United States can have a profound impact on life in Canada, including within the modern academy. Our early histories also have much in common. Accordingly, as we attempt to understand issues of integrity in the Canadian higher education context, it can be helpful to consider the evolution of higher education in the US as well. In this account, I highlight higher education’s roles in colonization and the subjugation of Indigenous Peoples, complicity with slavery, evolving views on philosophical thought, approaches to teaching morality and ethics, and the rising influence of positivism and the scientific method in shaping conceptions of truth. I also share how some US universities are responding to mounting evidence of their moral shortcomings.

Julie Reuben (1996) detailed the evolution of higher education in the US, suggesting that the idea that higher education should have both a “moral and intellectual” purpose was once commonplace (p. 11). Borrowing from the traditions of Oxford and Cambridge, colonists sought to replicate aspects of British life in the Americas, with some important differences.

In New England, Puritans were a major colonizing force. Opposed to the perceived “idolatry” of both Roman Catholic and Anglican worship, the Puritans focused on preaching scripture and salvation through Jesus Christ (Knapp, 1998, p. 112). Harvard University, the first higher education institution in what would become the United States, was founded by Puritans in New England in 1636, with the primary purpose of training clergy, who in turn were to advance “Christianity to the native peoples” (Knapp, 1998, p. 112). Named after its benefactor, the Reverend John Harvard, the university adopted its motto, “Veritas”, meaning “truth”, in 1643 (Ireland, 2015).

The Puritans established segregated “Praying Towns” where “any Indian religious idea or practices were viewed as pagan and had to be rooted out” (Knapp, 1998, p. 124). “White Christian hypocrisy” (p. 121) served as a barrier to the Puritans achieving their objective of conversion:

The most embarrassing obstacle to Indian conversion was the continued evidence of the hypocrisy which the Indians witnessed in other white ‘Christians.’ The brutality of the European settlers was a great impediment to the successful evangelization of the native population. (Knapp, 1998, p. 121)

In 1656, the Puritans established an “Indian College at Harvard” (Knapp, 1998). Initially housing classrooms and a dormitory (but no students), the building came to include a printing press, and in 1663 Harvard produced a Bible in the Algonquian language (Knapp, 1998, p. 123). In 1817, the proceeds from the sale of Isaac Royall’s “inherited land and slaves in Antigua and in Medford” were used “to fund Harvard Law School, the first law school in the United States” (Harris, 2020, p. 289).

Between Harvard’s founding and the mid-1800s, 289 institutions of higher learning were founded across the US, 240 of them private (Goldin & Katz, 1999), with many built on the backs of slave labour. One of the most notable for doing so was Yale. Founded in 1702, Yale College received its first endowed professorship (created in 1745)—the Livingstonian Professorship of Divinity—from Philip Livingston, who owned four slave ships, trading in people, sugar and tobacco from the West Indies and Africa. Yale later named a “prominent gateway in Branford College the ‘Livingston Gateway’” (Dugdale et al., 2002, p. 4). By 1830, Yale was the largest higher education institution in the US. As at Harvard, much of Yale’s growth was funded by the proceeds of slavery—providing faculty chairs, scholarships and support for the Library. Many prominent university and church leaders of the day were also slave owners, purchasing or inheriting both Indigenous peoples and Blacks (men, women and children).

The first scholarships at Yale were named after Bishop George Berkeley, who baptized his slaves, assuring colonists that doing so would not legally “bestow freedom” (Glasson, 2010). He also sought to convert Native American “savages”, peaceably if possible but otherwise by capturing and converting their children, taking young boys (up to ten years of age) to Bermuda to reeducate them, separating them from their families and customs, before returning them as missionaries. Yale honoured Berkeley’s “great missionary effort” as recently as 1999 (Hopson, 2021). Berkeley donated his plantation to Yale in 1732, which Charles Handy in turn leased from Yale for fifty years, creating a scholarship fund for top students studying Greek and Latin (Dugdale et al., 2002, p. 5).

Timothy Dwight (1752–1817), President of Yale and a Congregationalist minister and theologian, ardently defended slavery in the US. He held the Livingstonian Professorship of Divinity and, as President, taught senior students metaphysics and ethics. During his tenure, “Yale produced more pro-slavery clergy than any other college in the nation” (Dugdale et al., 2002, p. 12), considerably more than Princeton or Harvard.

In 1831, donors, law faculty and alumni from Yale vigorously and successfully opposed the establishment of a so-called “Negro college” in the New Haven community, voting with local townspeople to support a formal motion that “to educate the colored population is incompatible with the prosperity, if not the existence of the present institutions of learning, and will be destructive of the best interests of the city” (Dugdale et al., 2002, p. 17). Newspaper accounts further explained opposition to the proposed college, suggesting that it would have degraded the “town’s public morals” (p. 18) and upset Southern patrons.

One of Yale’s most infamous alumni was John Calhoun (mentored by Dwight), who joined Yale as a student in 1802, before returning to his family’s plantation in the South. Elected to Congress in 1811, Calhoun became a U.S. Vice President and Senator. An ardent advocate for maintaining slavery, he argued that the notion that “all men are born free and equal” was “utterly untrue” (Dugdale et al., 2002, p. 12). In 1933, Yale bestowed top honours on Calhoun, naming a residential college after him (a decision overturned in 2017). It was not until 1854 that the first US college for African Americans—Lincoln University—was established, in Pennsylvania. Twenty years later, the first Black student graduated from Yale.

Philosophy in the Age of Reason

The time in which higher education was being established in the Americas is, perhaps ironically, known as the Age of Enlightenment. While serving as a powerful colonizing force, the academy was considering questions of emancipation. Immanuel Kant (1724–1804) identified emancipation as “the process of undertaking to think for oneself, to employ and rely on one’s own intellectual capacities in determining what to believe and how to act” (Bristow, 2017).

By the early 1800s, philosophical frameworks such as Universalism and Utilitarianism also began to hold sway. Universalism was based on the idea that while people are autonomous, they have a duty to be self-aware and behave in ways that are consistent with morally sound, personal maxims that an individual would want to see embraced by everyone as universal moral laws, such as “thou shalt not steal”. Kant, the founder of universalism, advocated for the Categorical Imperative (CI): “an objective, rationally necessary and unconditional principle that we must always follow despite any natural desires or inclinations we may have to the contrary” (Johnson & Cureton, 2016). Embedded within the concept of universalism are responsibility and respect for others.

Utilitarianism, from Jeremy Bentham (1748–1832) and John Stuart Mill (1806–1873), in contrast, proposed that people should pursue the greatest good for the greatest number, or “the morally right action is the action that produces the most good” (Driver, 2014). From this point of view, a principle may be discarded (such as “thou shalt not steal”) for a greater good or utility (such as feeding the hungry). Furthermore, Utilitarianism recognized that such judgements are personal (perception-based) and variable; what is considered the greater good can vary by person and change over time.

Drawing on these philosophical frameworks, in 1842 and 1843 the Yale debating club considered the question, “does the greatest good of the greatest number, justify the continuance of slavery at the South?” (Dugdale et al., 2002, p. 23). Nathaniel Taylor, then President of Yale and head of the Yale Divinity School, argued for it; the students declined to vote. In 1848 the question was repeated. This time both the students and the President voted in the affirmative. Such questions were used to judge final “disputations”, assuring that graduates held “correct” moral beliefs.

Academics of the time subscribed to the pursuit of the “unity of truth” in which knowledge was seen to have a “moral dimension” (Reuben, 1996, p. 17). Accordingly, the primary purpose of higher education was understood to concern “educating young men to the highest efficiency of their intellectual faculties [emphasis added], and the noblest culture of their moral and religious nature” (p. 22). Offering a comprehensive curriculum, higher education institutions “aimed to train each faculty evenly and in relation to the others” (p. 22). Supporting the integration of these faculties—as at Yale—was a “senior year course in moral and mental philosophy…often taught by the college president” (pp. 22–23). Instruction was normative; “professors laid out students’ proper duties to themselves, their fellow humans, and God” (p. 23).

This was also a time in which the natural sciences, rationality, reason and empiricism came to be revered, mathematical laws began to replace religious edict, and doubt and skepticism replaced faith and superstition. This shift in focus occurred across decades and institutions, including within colleges affiliated with the Protestant church:

Christianity, or more specifically Protestantism, became synonymous with nonsectarian religion because of its conformity to science and its resonance with public sensibilities…Protestantism stood for the cause of freedom and the progress of human history. (Hart, 1999, p. 29)

The concept of “civil society” took on a new understanding at this time, incorporating the tenets of economic freedom and modern ethical theory; “As the processes of industrialization, urbanization, and dissemination of education advance in this period, happiness in this life, rather than union with God in the next, becomes the highest end for more and more people” (Bristow, 2017).

University Reform: The Rise of the Scientific Method and Declining Influence of the Church

The mid to late 1800s saw sustained efforts to transform higher education in the United States. Although “university reformers continued to view piety and moral discipline as one of the aims of higher education”, they also sought to “replace older, authoritarian methods with new ones” (Reuben, 1996, p. 12). Increasing demand for scientific discovery and national advancement through the industrial complex shaped higher education in the nineteenth century. As demand for specialized scientific and professional training grew, the curriculum became increasingly “saturated” (Reuben, 1996, p. 28), with “encyclopedic” knowledge valued over “mental discipline” (p. 62). Calls for educational reform came to be viewed as a matter of great national importance, in order to provide professors with the time and equipment needed for sophisticated scientific discovery, to “meet the demands of a modern, industrial society” (p. 61).

Frederick Barnard, who oversaw the transformation of Columbia College from an institution with just 100 undergraduate students studying a comprehensive curriculum, to Columbia University, with full graduate programs and a research agenda, observed that “university reform was unavoidable; if colleges did not change to meet social needs, they would die” (Reuben, 1996, p. 61). Later, Nicholas Butler, who served as president of Columbia from 1901 to 1945, emphasized the importance of higher education being in service to society; “The modern university, like the traditional college, was a servant of society, dedicated to its material and moral improvement” (p. 75).

By the late 1800s, “freedom from church control” became recognized as important for the further evolution of higher education (Reuben, 1996, p. 83). Johns Hopkins University was established in 1876 “on a nondenominational basis” (p. 84), as was Stanford in the late 1880s, and the University of Chicago in 1890. With this pronounced change, required courses in religious instruction, including moral philosophy and Christianity, began to disappear from the curriculum. Criticized as “too theological” (p. 89), too dogmatic, and incompatible with open scientific inquiry, such courses came to be seen as irrelevant to a modern curriculum; reformers sought philosophers who could inspire rather than preach, and “steer safely between iconoclasm and dogmatism” (p. 90). University presidents bemoaned the difficulty of finding faculty members who could teach philosophical and moral thought in the way that emerging sentiments demanded, seeking “professional philosophers” without church affiliation (p. 92). No longer part of the required curriculum, elective courses in “Elementary Ethics” and the “Philosophy of Religion” appeared in the late 1880s, in part to fill this gap (p. 93).

Universities founded by religious denominations (such as Harvard and Columbia) similarly moved away from daily, mandatory chapel attendance, instead valuing “free choice” and weekly multi-denominational services that emphasized “fundamental truths” across religions, as opposed to “denominational differences” (Reuben, 1996, p. 122).

By the early twentieth century, theology as an area of study had become increasingly marginalized; religion was viewed as “having no intellectual content” (Reuben, 1996, p. 113) and biblical scholars began “to tacitly accept the separation between the intellectual and spiritual” (p. 111). The Bible itself became viewed as “a work of literature and the ‘truths’ contained within it seen as ‘poetical’ rather than ‘scientific’ and ‘factual’” (p. 112). Demand for such programs also decreased, as students increasingly regarded programs in religious studies with “indifference” (p. 113).

Morality, Science and the Rise of “Student Life” Programs

Despite the decline of required religious and ethical instruction, in the early 1900s faculty were still expected to serve as ethical role models: upright moral conduct was treated “as an unquestioned requirement for the job” (Reuben, 1996, p. 194). Speaking in 1912, the president of Stanford opined, “teachers cannot escape responsibility for the moral and intellectual ideals of those under their charge” (Reuben, 1996, p. 194). While academic freedom supported free inquiry, presidents made clear that “moral turpitude” would not be tolerated (p. 195). Normative expectations for “appropriate scholarly presentation” were also enforced (p. 199): “Faculty could find themselves guilty of moral turpitude because they spoke in an unscholarly, undignified, or provocative manner” (p. 199). Faculty were in fact fired for speaking out about politically and morally sensitive matters, for appearing “disloyal” (to institution and founders) and for being publicly disruptive (p. 200).

As one example, in 1917 Henry Wadsworth Longfellow Dana was fired from Columbia for his “opposition to the draft during WWI” (Reuben, 1996, p. 200). Earlier, in 1900, Edward Ross was fired from Stanford for annoying benefactor Jane Stanford by “speaking out against Chinese immigration and the use of coolie [sic] labor” (Reuben, 1996, p. 196). Leland Stanford, Jane’s husband, had been president of the Central Pacific Railroad and had acquired his wealth via the efforts of thousands of Chinese railroad workers. While President Jordan was sympathetic to Ross’ views, he ultimately fired him for “dishonorable behaviour” to the institution. Loyalty to institutions, donors and polite discourse was clearly valued over personal conscience and social critique.

By the 1920s, science was positioned as the “new religion”, with scientific inquiry becoming associated with morality through its highly disciplined and objective approach; “The ongoing task of scientific investigation required seriousness, diligence, and zeal, which made the scientist’s vocation sacred” (Hart, 1999, p. 34). Logical positivism and the work of the Vienna Circle held metaphysics “not simply to be false, but to be cognitively empty and meaningless” (Uebel, 2020). Scientists were increasingly viewed as virtuous truth seekers, and the pursuit of knowledge was deemed “morally relevant because it could provide standards for individual behavior and social norms” (Reuben, 1996, p. 133). “Subjected to the powerful but indirect moral discipline of scientific training, students were expected to mature into strong, honest, useful men” (p. 136).

A clear hierarchy emerged on university campuses at this time, with disciplinary specialization and “pure” research being increasingly valued. Academics moved away from pursuing a unity of truth and the moral development of their students, and instead turned to ever more atomistic areas of disciplinary interest, including establishing numerous sub-disciplines. The scientific method became revered above all else, including within the social sciences. It was at this time that “philosophical” became synonymous with “unscientific” (Reuben, 1996, p. 186) and morality was increasingly viewed “as a matter of personal preference” (p. 188). Social science research, it was said, should be ethically neutral—descriptive not evaluative (p. 188), having “no political or ethical prejudices, no preferences, no convictions” (p. 191).

Students, however, were not equally enthused with the new direction, welcoming neither the narrowing focus on science nor the declining teaching quality they experienced as faculty dedicated increasing time to scientific inquiry. Faculty recognized that “their professional advancement depended on the quality of their research, not on their position as moral leaders” (Reuben, 1996, p. 209).

By the mid to late 1920s, humanities faculty were proposing a counterbalance to the rise of the sciences, arguing “that all significant human experience was subjective and value-laden, and that objective, value-free science was not suited to understand it” (Reuben, 1996, p. 217). Literature (including history and philosophy), it was argued, held the potential to provide a “spiritual experience” (p. 220) that could enhance empathy and provide moral lessons (p. 220).

At the same time, research suggested that efforts to develop character might be futile. An influential study by Yale psychologists Hartshorne and May (1928) found that when children were presented with opportunities to lie, cheat or steal in a variety of everyday contexts, there appeared to be little consistency in their actions. This led researchers to conclude that human behavior may be more variable than previously thought, influenced by factors such as risk perception. According to Lickona (1991), this brought into question the value of character-focused education.

Higher education institutions turned away from the curriculum towards the role of faculty advisor to support character development. While this type of position was in place in the majority of US colleges by 1928, it failed to deliver on its promise, as few faculty were apparently interested or effective in the role. The University of Michigan’s dean of students, for example, observed that few faculty “are interested in the personal side of student life and who can afford to give the time and thought which proper handling of the problem requires” (Reuben, 1996, p. 253). Freshman orientation programs were also introduced, which included warnings of “moral dissipation” (p. 255), but these largely also proved ineffective, given the “questionable guidance offered by upper classmen” (p. 255).

Further attempts at co-curricular moral influence followed, through the establishment of “student life” programs, including closely supervised residences (with faculty serving as dons overseeing curfews and study time), sports programs, and student clubs, with the aim of fostering “esprit de corps and moral discipline” (Reuben, 1996, p. 261). According to Reuben, “by settling on group cohesiveness as the best source of moral influence, university officials came to equate morality with morale” (p. 264).

With this new focus on positive peer influence, admissions programs also increasingly focused on accepting those judged to already exhibit moral traits, in addition to scholarly achievement (Reuben, 1996, p. 262). In practice, however, including character as an admissions criterion reportedly aided efforts to prioritize Protestant students and “discriminate against ethnic minorities, including Jews” (p. 264). America’s universities were not immune to rising anti-Semitism.

By the 1930s, “the separation of morality and knowledge came to be seen as a ‘natural’ part of intellectual life” (Reuben, 1996, p. 268). The rise of “logical positivism” led to the privatization of morality, and values became perceived as a matter of personal opinion (Lickona, 1991, p. 8). In fact, the term “value judgment” came to refer to inappropriately imposing one’s personal values or ethical beliefs on another (p. 8).

In 1947, following the end of WWII, higher education was seen as an important tool for strengthening a new set of values—democratic ideals. The President’s Commission on Higher Education for American Democracy (1947) promoted the importance of providing students with the “values, attitudes, knowledge and skills” that would allow them “to live rightly and well in a free society”, providing them with “ethical values, scientific generalizations, and aesthetic conceptions” (Hart, 1999, p. 109). This in part “fueled interest in the restoration of ethical and spiritual concerns”, including “vociferous calls for a common curriculum that included instruction in values and ethics” (p. 110). The humanities were seen as a natural home for such instruction, including in religious studies, which saw increases in programs and enrolments “between 1945 and 1960” (p. 111).

In 1963, however, religion’s “rightful” place in American education was challenged. A US Supreme Court ruling found that “the practice of Bible reading and prayer in public schools violated the First Amendment and thus was unconstitutional” (Hart, 1999, p. 200). Yet, the court’s opinion, written by Justice Tom C. Clark, acknowledged that “a good education was ‘not complete’ without the study of religion” (p. 201). While private colleges and universities, whether religious or nonsectarian, could continue to provide courses in religious studies, for public institutions, including state funded universities, the situation was quite different. As recipients of tax dollars, and committed to “the separation of church and state”, they were expected to maintain a “degree of impartiality” (p. 203).

Over the next several decades, within the public higher education system, departments of religious studies were further minimized or closed altogether. Yet calls for teaching character development did not vanish entirely, including within elementary schools. Writing in the 1990s, Thomas Lickona, author of Educating for Character: How Our Schools Can Teach Respect and Responsibility, observed:

Wise societies since the time of Plato…have educated for character as well as intellect, decency as well as literacy, virtue as well as knowledge. They have tried to form citizens who will use their intelligence to benefit others as well as themselves, who will try to build a better world (Lickona, 1991, p. 6).

Uncomfortable Truths

As previously suggested, the 1900s were a time when positivism and scientific inquiry flourished, laying the foundation for the US to be positioned as the “global leader in the advancement, development, and production” of science and technology and resulting in “dramatic improvements to American lives” (The State of U.S. Science and Engineering, 2020). Some of the research undertaken during this time, however, was based on unethical and inhumane practices, arguably including advances with respect to the nuclear arms and space races, and medical and psychological research. For a chilling account of select cases of research misconduct, see the Research Ethics Timeline compiled by bioethicist David Resnik (n.d.).

As one example, in research on yellow fever, undertaken in the early 1900s, 33 participants were “exposed to mosquitoes infected with yellow fever or injected with blood from yellow fever patients.... Six participants died, including two researcher-volunteers” (Resnik, n.d.).

Another is the horrific Tuskegee Syphilis Study. Sponsored by the Department of Health, Education and Welfare, this multi-year research project, beginning in 1932, “investigated the effects of untreated syphilis in 400 African American men from the Tuskegee, Alabama area.” According to Resnik (n.d.), the researchers “withheld treatment for the disease from participants even when penicillin, an effective form of treatment, became widely available”.

In the 1940s, the US government launched a program to develop an atomic bomb, codenamed the Manhattan Project, involving researchers at a number of US universities, including a team of theoretical physicists at Berkeley. The Manhattan Project was seen as vital to American security. Subsequently, the U.S. Department of Energy sponsored the scientists’ research on the effects of radiation on human beings. The participants, “cancer patients, pregnant women, and military personnel”, were unaware they were participating (Resnik, n.d.).

The Nuremberg trials of 1947 put the spotlight on atrocities carried out by Nazi doctors and scientists, which gave rise to the Nuremberg Code: ethical rules for engagement with “human subjects”. The Code’s ten items were based on the notion that “certain basic principles must be observed in order to satisfy moral, ethical and legal concepts”, including voluntary consent (Holocaust Memorial Museum, n.d.).

Also in the 1940s, Nazi scientists were heavily recruited into American universities and research institutes, including NASA, through a covert government program called Operation Paperclip (Records of the Secretary of Defense (RG 330), n.d.). As just one example, Dr. Hubertus Strughold, now recognized by Americans as the “Father of Space Medicine”, was allegedly once a senior Nazi official, implicated in obscene experiments on Jewish prisoners at Dachau as well as on disabled children at a prominent research institute in Berlin, of which he was director (Lagnado, 2012). After the war, he was appointed “Professor of Space Medicine at the U.S. Air Force School of Aerospace Medicine”, and later “co-founded the Space Medicine Branch of the Aerospace Medical Association”. In 1963, the association created the Hubertus Strughold Award to “recognize excellence in space medicine” (Miller, n.d.). The award was discontinued in 2013, following allegations in the Wall Street Journal (Lagnado, 2012).

In the 1950s and 60s, the US government allegedly funded psychological experiments on many American campuses and within university-affiliated hospitals, under the auspices of the CIA’s MK Ultra program (Mather, 2020). At Harvard, experiments reportedly involved participants being “bullied, harassed, and psychologically broke [sic] down” (Mather, 2020). In others, hallucinogenic drugs, such as LSD, were given to unwitting subjects, including college students, psychiatric patients and members of the public. The CIA was reportedly interested in learning about brainwashing and torture techniques, and university researchers were core to these efforts.

More recently, unsavoury influencers of US university research have included Jeffrey Epstein (with regard to eugenics) and the Sackler family (with regard to Purdue Pharma’s complicity in the opioid crisis). Writing on Epstein’s influence and privilege at Harvard, Oreskes (2020) observed:

Harvard is not alone in accepting tainted money. Universities need to develop policies to ensure that research funding is based on merit, not cronyism, and researchers who are seeking public trust must be able to show that their own ethical compasses are not deflected by the magnetism of money. (n.p.)

Growing concern with research misconduct in the US has resulted in calls to improve oversight of scientific research. In 1992, the Office of Research Integrity (ORI) was established to oversee research conducted by the Public Health Service (PHS). The PHS “provides nearly $38 billion for health research and development, primarily in the biomedical and behavioral sciences” (Office of Research Integrity, n.d. a). Today, in addition to promoting research integrity and developing policy, the ORI monitors investigations, recommends findings and posts details on cases of misconduct, naming the researcher and the institution where misconduct was found (Office of Research Integrity, n.d. b).

Another important area where “uncomfortable truths” have begun to be addressed pertains to the complicity of many US universities in slavery. As one example, following a self-congratulatory account of Yale’s history of opposing slavery, published for its tercentenary, PhD students Dugdale, Fueser and Celso de Castro Alves (2002) corrected the record. As previously noted, Yale benefited financially from the proceeds of slavery. Senior leaders also taught pro-slavery ideology and, along with alumni, undertook efforts to prevent Blacks from participating in higher education.

As a more positive example, in 2004 the University of Alabama apologized “for the involvement of antebellum Alabama faculty members in punishing enslaved people on campus and promulgating proslavery ideologies” (Harris, 2020). This apology is reportedly the first instance of an American university doing so (Harris, 2020).

In 2017, a memorial was erected at Harvard Law School to honor “the enslaved whose labor created wealth that made possible the founding of the Harvard Law School” (Harvard and the Legacy of Slavery, n.d.). Harvard President Faust (2016) publicly acknowledged that the university had benefited financially from “racial bondage” and also called out historians who had “long ignored” this truth: “This is our history and our legacy; one we must fully acknowledge and understand in order to truly move beyond the painful injustices at its core.”

In summary, higher education in the US was largely founded through missionary efforts to impose Western conceptions of civilization and morality on Indigenous peoples, while people associated with this pursuit themselves behaved in unethical ways. Slavery fueled the expansion of higher education institutions in the 1700s and early 1800s, while students who overtly supported slavery—and who were judged to be men of good character—graduated into positions of influence within so-called “civil” society. By the mid-1800s, universities began to shift their focus away from character development and the humanities, towards positivist scientific research (some of it highly unethical). The 1900s saw university researchers play an increasing role in scientific advances, contributing to the country’s economic and military dominance. Today, American universities have begun to acknowledge their complicity in the slave trade, and systems have been put in place to help hold faculty and institutions accountable for research misconduct.

The Colonization of Canada: Higher Education’s Roots

I now turn to the colonization of Canada, adding a critical view to the brief history presented by Eaton and Christensen Hughes (2022), and including the horrific treatment of Indigenous peoples, particularly First Nations, as well as the Inuit and Métis. I include an overview of the creation and aftermath of residential schools, as their shameful legacy has significant implications for the mandates of Canadian higher education institutions today. I begin with present-day facts, in order to provide an essential modern lens through which to view this history.

In 1998 the Canadian government formally apologized for attitudes of “racial and cultural superiority” that led to “a suppression of Aboriginal culture and values”, as well as for the abuse of students in the residential school system:

The ancestors of First Nations, Inuit and Métis peoples lived on this continent long before explorers from other continents first came to North America…Diverse, vibrant Aboriginal nations had ways of life rooted in fundamental values [emphasis added] concerning their relationships to the Creator, the environment, and each other, in the role of Elders as the living memory of their ancestors, and in their responsibilities as custodians of the lands, waters and resources of their homelands… 

Tragically, some children were the victims of physical and sexual abuse…To those of you who suffered this tragedy at residential schools, we are deeply sorry. (Gathering Strength—Canada’s Aboriginal Action Plan, 1998)

This was followed a decade later by the launch of the Indian Residential Schools Truth and Reconciliation Commission (TRC), which uncovered traumatic truths from thousands of survivors about the abhorrent treatment of children. Multiple reports and 94 explicit calls for action followed, along with the establishment of the National Centre for Truth and Reconciliation (NCTR) at the University of Manitoba, described as “a place of learning and dialogue where the truths of Residential School Survivors, families and communities are honoured and kept safe for future generations” (NCTR, n.d.).

Contained within the NCTR’s archive is Honouring the Truth, Reconciling for the Future: Summary of the Final Report of the Truth and Reconciliation Commission of Canada (2015). This document contains the 94 calls to action and makes clear that Canada engaged in physical, biological and cultural genocide in dealing with Indigenous children. NCTR Director of Research Aimée Craft observed, “[we] must rise to the challenge of knowing this history, and continue to acknowledge it while moving towards a new understanding of the relationships we must rebuild” (2015, p. 190).

The first missionaries to arrive in what is now Canada were French Catholics (Jesuits, Récollets and Ursulines), who settled in New France (Quebec) in the early 1600s. In 1632, the Jesuits “were given a monopoly over missionary activity”, particularly for boys (White & Peters, 2009, p. 13). The Collège des Jésuites followed in 1635, and the Séminaire de Québec, now Université Laval, in 1663. In 1708, the Collège des Jésuites “opened a hydrography school where they taught mathematics, astronomy and physics to prepare students for jobs as navigators and surveyors” (Galarneau, 2006). The Ursuline nuns, who arrived in 1639, focused on educating and evangelizing girls, Indigenous and French, and later, the daughters of British officers.

In 1670, the Hudson’s Bay Company (HBC) was established, having been granted control of the lands surrounding Hudson Bay. Its interest was the fur trade. The children of HBC employees and local Indigenous women were educated in a variety of formal and informal ways. While the sons of HBC officials were often sent back to England, for others, local schools trained boys for employment in the HBC and girls as “future wives” (Poitras Pratt, 2021, p. 20). The children of these unions were often given French names, and “by the 1660s governing officials considered them to be French, so long as they were baptized” (p. 20).

The Métis worked in a variety of highly skilled occupations during the peak of the fur trade, including as trappers, guides, and interpreters, and later in ranching organizations, acquiring skills through “a mentoring and apprenticeship system” (Poitras Pratt, 2021, p. 21). Following the Resistance of 1885, a highly suspect “scrip system” removed their rights to lands (Muzyka, 2019). “The landless status of many Métis coupled with extreme poverty”, and the inability to pay property taxes, meant that their children were excluded from attending school, with impacts lasting for three generations (Poitras Pratt, 2021, pp. 22–23). Others were “taken” to residential school, or enrolled by their parents. Summarizing, Poitras Pratt (2021) offered, “the Métis experience of schooling in a post-Rebellion era was marked by the removal of blended traditional and formal learning traditions into one of partial inclusion into or exclusion from, formal school systems” (p. 23). For Inuit youth, while some attended residential schools in the Northwest Territories in the 1800s, it wasn’t until the 1950s that “formal European-style education…began on a national scale…with the construction of elementary and residential schools throughout major settlements in the Arctic, including Baffin Island” (McCue & Filice, 2011/2018).

For First Nations, the situation was very different. In English-speaking Upper Canada, Governor Simcoe, who had arrived in 1792, was intent on replicating British society through education. He aspired to open several grammar (public) schools, as well as establish a university. The District School Act of 1807 called for “a Public School in each and every District”:

Their founders had in mind the great English public school, whose curriculum was largely classical and whose benefits were confined to the wealthy. These schools were not in any sense popular schools…those established by the Act of 1807 levied considerable sums in fees. They were designed to educate the sons of gentlemen. They were to prepare for professional life. They were essentially for the benefit of the ruling classes. (Putman, 1912)

In contrast, common schools, legislated in 1816, had assimilation as their central aim. As published in the Kingston Gazette (September 25, 1810):

[O]ur population is composed of persons born in different states and nations, under various governments and laws, and speaking several languages. To assimilate them, or rather their descendants, into one congenial people, by all practicable means, is an object of true policy. And the establishment of common schools is one of those means. (cited in Robson, 2019, para. 11)

Egerton Ryerson

Of Upper Canada’s “founding educational fathers”, one of the most influential was Egerton Ryerson (1803–1882). A Methodist minister, Ryerson helped the Church found the Upper Canada Academy (UCA) for boys and girls in Cobourg in 1836, arguing that such a school was needed to “educate the most promising youth of the recently converted Indian [sic] tribes of Canada, as Teachers to their aboriginal countrymen” (Wilson, 1986, p. 298). Renamed Victoria College following funding from the British Crown, UCA became a university with degree-granting status in 1841. Ryerson was appointed its first principal.

Once UCA became Victoria College, female students were no longer allowed to attend. Plato had held that “all the pursuits of men are the pursuits of women”, but in Upper Canada this view did not hold sway until many decades later.

In 1844 Ryerson was appointed Chief Superintendent of Education for Upper Canada. By this time there were “more than 2,500 elementary schools in Canada West: financed by a combination of government grants, property taxation, and tuition fees” (Gidney, 1982). Ryerson endeavoured to make education accessible to all, believing:

Carried out in a Christian context, education promoted virtue and usefulness in this world and union with God in the next. Because it made good and useful individuals it was also a key agent in supporting the good society, inasmuch as it helped to promote social harmony, self-discipline, and loyalty to properly constituted authority. (Gidney, 1982)

Common Schools were used “as a means of entrenching a certain type of values on the growing Canadian population: middle class, British, and Christian (usually Protestant)” (Robson, 2019, pp. 37–38). Francophones (outside Quebec), Catholics, and Irish famine settlers were amongst those targeted, as were Blacks and, later, Asians. “White Canadians reacted negatively to the settlement of Blacks in their communities, often refusing them entry to public schools” (Robson, 2019, p. 32). Similarly, the thousands of Chinese who came to Canada in the 1800s to help build the Canadian Pacific Railway experienced discrimination and segregation in their children’s schooling.

By 1847, Ryerson had turned his attention to the education of First Nations children (as previously discussed, differing strategies applied to the Métis and the Inuit). Ryerson advocated for Industrial Schools with the objective of creating “working farmers and agriculture labourers, fortified of course by Christian principles, feelings and habits” (Ryerson, 1847, p. 74). Further, he advocated that such schools should be run by religious orders:

The North American Indian cannot be civilized or preserved in a state of civilization (including habits of industry and sobriety) except in connection with, if not by the influence of, not only religious instruction and sentiment but of religious feelings…The animating and controlling spirit of each industrial school establishment should, therefore, in my opinion, be a religious one. (p. 73)

Ryerson (1847) recommended that the government’s role be limited to funding and oversight, with inspections “from time to time” and reports written “once or twice a year” (p. 74). Specifically, he suggested that through its power to withhold funding, the government would avoid “endless difficulties and embarrassments arising from fruitless attempts to manage the schools in detail” (p. 74).

Ryerson also advised that students should be paid a small sum for their work, be taught to keep their own accounts, and be given the money upon leaving school, in order to also learn and apply skills in business. He reflected, “it would be a gratifying result to see graduates of our Indian industrial schools become overseers of some of the largest farms in Canada” (p. 77). This is clearly not what transpired.

With Confederation in 1867, responsibility for “status Indians” was assigned to the federal government, while “education for non-status Indian, Inuit and Métis youth…became a provincial or territorial responsibility” (McCue & Filice, 2011/2018, p. 7).

Residential Schools and Their Legacy

The Residential School system that ultimately emerged in the late 1800s was heavily influenced by Nicholas Flood Davin, a journalist and politician, commissioned by the government to produce what is now known as the “Davin Report” (Davin, 1879). Following a tour of US institutions, Davin endorsed the “aggressive civilization” policy, inaugurated by US President Grant in 1869, concluding that “day-school did not work, because the influence of the wigwam was stronger than the influence of the school” (p. 1).

Davin acknowledged the negative consequences of contracting out the running of boarding schools to religious organizations; “the children at schools under contract do not, as a rule, get a sufficient quantity of food” (Davin, 1879, p. 2). Like Ryerson, Davin supported religious oversight regardless:

The Indians have their own idea of right and wrong, of “good” Indians and “bad” Indians, and to disturb this faith, without supplying a better, would be a curious process to enlist the sanction of civilized races whose whole civilization, like all the civilizations with which we are acquainted, is based on religion. (Davin, 1879, p. 14)

Perhaps prophetically, Davin (1879) further observed, “the character of the teacher, morally and intellectually, is a matter of vital importance. If he is morally weak, whatever his intellectual qualifications may be, he is worse than no teacher at all” (p. 15). While Davin recommended that “the schools both employ and teach Métis peoples”, participation was officially restricted to “Status Indians” (White & Peters, 2009, p. 17). In practice, however, when convenient to boost numbers, Métis children were allowed to attend.

In 1883, Sir John A. Macdonald accepted Davin’s recommendations and officially created Canada’s residential school system. At this time, four residential schools already existed in Ontario—“The Mohawk Institute (1831), Mount Elgin Industrial Institute (1851), Shingwauk Indian Residential School (1873), and Wikwemikong Indian Residential School (1840 day school, 1879 residential school)” (Indigenous Education in Canada—Chronology, n.d.).

In 1894, through an amendment to the Indian Act, school attendance became compulsory for First Nations children (whether at a day school, industrial school or residential school). By 1900, there were 64 residential schools and “226 federally-funded day schools on reserves” (Canadian Encyclopedia, n.d.). In 1920, attendance at residential schools was further enforced:

Deputy Superintendent General of Indian Affairs, Duncan Campbell Scott, makes attendance at residential school mandatory for every First Nations child between 7 and 16 years of age. This policy was also inconsistently applied to Métis and Inuit children. (Canadian Encyclopedia, n.d.)

This amendment to the Indian Act authorized priests, nuns, ministers, police officers, and Indian agents to forcibly seize children, and to arrest and imprison parents and guardians who failed to cooperate.

By the early 1900s, the horrific consequences of residential schools were increasingly apparent. Dr. Peter Bryce, who inspected the schools, recorded the shocking conditions he witnessed in his 1922 report, The Story of a National Crime: Being a Record of the Health Conditions of the Indians of Canada from 1904 to 1921. After visiting 35 residential schools, he reported that due to tuberculosis and deplorable conditions, “24%, of all the pupils which had been in the schools were known to be dead, while of one school on the File Hills reserve…75%, were dead at the end of the 16 years since the school opened” (Bryce, 1922, p. 4). Those who did survive had to contend with the life-long negative consequences of inhumane treatment.

Despite this report, Canada’s residential school network continued to grow, and by 1930 included more than 80 institutions, “with an enrolment of over 17,000” (Canadian Encyclopedia, n.d.). It was also about this time that the government turned its attention to the education of Inuit and Métis children. By the mid-1950s, residential schools were operating in the Western Arctic, including at Inuvik. Attendance at these schools remained mandatory until 1969, with closures beginning shortly thereafter. The last to close was the Gordon Residential School in Saskatchewan, in 1996.

Discrimination against Indigenous children, and against those from other marginalized groups, including racial minorities and girls, effectively restricted access to higher education well into the late nineteenth and early twentieth centuries. Further, graduates of Canada’s early higher education institutions, including its seminaries, and the faculty they employed, perpetuated this discrimination.

Higher Education in Canada: A Brief History

Governed by the Church of England, three King’s Colleges were amongst the first universities established in Canada (Windsor, Nova Scotia, 1789; York [Toronto], Ontario, 1827; and Fredericton, New Brunswick, 1828). These colleges were residential and tutorial, and were intended to “bring the ideals of the older English universities to Canada” (Anisef et al., 2015).

By the time of Confederation (1867), Canada was home to 17 degree-granting institutions across the founding provinces (Ontario, Quebec, New Brunswick, and Nova Scotia). Four were nondenominational (Dalhousie, McGill, and the two former King’s Colleges in New Brunswick and Toronto), while 13 remained church-controlled. Enrollments were small, with most institutions enrolling “about 100 students” (Anisef et al., 2015).

Mount Allison University in New Brunswick was the first to accept female students, with Grace Annie Lockhart earning her Bachelor of Science degree in 1875 (Archambault, 2019). Augusta Stowe became the first woman to earn a medical degree in Canada, graduating from Victoria College in 1883. The first female graduates in Ontario were Annie Fowler and Eliza Fitzgerald, from Queen’s University, in 1884. An 1876 account from the Queen’s Journal reflects attitudes toward women’s participation:

We are confident that among people who appreciate the delicate grace and beauty of woman’s character too much to expose it to the rude influences, the bitterness and strife of the world, few will be found to advocate her admission to universities. (cited in Queen’s Encyclopedia, n.d.)

At King’s (Toronto), despite funding from the “Anglican Church’s missionary society” for a “professorship of Indian languages” (Peace, 2016, p. 2), few Indigenous students attended. One exception was a “well-known Mohawk doctor, Oronhytekha, [who] graduated from the school in 1866”. Western University of London, founded in 1879 as another non-denominational school, had a mandate that included “the training of both Indian and white students for the ministry of the Church of England in Canada” (Peace, 2016, p. 1).

In 1868, changes in provincial funding resulted in consolidation and a marked decrease in the number of religious institutions through “federated” colleges; “a Canadian solution to the problem of reconciling religiosity and secularism, diversity and economic pragmatism” (Anisef et al., 2015). In Ontario, for example, Victoria College, St Michael’s College and Trinity College all federated with the University of Toronto, agreeing to “restrict their offerings to the sensitive and less costly liberal arts subjects” (Anisef et al., 2015). Manitoba combined three church colleges to found the University of Manitoba. The Western provinces each created a single public university (Alberta, 1906; Saskatchewan, 1907; and British Columbia, 1908).

By the early 1900s, enrollment at Canada’s now largely secular universities was 6,641 students (from a population of around seven million), with the majority male (89%); “44% of students were in the Arts and Science, while 27% were in medicine, and 11% were in Engineering” (Usher, 2018a). Considerable growth followed. By the 1940s there were almost 40,000 students, 76% of them male (Usher, 2018b).

During this time, as in the US, the focus of Canada’s universities began to change, expanding “beyond the traditional fields of theology, law and medicine” and introducing “graduate training based on the German-inspired American model of specialized course work and the completion of a research thesis” (Anisef et al., 2015). Research from this period that would be considered highly unethical today can also be found.

As one example, in 1943 Donald Ewen Cameron became director of McGill University’s Department of Psychiatry at the newly created Allan Memorial Institute. Later, Cameron served as president of both the American (1952–1953) and Canadian (1958–1959) Psychiatric Associations. His highly controversial research program, which ran until 1965, was allegedly linked to the CIA’s MKUltra program (Mather, 2020). Many of Cameron’s patients were young women suffering from postpartum depression.

Ewen Cameron attempted to erase memories by repeated electro-shock treatments, forcing months of drug-induced sleep, and repeatedly administering LSD to his patients…Many of these patients came to the clinic to be treated for moderate depression and instead were subjected to months of horrific exploitation. (Mather, 2020)

An investigation by CBC’s The Fifth Estate found that some of the victims successfully received compensation from the CIA in an out-of-court settlement, while others received compensation from the Canadian government. Both settlements came without apology or any admission of liability. While McGill’s Department of Psychiatry website mentions Cameron as its founder, it makes no mention of the controversy (McGill, n.d.).

In the 1940s and 1950s, Canadian researchers studied the effects of hunger and malnutrition on Indigenous children in residential schools, maintaining control groups and depriving children of their daily nutritional allowance for years, in order to “establish a baseline against which to compare the effects” (Owens, 2013). They also restricted preventative dental care in order to assess the effects of nutritional deprivation. Mosby (2013) found that little value came from these studies and that they led to no positive interventions at the schools upon their completion. The devastating toll residential schools took on the health of Indigenous children is well-documented (Wilk et al., 2017).

Throughout the 1960s and 1970s, enrollments and the number of Canadian universities continued to increase (Anisef et al., 2015). The 1960s also saw the introduction of the CEGEP sector in Quebec (Collège d’enseignement général et professionnel), as well as community colleges in other regions, which focused more specifically on skill development and job preparedness. Higher education by this time was viewed as important to both economic productivity and social justice; “a major means of accommodating rising social aspirations and of enhancing the social prospects of disadvantaged social, cultural and regional groups” (Anisef et al., 2015). Given long-standing institutionalized discrimination in Canada’s public schools, however, the achievement of these goals was compromised.

Ethics education “re-emerged in the 1960s in the form of practical and professional ethics education” (Maxwell et al., 2016, p. 2). While medicine was at the fore, specialized ethics courses in business, engineering and teaching followed, along with ethics-focused research centres, journals and associations. The Centre for Bioethics of the Clinical Research Institute of Montreal, established in 1976, was reportedly the first in Canada (Medical Ethics, History of the Americas: III Canada, n.d.).

Reconciling the Past While Recognizing Ongoing Concerns

According to the 2016 Statistics Canada Census, while “First Nations peoples have higher attainment rates than non-Indigenous Canadians in college and the trades”, the university-level participation gap “has remained at around 22 percentage points” (First Nations Post-Secondary Education Fact Sheet, n.d.). For First Nations aged 25–64, by 2016 just 15% living on reserve and 23% living off reserve had attained a university-level credential (certificate, diploma or degree), in comparison to 45% of those with non-Aboriginal identity (First Nations Post-Secondary Education Fact Sheet, n.d.). Higher education has been slow to address this gap, acknowledge the impact of residential schools and colonization, and embrace the recommendations of Canada’s Truth and Reconciliation Commission (2015).

Of the 94 calls to action, those that apply (directly or indirectly) to higher education include (TRC, 2015): closing “educational attainment gaps” (Recommendation 10.i); providing “culturally appropriate curriculum” (10.ii); providing “adequate funding to end the backlog of First Nations students seeking a post-secondary education” (11); creating “university and college degree and diploma programs in Aboriginal languages” (16); and various calls to ensure professionals—social workers, teachers and lawyers—are properly educated, including (28) “learning the history and legacy of residential schools” and requiring “skills-based training in intercultural competency, conflict resolution, human rights and antiracism”.

Colleges and universities across the country are establishing Indigenous student scholarships, faculty positions, research centres and student centres, and are critically reassessing curriculum, pedagogy and assessment norms (see for example, Lindstrom, 2022; Ottmann, 2016; Poitras Pratt & Gladue, 2022), although not always successfully. In one case, Jaris Swidrovich, “the only self-identified Indigenous faculty member in pharmacy in Canada”, resigned from the University of Saskatchewan, citing feelings of isolation after “an extended series of incidents of racism and discrimination at multiple levels” (Sorokan, 2021). Swidrovich observed, “Verbalized or written expressions of support does not equate to action and is not a measure of an institution’s level of safety for Black, Indigenous, and People of Colour” (Sorokan, 2021).

Scholars are writing on Indigenization and decolonization, providing powerful critiques of Western notions of institutionalized schooling (see for example Poitras Pratt et al., 2018) and making thoughtful and detailed recommendations for advancing and transforming the academy in Canada (Cote-Meek & Moeke-Pickering, 2020; Ottmann, 2017). The mission of the National Centre for Truth and Reconciliation (NCTR, n.d.) at the University of Manitoba is to support this work.

There have also been vociferous calls to rename buildings and topple statues honouring those implicated in Canada’s colonial past, including following the discovery of the remains of 215 Indigenous children on the grounds of the former Kamloops Indian Residential School in B.C. (Fortier & Bogart, 2021). UBC is considering rescinding the honorary degree it bestowed on Catholic bishop John Fergus O'Grady, a former principal at the school (Kurjata, 2021).

Long-standing calls to remove the name of Canada’s first Prime Minister, Sir John A. Macdonald, are beginning to have effect. As one example, Queen’s University recently removed his name from its law school building (Glowacki, 2020). Egerton Ryerson’s legacy has similarly been challenged. While the Ryerson University name remains for now, in 2021 the university announced it was renaming its law school after the Honourable Lincoln Alexander, the first Black person to be elected to “Canada’s House of Commons, to serve as a federal Cabinet Minister and to be appointed as Lieutenant Governor of Ontario” (Ryerson Today, 2021).

At the University of New Brunswick, George Duncan Ludlow’s name has been removed from the law faculty building. Ludlow, the province’s first chief justice and the son of a slave trader, was “one of the last judges in the British Empire to uphold the legality of slavery”. He was also implicated in the abuse of Indigenous children, through his role as “a longtime member of the board of directors for the Sussex Vale Indian Day School, which contracted out First Nations children as indentured servants” (Bisset, 2019).

In Quebec, the history of James McGill has drawn attention. McGill earned his fortune as a West Indian merchant and personally owned five slaves: two Indigenous children (both of whom tragically died at the age of ten), two Black women and a Black man (Nelson, 2020). Former McGill faculty member Charmaine Nelson has asserted that despite the “often obvious, direct, and profound connections between the histories of western universities and Transatlantic Slavery” (p. 4), “McGill has not acknowledged, critically examined, or redressed these histories and the anti-black, anti-indigenous racism upon which McGill University was founded” (p. 4).

The Black Lives Matter movement—which many Black Canadian university students and faculty have been at the heart of—has significantly increased awareness of ongoing discrimination and harassment of marginalized groups on Canadian campuses:

prominent young Black Canadian writers and activists, emerging from a white supremacist Canadian university system, are writing and speaking openly about the ways their experiences in higher education have shaped their activism. (Moriah, 2020)

Moriah (2020) recommended two memoirs that critique systemic racism on Canadian campuses: Desmond Cole’s (2020) The Skin We’re In (on his experience at Queen’s) and Eternity Martis’ (2020) They Said This Would Be Fun: Race, Campus Life, and Growing Up (on her experience at Western).

Although representation of female faculty and administrators has significantly improved over the past fifty years, a recent study on the “power gap” suggests gender-based discrimination within Canadian higher education is worse than in other professional domains; “As institutions of higher learning, universities have an added ethical and moral obligation to be equitable in their practices—and yet our analysis shows they have among the worst track record on gender representation” (Doolittle & Wang, 2021).

With respect to research misconduct and unethical administrative practice, Christensen Hughes and Eaton (2022a) identified numerous cases of fraud and plagiarism by faculty, as well as national policy changes intended to help strengthen the culture of research integrity in Canada. Recent cases of student misconduct, which appear to be growing in frequency and complexity, were also identified (Christensen Hughes & Eaton, 2022b). Taken together, these issues point to troubling and enduring aspects of Canada’s colonial legacy, as well as growing concern with faculty and student misconduct.

Higher Education’s Clarion Cry for Change

In bringing this chapter to a close, I first want to acknowledge that I have only briefly identified, and then woven together, a number of highly sensitive and complex topics, each deserving considerably more in-depth treatment than has been possible here. I also recognize my own biases and limitations. As a white woman of British heritage with a disciplinary interest in education and organizations, I have drawn on much literature for this chapter that lies outside my traditional areas of focus. I am grateful for the thoughtful input of reviewers and look forward to further critique.

The argument I have sought to make throughout this chapter is that Western higher education, long positioned as a bastion of integrity and truth telling, has a highly questionable and, arguably, disturbing past. In its earliest days, conceptions of truth and scientific discovery were influenced by the wealthy and controlled by the Church. In North America, higher education was a powerful instrument of colonial oppression, imposing self-serving conceptions of morality and truth, while reinforcing dominant social structures, including slavery (from which it profited). In Canada, residential schools resulted in the “physical, biological and cultural genocide” of Indigenous children (Honouring the Truth, Reconciling for the Future: Summary of the Final Report of the Truth and Reconciliation Commission of Canada, 2015). With the rise of the scientific method, a “truth at all costs” mentality took hold in some quarters, with horrific consequences. More recently, the academy has been called out for inappropriately bestowing naming honours on people central to its colonial legacy, for misconduct in research practice, and for being an inhospitable place for BIPOC and female students and faculty. Student misconduct is an additional area of concern.

At the same time, higher education has undoubtedly produced many social benefits. It has developed philosophical thought and reason, helped found the professions, advanced the arts and humanities, fueled scientific achievement, and supported the career aspirations of its graduates. For many Indigenous Peoples, higher education is viewed as “the new means of survival, and it is also the means to achieve individual and collective self-determination” (Ottmann, 2017). Today, Canada’s higher education institutions are beginning to engage in processes of Indigenization and reconciliation. Acknowledging this paradox—while essential—can be exhausting, particularly for those who have suffered and who are committed to helping advance integrity, equality and justice.

Fortunately, recent global initiatives can provide guidance. As one example, the Organisation for Economic Co-operation and Development produced a report on the Future of Education and Skills 2030 (OECD, 2019), which acknowledged that attitudes and values are “integral to individual and social well-being” (p. 6). In this report it explicitly identified “core shared values of citizenship (respect, fairness, personal and social responsibility, integrity and self-awareness)…in order to build more inclusive, fair, and sustainable economies and societies” (p. 2). One important recommended action for academic leaders is to ensure that these values are explicitly embedded as learning outcomes in university curricula, as well as in the selection criteria for administrators, faculty and staff.

These values are also reflected in the United Nations’ Sustainable Development Goals and 2030 agenda. SDG #4 calls for “Ensuring Inclusive and Equitable Quality Education”. Recently, the Council of Ministers of Education, Canada (CMEC) produced a report on Canada’s commitment to SDG #4 (CMEC, 2020) and progress on each of SDG #4’s seven targets. For Target 4.7, Global Citizenship and Sustainable Development, they reported that Canada is working on developing a “shared vision of the competencies needed for the twenty-first century…referred to as global competencies” (p. 38):

More than any other target, Target 4.7 touches on the social, humanistic, and moral purpose of education…Global citizenship education fosters respect for all to build a sense of belonging to a common humanity… (CMEC, 2020, p. 38)

As one modest example of what addressing these issues might look like, a recent advertisement for a research seminar on “Diverse Perspectives on Knowledge Mobilization” at the University of Guelph offered:

We have long known that we need to mobilize research knowledge more creatively if we wish to tackle 'wicked' environmental problems and put research into practice. Increasingly siloed disciplines, a disconnect between arts and science, and a lack of engagement and equity in academia and society jeopardize our capacity to collaboratively respond to environmental crises in a creative, innovative, and equitable way…diversity and inclusion perspectives can advance the way we think, do, and mobilize interdisciplinary environmental research (personal email, 2021).

Canadian faculty are beginning to engage in interdisciplinary ways, and diverse voices are beginning to be heard.

In closing, it is time for Canada’s higher education institutions to fully embrace the promise of higher education that was expressed—if not enacted—in North America centuries ago. Veritas. For if the truth cannot be found here, where can society turn to solve its most profound problems? Academic integrity must transcend discussions of student misconduct. Shared values of citizenship, the pursuit of social justice (including for Canada’s Indigenous Peoples), and contributing to the development of “inclusive, fair, and sustainable economies and societies” should be at the core of our purpose and practice. This is the challenge of our time. Meeting it is a question of integrity.