The Myth of Efficiency: Technology and Ethics in Industrial Food Production
- Cite this article as:
- Stuart, D. & Worosz, M.R. J Agric Environ Ethics (2013) 26: 231. doi:10.1007/s10806-011-9357-8
In this paper, we explore how the application of technological tools has reshaped food production systems in ways that foster large-scale outbreaks of foodborne illness. Outbreaks of foodborne illness have received increasing attention in recent years, resulting in a growing awareness of the negative impacts associated with industrial food production. These trends indicate a need to examine the systemic causes of outbreaks and how they are being addressed. In this paper, we analyze outbreaks linked to ground beef and salad greens. These case studies are informed by personal interviews, site visits, and an extensive review of government documents and peer-reviewed literature. To explore these cases, we draw from actor-network theory and political economy to analyze the relationships between technological tools, the design of industrial production systems, and the emergence and spread of pathogenic bacteria. We also examine whether current responses to outbreaks represent reflexive change. Lastly, we use the myth of Prometheus to discuss ethical issues regarding the use of technology in food production. Our findings indicate that current tools and systems were designed with a narrow focus on economic efficiency, while overlooking relationships with pathogenic bacteria and negative social impacts. In addition, we find that current responses to outbreaks do not represent reflexive change and that a continued reliance on technological fixes for systemic problems may result in greater problems in the future. We argue that much can be learned from the myth of Prometheus. In particular, justice and reverence need to play a more significant role in guiding production decisions.
Keywords: Food processing · Food safety · Technology · Reflexive modernization · Ground beef · Bagged salad
We have allowed our clever technologies to determine the shape of our food industries and food safety strategies… If we are to survive, we will need to develop a deep ecological understanding of ourselves and our place on this planet and let that determine the shape of our food production system, and the technologies needed to keep it safe and secure. (Waltner-Toews 1991: 56)
Industrialization in food production has involved the application of new technologies including extensive mechanization. An examination of trends in food processing reveals that companies have strategically applied technological tools to enhance economic efficiency. These efficient food processing systems focus on increasing production while reducing investments in labor and capital, thereby increasing profitability. New technological tools have been a central part of increasing the economic efficiency of food processing. Systems designed around the use of new technologies have resulted in unprecedented levels of standardized, processed food. While technology in food processing has allowed for certain advances, there are also increasing harms associated with industrial processing systems.
The industrialization of food production, with narrow aims to increase economic efficiency, has resulted in negative impacts to society (e.g., Culp et al. 2008; Nyachuba 2010; Tusseau-Vuillemin 2001). In this paper we will closely examine a specific example: how high-efficiency food processing systems designed around new technologies may be linked to large-scale outbreaks of foodborne illness. Recent recalls and outbreaks linked to ground beef, leafy greens, peanut butter, and eggs illustrate that foodborne illness has become a prominent externality associated with industrial food processing. Because food companies set the terms for food production, we argue that they have a social responsibility to protect public health. However, repeated widespread outbreaks reveal the need to examine this responsibility and how narrow goals shape the use of technology and the design of industrial production systems.
In this paper, we draw from several themes and theoretical approaches to explore the linkages between industrial processing systems and recent cases of foodborne illness. The Greek myth of Prometheus will serve as a thematic guide to explore reverence and justice in the application of technology for food processing. We draw upon actor-network theory (ANT) to highlight the complex relationships and unintended consequences associated with industrial food production and the need for reverence towards non-human actors in these systems. In addition, we use the work of Ulrich Beck (1992) and political economy to examine injustices associated with capitalist food processing. We find that while the negative impacts of industrial food processing have become more obvious (e.g., more outbreaks of foodborne illness), responses continue to prioritize industrial goals over consumer well-being.
While we explore the use of technology in this paper, we do not intend to suggest that technology in food processing is inherently problematic. We only wish to illustrate that systems using industrial technologies to serve large numbers of people over a large geographic area should be designed with caution and have the flexibility to adapt when problems arise. Applying technological tools with a narrow focus on economic efficiency has resulted in production systems with significant potential for negative social impacts. In this paper, we illustrate how technological innovation in food processing has allowed for high-volume, concentrated production systems that inadvertently increase widespread foodborne illness. In addition, we argue that the adoption of technological fixes, or applying additional technological tools to address problems without fundamentally changing production designs, may result in even greater problems in the future.
The Ethical Use of Technology: Prometheus Revisited
The ethical use of technology has long been an issue. How do we reap the benefits from new innovations while minimizing associated harm? What types of rules or moral guides should be used to direct our use of technology? Although not using these same terms, similar questions were explored in the ancient Greek myth of Prometheus. Today, Prometheus is often referred to when discussing technology: those who love or have great faith in technology are often called “Prometheans.” However, the specifics of the story are less often discussed. Here, the myth of Prometheus is summarized to help frame current issues related to the use of technology in industrialized food production.
As told by Aeschylus and Plato, the story of Prometheus begins with the creation of the animals and humans by Epimetheus, Prometheus’ brother. As Epimetheus created the animals he gave each one specific tools for survival. However, when he created humans (who were last to emerge) he had nothing left to give them, leaving them naked and defenseless. Prometheus had great sympathy for this pitiful species and stole from heaven fire and technology, giving these gifts to the human race for its survival.
A key point in this story is that these gifts of fire and technology did not result in a better life for the humans. The human race was wrought with division, struggle, and violence. What they lacked was more important than technology; they lacked civic wisdom. Prometheus’ gifts were incomplete because technology alone without the civic virtues of justice and reverence would only lead to destruction. Without justice there is inequality and without reverence there is hubris, a quality that leads to downfall and disaster in numerous Greek myths. This myth provides important ethical insights regarding the application of technologies to redesign food production systems.
In this paper we use the story of Prometheus to guide our exploration of the application of technology in industrialized food processing. In this case we define reverence as respect, piety, and humility towards non-humans and our relationships with non-humans. As described in more detail below, ANT offers a more symmetrical approach to studying both humans and non-humans in food systems. Examining food systems through ANT provides a way to understand and redesign systems with more reverence towards non-human actors, such as foodborne pathogens. We define justice in terms of distributive justice: fairness in the allocation of wealth, power, and resources in society (Rawls 1971). We explore this notion of justice in our case studies by engaging with the work of Ulrich Beck and political economy to highlight the negative impacts associated with profit-oriented food production systems, who benefits from these systems, and how responses to problems continue to prioritize the profitability of food companies over the health of consumers. This injustice is further supported by industry-friendly state policies. With these concepts in mind, we present our theoretical approach in more detail and then examine specific case studies. In these cases, we focus on the application of technologies in food processing systems and the use of additional technologies to solve emergent problems, while reflecting on how the absence of reverence and justice continues to shape current problems with foodborne illness.
To explore relationships between food production, technology, and pathogenic bacteria, we draw from actor-network theory (ANT). ANT, as described by Latour (1987) and Callon (1999), has emerged as an important theoretical tool to help dissolve the nature/society dichotomy in the social sciences. By using networks, this approach highlights the long-overlooked connections between humans, other living things, and objects. ANT shows that despite a history of “purification” (i.e., the separation of humans and non-humans) and attempts to ignore non-humans, the world is increasingly made of hybrids: quasi-objects that blur the lines between human/non-human and nature/society (Latour 1991). These quasi-objects represent hybrid creations that are linked to the social world. New quasi-objects such as the ozone hole, global warming, and mad cow disease are both created by society and, in turn, impact it. In many cases these quasi-objects represent the side effects of technological innovation and modernization.
ANT provides a way to explore how technologies and production systems enroll participants and reshape relationships. Some networks are created for specific purposes (e.g., food production) with actors enrolled by network facilitators (e.g., designers of high-efficiency processing facilities). Some of these networks may contain uninvited participants: actors whose presence results in unintended consequences. Using ANT to explore these relationships illustrates how some linkages are both unavoidable and uncontrollable. Callon’s (1999) example of “dissident” scallops, which would not perform as scientists had planned, illustrates how networks can fail to function as intended. Networks can become unstable and produce unexpected consequences.
ANT has been introduced as a way to better understand relationships between humans and non-humans in food systems (Busch and Juska 1997; de Sousa and Busch 1998; Goodman 1999, 2001; Gouveia and Juska 2002). For example, seeds, plants, soil, bacteria, and animals represent important non-human actors in agriculture (Marsden 2000). Busch and Juska (1997) use ANT to help understand globalization and Canadian rapeseed production, illustrating how networks of humans and non-humans are constructed and how power is redistributed within them. ANT has also been used to explore how people and things interact within soybean production in Brazil, highlighting the important role of active nature in food systems (de Sousa and Busch 1998). While groups of humans attempt to organize networks for food production, non-humans continue to play an important role in shaping network relationships. This can result in unexpected consequences or new quasi-objects in the food system, such as bovine spongiform encephalopathy (BSE) (Goodman 1999). Regarding industrial food processing, as the adoption of new tools and designs produces more unanticipated quasi-objects (e.g., new pathogens), networks may need to be redesigned or face instability.
With certain similarities to ANT, Beck’s (1992) theory of the risk society argues that we can no longer ignore the consequences of modernization. As Latour (2003) points out, both ANT and risk society theory illustrate how the impacts of actions come “back with a vengeance.” Beck (1992) discusses “latent side effects” and undesirable consequences linked to the use of technology. He describes a “boomerang effect” where actions come back to haunt us in unforeseen ways. In Risk Society (1992), Beck uses industrial accidents and nuclear technology to illustrate the risks associated with modernization. In these cases, society is the victim as well as the perpetrator. Beck highlights the role of single-minded capitalist production: “in the effort to increase productivity, the associated risks have always been and still are being neglected” (Beck 1992: 60).
As these side effects become increasingly visible, Beck calls for a turn towards “reflexive modernization” (Beck 1992; Beck et al. 1994, 2003). Reflexive modernization involves reshaping systems to address side effects and an acknowledgment that “mastery is impossible” (Latour 2003: 36). It involves acknowledging that we cannot rely on modern tools to solve problems, but must undergo systemic changes in perspectives and institutions (Beck et al. 2003). This means that systems need to be adapted and restructured as problems arise. In terms of networks, reflexive modernization implies that we must make new networks and break down old boundaries. In a reflexive society, the individual is no longer an unchanging subject, but is now an adaptive “quasi-subject” who is “the result as well as the producer of its networks” (Beck et al. 2003: 25). The quasi-subject is no longer in control, but is a reflexive responder to new conditions.
In addition, we draw from political economy to highlight how the quest for profitability above all can result in negative outcomes. Political economy has made substantial contributions to agriculture and food studies (Friedland 1991), including work on commodity chains (Friedland et al. 1981) and on the industrialization and globalization of food systems (Bonanno et al. 1994). This work has highlighted both the social problems and environmental consequences (Adam 1999) of capitalist food production. As capitalist-oriented firms make decisions to maximize profitability and minimize costs, the resulting shortcuts can harm consumers. We also highlight the role of the state in supporting industrial goals, joining industry in a “growth coalition” that continually encourages the expansion of capital accumulation through the “treadmill of production” (Schnaiberg 1980). We contend that prioritizing economic growth above all is not sustainable, as problems will arise that threaten production processes (O’Connor 1994).
Drawing from these theoretical insights, we now examine how high-efficiency technologies have reshaped food processing. Despite attempts to emulate industrial manufacturing, food production still involves complex relationships between humans and non-humans. Here, we draw from ANT to highlight non-human actors (pathogenic bacteria) emerging within industrialized food systems. New technologies and system designs have reshaped relationships between actors, resulting in negative outcomes. The role of non-humans in industrial food systems has been largely overlooked and we argue that this lack of reverence has become problematic. Drawing insight from the work of Beck and political economy, we explore these outcomes and associated responses. While the impacts of industrial systems have become increasingly visible, it remains difficult to find evidence of reflexive responses. Instead we see a continued reliance on technology to solve problems, while industry and the state continue to prioritize profitability at the expense of consumers and others. Illustrating great injustice, industrial actors accumulate wealth while more consumers are subject to illness. To illustrate these trends, we present examples of two commodities associated with foodborne illness: ground beef and packaged salads. Our goal is to highlight relationships; identify outcomes, winners, and losers; and to explore how responses could be more reflexive. It is not our intention to suggest that modern food processing be abandoned, but to problematize current approaches with narrowly focused, profit-oriented goals.
Research methods to inform our case studies include personal interviews, visits to processing facilities, and extensive literature reviews. Regarding the ground beef case study, visits were made to small and large-scale slaughter and processing facilities in Michigan (2007 and 2008) and in Alabama (2008). Interviews were also conducted during this time with facility operators and staff responsible for regulatory compliance, as well as a regional food safety director. In addition, multiple discussions with key informants took place between 2007 and 2011. For the salad greens case study, interviews were conducted between 2006 and 2009 in the Central Coast region of California, which produces a large portion of US salad greens. Respondents included salad processors, food safety auditors, government employees, producers, and representatives from food and agriculture organizations. In each case, data were analyzed qualitatively to identify findings and key themes in responses. For each case study we also conducted extensive literature reviews of peer-reviewed journal articles, newspaper and magazine articles, and government and business reports.
Efficient Food Processing: Industrialization and Foodborne Illness
Food processing in the US experienced significant structural and technological changes following World War II. A shift towards large-scale, low-cost production models did not occur by accident. Food-processing companies made strategic changes that reduced costs and increased production. Within production networks, decisions were made that increased surplus labor value. To increase productivity, new technological tools were applied and networks of humans and non-humans were reorganized around these tools. This led to mechanization, standardization, and an overall industrial re-working of food production systems. These new technologies reduced the need for human labor, lowering labor costs and increasing profits. In addition, value-added foods were created especially to target consumers coping with widespread changes in household dynamics. The rise of women in the workforce and dual-income households created a growing market for time- and energy-saving products (Levenstein 1988; Goodman and Redclift 1991).
As investigated by Schlosser (2002), the rise of industrial-scale food processing can be linked to the Fordist production designs adopted by McDonald’s restaurants. In 1948 the McDonald brothers, who ran a drive-in in San Bernardino, California, reshaped their business design. They fired most of their workers and adopted a high-volume, low-labor, and low-cost system in which they could produce large quantities of cheap burgers. They needed only a few workers at stations in the restaurant, each repeating the same task, analogous to a factory assembly line. By cutting labor costs and increasing production, they were able to sell burgers at half their original price. Restaurant owners from other states traveled to San Bernardino to learn and copy the McDonald’s system. This system represented a new efficient model where an assembly line structure led to increased productivity, massive cuts in labor, and cheap uniform food products (Schlosser 2002).
This model increased profits leading to great success for McDonald’s and was quickly adopted not only by other restaurants but also by food processors. Processors have taken this system further by replacing human laborers throughout the production line with technological devices such as washing, mixing, and chopping machines capable of processing large quantities of food rapidly. Throughout the food system labor has been reduced, mechanization has increased, and productivity has grown exponentially.
However, while the designers of these systems may feel they maintain control over production, they now experience increasing challenges controlling the non-humans that are brought in and shaped by the system. These non-human actors include bacterial pathogens, such as Escherichia coli (E. coli) and Salmonella, which have been linked to large-scale outbreaks of foodborne illness (see Center for Food Safety and Applied Nutrition 2009). Foodborne illness has long plagued society, with small, localized outbreaks linked to pre-industrial food production. However, as we will illustrate below, industrialization has served to greatly amplify the potential impacts of contamination events. While efficient production systems use technology to produce large quantities of food, the unwanted side effects of mass production are becoming more difficult to ignore.
To explore these relationships, impacts, and associated responses we will first examine the ecology of E. coli O157:H7 and then explore two commodities linked to E. coli O157:H7 contamination: ground beef and packaged salads. For each commodity we will explore how technology was applied to aid industrialization, how E. coli O157:H7 emerged as a foodborne pathogen, and responses to outbreaks. ANT serves to highlight the role of E. coli O157:H7 and relationships between E. coli O157:H7, technology, and production designs. The work of Beck and political economists provides a more critical look at how profitability guides the use of technology and the design of production systems and how the state has allowed or encouraged industry growth despite negative impacts to society. We find that current responses to E. coli O157:H7 outbreaks fail to acknowledge (and respect) the ecology of bacteria in food production and continue to perpetuate injustice through prioritizing profitability over consumer well-being.
E. coli O157:H7
To understand the relationships in current outbreaks linked to ground beef and salad greens we must first examine the ecological characteristics of E. coli O157:H7. E. coli O157:H7 is a strain of E. coli bacteria associated with many recent cases of foodborne illness. Unlike the other strains of E. coli found commonly in the human gut, this strain produces deadly Shiga-toxins. When ingested, E. coli O157:H7 can result in bloody diarrhea, hemolytic uremic syndrome (HUS), and kidney failure. While only 3–5% of those with HUS die, survivors often face long-term medical problems (Boyce et al. 1995).
E. coli O157:H7 was first recognized as a pathogen in 1982, after several cases of illness were associated with the consumption of fast food restaurant hamburgers. In 1993, ground beef contaminated with E. coli O157:H7 led to 732 cases of illness in the US, of which 55 developed HUS and 4 died, all of whom were children (Bell et al. 1994). The number of outbreaks of E. coli O157:H7 has continued to rise. There are an estimated 20,000–73,000 cases of illness and between 61 and 250 deaths from E. coli O157:H7 in the US annually (Boyce et al. 1995; Mead et al. 1999). Of the reported cases of illness from E. coli O157:H7, 41% are linked to ground beef and 21% are linked to fresh produce (Rangel et al. 2005). The rise in reported outbreaks has been partly attributed to advances in pathogen identification, but scientists have also cited other factors, including increased consumption of meat and produce and the industrialization of food processing (Beuchat and Ryu 1997; Tauxe 1997).
Cattle remain the primary source of E. coli O157:H7 contamination. While this strain of E. coli is deadly to humans, infected cattle often show no symptoms. According to Greger (2007) the prevalence of E. coli O157:H7 in cattle can be attributed to the intensification of cattle production and the widespread use of antibiotics. Between 1978 and 1992, the number of animals in US livestock operations doubled (Tilman et al. 2002). Livestock production has shifted to industrial systems with high densities of confined animals. In the US, approximately 70% of antimicrobials used are given to livestock for “non-therapeutic purposes” (Mellon 2001). These antimicrobials have been shown to encourage the growth of E. coli O157:H7 (Kohler et al. 2000).
Some research has shown that changes in cattle diets, aimed at promoting faster growth, increase the presence and prevalence of E. coli O157:H7. Cattle fed grain and high-starch diets have higher levels of E. coli O157:H7 than cattle fed traditional high-fiber diets (e.g., hay or grass) (Berg et al. 2004; Diez-Gonzalez et al. 1998; Franz et al. 2005; Gilbert et al. 2008). Studies have shown that cattle fed distillers grain, a by-product of ethanol production, are up to six times more likely to shed E. coli O157:H7 than cattle not fed distillers grain (Dewell et al. 2005; Jacob et al. 2008). Feeding cattle grain or grain by-products to maximize growth has become common practice; therefore manure may now have higher concentrations of E. coli O157:H7. Many speculate that the more acidic gut conditions created by a grain diet have fostered the evolution of the acid-resistant E. coli O157:H7 strain.
Evidence continues to suggest that industrial livestock systems play an important role in the emergence of E. coli O157:H7. As actors, E. coli bacteria strive to survive and reproduce in changing environments. This particular strain of E. coli has evolved to produce a Shiga-toxin that can be deadly to humans. Scientists claim that this toxin provides a competitive advantage for the bacteria in the intestines of modern industrialized cows (Sheng et al. 2006; Steinberg and Levin 2007). While it remains challenging to understand the specific evolution of E. coli O157:H7, it has emerged as an increasing problem in both the ground beef and salad greens production systems.
The Industrialization of Ground Beef
There are two key stages in ground beef production: slaughter, where “harvesting” and “dressing” (i.e., primary disassembly) take place, and post-slaughter processing, where value-added activities take place. Industrialization of these processes appeared relatively early in meatpacking. As described by Upton Sinclair (1906), early slaughterhouses in Chicago involved horrific working conditions, low wages, and poor sanitation. At the turn of the 20th century, labor was already organized in assembly line fashion, but there was relatively little mechanization. The highest skilled employees worked on the “kill floor” where approximately 70 men, whose labor was highly segmented, would harvest and disassemble approximately 60 cattle, by hand, in an hour (Horowitz 2006: 37). These “primal” cuts of beef were sent by refrigerated rail to other metropolitan cities such as Detroit, Boston, and New York, to be processed by neighborhood butchers according to their consumers’ requests (Giedion 1975; Horowitz 2006).
Between 1899 and 1954, output per man-hour increased 0.5% each year (Skaggs 1986: 189), but the process remained essentially the same. It was not until the post-WWII era that many new tools (i.e., stunners, mechanical knives, hide skinners, power saws, electronic slicing and weighing), as well as new architectural designs, became available and sparked a rapid increase in efficiency (Skaggs 1986: 189). New independent firms built “state-of-the-art” facilities in the Western Great Plains states, which increased labor output by 15%. These new companies entered the market in non-unionized, “right-to-work” states, where they were able to deskill workers and pay lower wages (Gouveia and Juska 2002). New plants were also closer to major livestock producing regions (Skaggs 1986), which reduced the costs of transportation in terms of both fuel and animal loss (Fitzgerald 2010). Older and smaller facilities could no longer compete against the new entrants and left the market (Ollinger et al. 2005).
In the decades that followed, tools such as pneumatic guns for stunning the animals, a continuous rail system, and mechanical washers came into use. These technologies greatly increased line speeds. By the early 1970s, modern plants were able to slaughter 179 animals an hour (Fitzgerald 2010) and up to 3,700 head in a day (Skaggs 1986). Concentration in beef slaughter tripled between 1977 and 1992, much of it taking place among newer entrants. Today, the top four beef slaughterhouses in the US process more than 94,000 head a day and they control nearly 84% of the market (Hendrickson and Heffernan 2007).
Industrialization can also be seen in value-added meat processing such as ground beef and pre-made hamburger patties, which now represents the largest segment of the industry (Romans et al. 2001). During the 1980s, a general decrease in beef consumption led to a reduction in wages and selling prices, a doubling of plant size, and the adoption of additional technologies, which led to a 50–80% increase in line speed (Fitzgerald 2010) and a 45% increase in output per worker (Ollinger et al. 2005). At the same time, demand from fast food restaurants led to a 154% increase in the manufacturing of frozen ground beef patties (Schlosser 2002). Over the following years demand for beef patties continued to increase. By the late 2000s, it was estimated that 42% of the beef consumed in the US was in the form of ground beef (Giamalva et al. 2008: 3–6), representing 60% of in-home consumption and 63% of consumption in food service establishments.
To meet increasing demand for ground beef and frozen patties, large-scale processors grind up to 12,000 pounds of beef an hour and operate for up to 20 h a day (Armstrong et al. 1996). Ground beef consists of commingled meat and trim from many carcasses. In an epidemiological study, Armstrong et al. (1996) investigated a large-scale processor that produced between 2 and 30 tons of ground beef per day and sold it to grocery stores packaged in 80-pound tubes. Investigators found that the raw ingredients were sourced from 11 different companies located in two different states, and these companies sourced their ingredients from others. At the grocer, the meat was re-ground with older beef from the shelves, as well as the store’s own leftover trimmings (Armstrong et al. 1996). This process of commingling and regrinding “raises the likelihood of pathogen spread through cross-contamination” and decreases traceability (MacDonald et al. 1996: 783).
Plants that make frozen hamburger patties utilize a continuous production system that “never stops” (Romans et al. 2001: 665). Larger patty plants run at a production capacity of 10,000 pounds per hour. They regrind incoming ground beef and trim, then form and freeze the patties, using various technological devices: grinder plates and knives, bone collection devices, bins and conveyor belts, serrated blades, and cryogenic freezing tunnels or spirulators. Once frozen, the patties are stacked, boxed, and held in a freezer until shipment. According to Romans et al. (2001: 669), “extreme caution must be exercised so that microbial recontamination and growth do not occur somewhere in the process.” In summary, technological tools were applied to maximize economic efficiency, resulting in large-scale concentrated facilities that mix meat from many sources. By the time a burger lands on a barbecue, it is impossible to know whether cross-contamination occurred, where a contaminated product became contaminated, or even how many or which animals contributed to the consumer’s meal (Armstrong et al. 1996).
E. coli O157:H7 Outbreaks Linked to Ground Beef
Between 1998 and 2007 there were 9,824 illnesses linked to beef consumption (DeWaal et al. 2009). Beef has the potential to be contaminated by a range of harmful bacteria, but E. coli O157:H7 raises the most concern. Between 2008 and 2009 there were 51 beef recalls; 35 of these involved ground beef or trimmings used in ground beef, and 28 were for suspected E. coli O157:H7 contamination (FSIS 2010). The 1993 Jack in the Box case is perhaps the best-known outbreak. After 732 people in the Pacific Northwest were sickened, investigators found that the ground beef patties they consumed were contaminated with E. coli O157:H7. The company that supplied the meat had 111 suppliers, two of which were from outside the country (Juska et al. 2003: 13), and at the restaurant the hamburgers were not cooked at a temperature high enough to kill the bacteria. Close to 200 people were hospitalized and four children died. Other noteworthy cases, due to the volume of meat implicated, were associated with Hudson Foods Company of Arkansas and Topps Meat Company of New Jersey.
In 1997, Hudson, a supplier of hamburger patties for Burger King, Boston Market, and Wal-Mart, recalled 20,000 pounds of beef processed in its Nebraska plant. After the Colorado Public Health Department linked Hudson’s beef to five cases of E. coli O157:H7 poisoning (e.g., FSIS 1997), the recall was expanded to 25 million pounds, the equivalent of 100 million quarter-pound hamburgers (e.g., USDA 1997). While further investigation confirmed only 16 cases of illness, it revealed safety lapses at the plant (e.g., lack of process control, poor record-keeping, a weak testing program) and indicated that the meat may have been contaminated during slaughter, before it reached Hudson’s plant.
Similarly, Topps, a key supplier of frozen patties to Wal-Mart, recalled nearly 332,000 pounds of beef and beef products for “inadequate process controls” in 2007 (CDC 2007; Reiser 2007). Four days later, the recall was expanded to 21.7 million pounds, or nearly 87 million quarter-pound hamburgers (Reiser 2007). In this case, contaminated trim from an outside source had been blended with Topps meat. DNA analysis linked the contamination to a Canadian slaughterhouse. In total there were 45 confirmed illnesses, and one death, in five Canadian provinces, and 40 confirmed illnesses in eight US states (Reiser 2007).
Cross-contact during hide removal is thought to be the primary route by which E. coli reaches beef, which means that contamination is likely to have occurred before the meat reaches the patty plant (Brichta-Harhay et al. 2008). Line speed is a contributing factor: as harvesting and disassembly speeds increase, so does the likelihood of errors from fatigue (Hennessy 2005; Hughlett 2010). At large slaughter facilities, for instance, inspectors may examine 400 animals an hour (Machado 2003), which is likely to decrease their effectiveness (Hennessy 2005; MacDonald et al. 1996: 783). One inspector stated that “people are tired and I think the process puts the whole… (system) at risk” (Hauter 2008: 45).
In the cases of Hudson and Topps, contamination was associated with in-plant failures and with failures that occurred at slaughter, the latter of which is a function of industry concentration and of both the volume and speed of production. As industry concentration grows, there are fewer sources for ingredients. In addition, smaller grinders are pressured to mix untested meat into their existing products, as large suppliers have been known to refuse to sell to plants that test their incoming products for pathogens (Food and Water Watch 2009: 43). As a result, previously safe food may be contaminated during the mixing process.
Large plants run large lots of meat and trim through the same machinery. Should that machinery become contaminated, a large volume of ground beef may be contaminated as well (Armstrong et al. 1996: 45). In fact, FSIS testing found higher pathogen loads in ground beef from large-scale plants than in ground beef from smaller plants (Food and Water Watch 2009: 37). Since an increasing number of hamburger patties now comes from a decreasing number of processors, when contamination occurs it puts a larger number of consumers across a wider geographic area at risk.
As new technologies and industrial designs have been applied to maximize output and minimize costs, foodborne illness has emerged as an inherent side effect of these economically efficient ground beef production systems. Shifting to mechanized, concentrated, and large-scale processing facilities has increased the risks of introducing and spreading contamination. At the same time, decisions continue to prioritize economic efficiency, resulting in lapses in food safety protection programs, as seen in the examples of Hudson Foods and Topps Meat.
The Industrialization of Salad
As illustrated above, the production of large quantities of cheap, uniform ground beef has led to serious problems. With the rise of the industrialized bagged salad, similar problems are now linked to salad greens. A consumer examining a recent fast-food menu will likely see packaged salads. Attention to health and diet, combined with increasingly busy lifestyles, has largely solidified the popularity of the packaged salad. Following the same patterns as other food production systems, these salads are made at large scales with minimal labor. Single-operator machines harvest the greens, which are then sent to centralized processing facilities where they are mechanically washed, chopped, and packaged. For food service companies, this dramatically cuts down on labor, saving time and money.
Processed salads and salad greens are also popular among consumers eating at home. In many cases, buying heads of lettuce has been replaced with buying “ready to eat” greens. These products are advertised as timesaving and healthy foods for the modern busy family. Sales of bagged salads increased approximately 560% between 1993 and 1999 (ERS 2001). Between 2005 and 2007, sales grew from $2.4 billion to $3.9 billion and were estimated to grow another 204% between 2008 and 2012 (Mintel 2008). Bagged salads have become one of the top-selling items in US grocery stores (Bates 2002), penetrating close to 75% of households (Food and Drink Weekly 2003).
Processed greens bring in $2.5 billion in annual US sales (Consumer Reports 2006) and are highly lucrative for companies like Dole and Fresh Express, which control 72% of the packaged salad market (Mintel 2008). Dole claims to be the leading salad brand, boasting that on average 2.1 million bags of Dole salad are consumed in the US each day (Dole 2010). Bagged salads originated in the San Francisco area in 1986 and were initially produced at a small scale. Upon realizing the potential value in processed salads, Fresh Express created the first large-scale production system in 1989. Dole and other large-scale processors quickly followed. This value-added product allowed processors to pay farmers relatively little for salad greens while adding considerable value through processing and packaging. Consumer Reports (2006) indicates consumers pay up to $11.17 per pound for “spring mix,” while some California farmers claim processors pay them 25 cents per pound.
Before exploring the salad processing facility, we must first examine production methods prior to processing. At the farm level, production has been maximized and labor cut (Friedland et al. 1981). Growers use large land areas to grow single crops and harvest them using mechanical mowers that cut and vacuum up the greens. Greens are planted in rows whose widths match the dimensions of the mower. Although in the past many workers harvested greens by hand, now only one worker is needed to mow the greens while another may walk in front of the mower to inspect for foreign objects (such as animal feces). Harvesting is most commonly done at night, when temperatures are cooler, to maximize freshness and increase shelf life. Cut greens are put into large bins and then trucked to processing plants.
To increase efficiency, industrial processors designed production systems that maximize volume while minimizing labor. A central part of this cost-reduction design was to operate only a few centralized processing facilities that use mechanical processing technologies. Companies harvest and buy greens from many dispersed regions and ship them to large facilities where the greens are mixed together. For example, Dole operates four vegetable processing plants in the US: one each in Arizona, California, Ohio, and North Carolina. All lettuce and greens grown for Dole throughout the US are processed at one of these facilities. In California, an interview respondent indicated that on a given day greens from four different states might be mixed and processed together in one plant.
Within salad processing facilities, salad materials move at high speeds along conveyor belts and/or through large tubes. Facilities are kept at cold temperatures to maximize freshness; however, few workers must endure these temperatures, as the process is highly mechanized, producing thousands of bags of salad per hour. Greens entering the facility are mixed together in a series of one or more flumes, which are vats of water. Water in the flumes is usually chlorinated to kill bacteria on the greens or on other foreign matter collected by the mowers. Greens are also sorted on conveyor belts to remove foreign material and debris. The greens are then cut to size, depending on the product, and washed multiple times (e.g., “triple washed”). The final stages of processing involve spinning the greens in industrial-sized salad spinners, known as centrifuge dryers, and then packaging them for distribution.
E. coli O157:H7 Outbreaks Linked to Salad
While salad processors use centralized, large-scale, and highly mechanized systems to minimize production costs, external costs have started to add up. These include widespread outbreaks of foodborne illness and massive recalls associated with bagged salads. The largest outbreak, which occurred in 2006, was traced to E. coli O157:H7. It was linked to bagged spinach, sold under a Dole label, that was produced in California (CDC 2010). This case resulted in close to 200 illnesses and at least 3 deaths across 26 US states and Canada. Also in 2006, 71 individuals in five states became ill from E. coli O157:H7 traced to processed lettuce from Taco Bell (CDC 2010). Between May and August of 2010, Fresh Express announced three recalls of salad products due to positive identifications of E. coli O157:H7, Salmonella, and Listeria. The August 2010 Fresh Express recall involved 22 different product labels and over 30,000 bags distributed in 26 states (FDA 2010).
These outbreaks have led some to examine more closely the role of large-scale production systems. The Community Alliance with Family Farmers reviewed outbreaks traced to California and found that, of those linked to E. coli O157:H7 in salad greens since 1995, processed and bagged products were responsible for over 98% of associated illnesses (CAFF 2008). Scientists from the Centers for Disease Control and Prevention have cited changes in processing, including the concentration of production systems, as a factor contributing to the increasing prevalence of foodborne disease in produce (Altekruse et al. 1997). Following the spinach outbreak, one newspaper story explained how Dole and Fresh Express truck their greens to “centralized processing plants where tainted and untainted leaves can be mixed during chopping, washing, and bagging,” thus increasing the likelihood of larger and more widespread outbreaks (Engel and Lin 2007). Opinion pieces by food critics called for a reexamination of both centralized processing systems and food safety laws (Nestle 2006; Pollan 2006; Schlosser 2006).
Interviews with respondents with inside knowledge of the salad industry revealed agreement that certain production methods make processed salads riskier than heads or bunches of greens. Mechanical harvesting increases the chances that other material, such as animal feces, will be brought into a processing plant, and it reduces the chances that contamination will be identified by workers in the field. Within concentrated processing facilities, giant flumes that combine greens from many farms can spread contamination. Scientists agreed that the legal levels of chlorine used in flumes are not high enough to kill all the bacteria that may be present. Within a flume, a fecal clump can take up to an hour to wash through; meanwhile, up to 4,800 bags of salad may be contaminated. Washing greens, even multiple times, cannot remove pathogens that reside in the grooves of plant leaves. Lastly, despite the bag’s popularity as a convenient form of packaging, food safety specialists admitted that the bag itself can serve as an incubator that cultures bacteria, especially if exposed to warm temperatures during transportation or storage.
Throughout the industrial packaged salad system, technology has been applied to reduce labor costs and increase productivity. Mechanization has allowed for the mass production of processed salads at low costs. However, the external costs have become increasingly apparent as outbreaks and recalls associated with processed salads continue to make headlines. Labor saving technologies in harvesting and processing appear to have serious implications for food safety. The shedding of human labor while increasing production volumes creates more spaces for pathogens to be introduced, spread, and cultured. While industrial models may be efficient in terms of profitability, they may be inherently flawed in their ability to control unwanted bacteria. However, thus far, processors have been largely successful in diverting attention away from inherent problems with their economically efficient production designs.
Responses to Foodborne Illness
As an increasing number of outbreaks made national headlines, companies and government agencies were forced to respond to food safety problems. The Hazard Analysis and Critical Control Points (HACCP) regulation represented the official response to increasing outbreaks linked to meat (FSIS 1996). This rule includes four mandates: that red meat processors establish and implement their own sanitation standard operating plans (SSOPs); that they establish a system of regular microbial testing for E. coli and Salmonella; that Salmonella standards be used to measure SSOP effectiveness; and that they not only identify all critical points within their slaughter, processing, and/or packing systems, but also develop a monitoring plan justified with scientific data (Schuller 1998). While HACCP was opposed by certain segments of the meat industry, the state ultimately created HACCP on industry’s terms. In meetings taking place behind closed doors, powerful industrial actors had great capacity to influence state policy-makers (Juska et al. 2000). This challenges simplistic notions of the application of science to create regulation, illustrating how rule-making benefits certain groups more than others (Worosz et al. 2008a).
HACCP represents an audit-based system (Dunn 2007). Food companies conduct much of their own self-monitoring, while government responsibilities largely entail paperwork (Schlosser 2002; Nestle 2003). Recordkeeping is a large component of the work of the Food Safety and Inspection Service (FSIS): “FSIS inspectors report that they spend 5 times as much time reviewing company paper under HACCP as they did under the previous inspection system” (Food and Water Watch 2009: 41). Critics point out that budget allocations lack sufficient funds for adequate site monitoring.
HACCP remains largely insensitive to scale. Microbial testing of beef is not volume-based; instead, it is theoretically based on the complexity of plant activities, and the number of HACCP plans is used as a proxy for risk. As a result, FSIS has focused more on regulatory compliance and enforcement in smaller plants, which produce only 1% of US ground beef, than in larger plants (despite data revealing more positive E. coli samples from larger plants). After a series of recalls, FSIS initiated a reassessment of HACCP in 2007. However, the reassessment ignored issues with large-scale production, focusing instead on monitoring, testing, and more technological tools (FSIS 2007). FSIS has only recently started to increase the rate of sampling relative to volume (FSIS 2009). For example, plants that process 250,000 pounds of ground beef a day are sampled up to 4 times a month, whereas smaller plants that produce 1,000 pounds a day or less are sampled no more than once a month. However, this new sampling program will not yield statistically significant results, as testing frequency still fails to reflect the scale of production. Although HACCP continues to be altered in response to outbreaks, changes in rules and procedures represent small and often insignificant modifications that deflect attention away from underlying systemic issues and protect productionist goals.
While most large meatpackers already had HACCP-like programs in place, facilities have adopted a number of new technological fixes, often referred to as “interventions.” Rather than preventative actions that might necessitate a decrease in volume or a slower processing line, interventions include carcass cleaning with steam-pasteurization and acid washes, as well as a range of decontamination activities that take place after hide removal, including the use of acids and ammonia (FSIS 2007). Some of these interventions, however, have been found to have the opposite effect. For example, when rinses designed to kill pathogens are applied over large areas of a carcass, they can spread small numbers of pathogenic bacteria over those areas and render them effectively undetectable, though still present in quantities sufficient to harm public health (Hauter 2008; Juska et al. 2003).
Some leaders in the meat industry look towards irradiation as a “silver bullet” solution to outbreaks. There are three forms of ionizing radiation; while each has its advantages and disadvantages, all produce free radicals that damage DNA and thus kill microorganisms. In 1999, the US approved irradiation of red meat, and starting in 2000 commercially irradiated ground beef and beef patties became available in the market. These products remain available in a number of grocery stores and through mail order (Farkas 2006), but the practice presents problems with quality and consumer acceptability. Furthermore, irradiation does not address fundamental problems associated with contamination, and critics argue that it could be toxic and will not eliminate outbreaks (Harris 2008).
Following the 2006 E. coli O157:H7 outbreak linked to spinach, the leafy greens industry in California mobilized to address contamination. Prior to the outbreak, there were no formal standardized food safety rules for produce, as government agencies do not specifically regulate produce as they do meat, dairy, and eggs. While attempts to create new state legislation for food safety were thwarted by agribusiness supporters, the California governor supported the industry’s proposal to develop its own food safety rules. These rules were produced under the California Leafy Greens Marketing Agreement (LGMA), which remains voluntary but now covers 99% of the volume of leafy greens produced in the state (LGMA 2010). This method of governance satisfied the call for new food safety standards while allowing the produce industry to create its own rules. Interviews revealed that industry “experts,” including executives from large processing firms, created the LGMA rules behind closed doors without input from other stakeholders. Similar marketing agreements are now being used for leafy greens in other states, and the USDA is reviewing a proposal for a national agreement.
New standards under the California marketing agreement focus on several specific aspects of salad production at the farm level. Rules focus on worker hygiene, specifying requirements for personal attire, cleanliness, and what is and is not allowed on farms. Rules also focus on the role of non-humans, primarily wildlife and livestock. Although evidence indicates that wildlife do not constitute a serious threat to food safety (Beretti and Stuart 2008), the LGMA designates certain animals as “animals of significant risk” and farmers must keep these animals away from farms using a variety of eradication measures (Stuart 2008, 2009). Additional measures focus on cleaning farm equipment and monitoring bathroom use and hand washing. These new standards focus on the farm level and do not address the designs of processing facilities or the role of processing in cross-contamination.
Many processing facilities have voluntarily expanded their own food safety programs and, as seen in the meat industry, have adopted new technological fixes. Most processors have used HACCP as a guide in developing their food safety programs, and many have gone beyond HACCP in developing new technical identification and trace-back mechanisms. Processors have explored and applied a variety of technological fixes in salad processing, including new washing agents, such as ozone, and the use of a chlorophyll sensor to identify non-plant materials. Fresh Express recently announced the use of “FreshRinse,” a new acid rinse, claiming that it is much more effective than chlorine (Neuman 2010). The FDA has also approved irradiation for iceberg lettuce and spinach (Harris 2008). However, barriers to adopting irradiation include the cost of the technology, impacts on taste and texture, and slower production speeds. Industrial food safety specialists continue to explore ways to enhance irradiation and other new technologies to address food safety issues in salad processing. These technological approaches overlook how concentrated and mechanized salad production systems continue to introduce and spread contamination.
Reflexive Responses to Foodborne Illness?
Both case studies illustrate a trend to preserve economically efficient production models while adopting technological fixes to address food safety problems. In the meat industry, additional technologies may succeed in reducing the amount of pathogenic bacteria that survives the processing system. However, this reduction may not be enough: as long as production systems mix large amounts of meat from many sources, small amounts of contamination will be amplified and result in large-scale impacts (Juska et al. 2003). While the HACCP program does increase microbial testing, it does little to address systemic problems. Likely due to the powerful influence of agribusiness on state policymakers (Nestle 2003), government responses avoid any significant changes to the meat production system (Dunn 2007; Worosz et al. 2008a).
Similar to responses in the meat industry, the application of new rules and technological fixes in the packaged salad industry is unlikely to be effective. While farm management standards may reduce the chances that produce will become contaminated, processing systems will continue to facilitate cross-contamination, amplifying any small amounts of bacteria. Technological fixes in salad processing may improve the identification and elimination of harmful bacteria, but complete elimination remains elusive. Mechanized harvesting and the mixing of greens continue to introduce and spread pathogens. Government has allowed the industry to create its own food safety standards, and little attention has been given to significant changes to the high-volume, low-cost processing designs that foster contamination.
For both ground beef and packaged salads, processors have applied technological tools to redesign systems for economic efficiency and continue to rely on technological fixes to address emerging problems. Efforts to address foodborne illness remain focused on “fixing nature” rather than “fixing the system” (Juska et al. 2000, 2003). Adding technological fixes while maintaining current production designs is unlikely to resolve problems. According to Scott (2011), in philosophical terms the application of technological fixes signals a continued faith in technological progress while masking larger social, political, and moral issues. In practical terms, technological fixes remain unlikely to be effective and can result in new and even greater problems in the future (Scott 2011). Problems with foodborne illness have emerged from a single-minded approach to system design, and technological fixes do not address systemic issues. Food-processing companies continue to protect economically efficient production models and deny responsibility for outbreaks (Nestle 2003; Stuart and Worosz, forthcoming). In fact, they tend to blame consumers for poor food handling practices (cf., Jacob and Powell 2009). Current efforts to address foodborne illness do not include systemic changes to reshape the application of technological tools in ways that reduce contamination. Therefore, these actions do not constitute reflexive responses.
A reflexive response would involve a transformation that results in new structures, new priorities, and new ways of thinking (Beck et al. 2003). Beck et al. (2003) illustrate that a reflexive response involves a realization that technological fixes will not solve problems and that systems need to be re-conceptualized and re-structured to adapt as problems arise. Overall, a reflexive response would internalize the external harms or costs (the externalities, in economic terms). Responses from the ground beef and packaged salad industries have not shown signs of reflexivity in these terms. Instead, they have shown a continuing reliance on modernist views of adaptation through technology, ignoring signs that this approach may not be effective. In addition, responses may exacerbate current problems. For example, HACCP and marketing agreement requirements create financial hardship for small producers and may result in further industry consolidation and concentration (Worosz et al. 2008b).
Findings from these two case studies support the conclusion that it remains difficult to find evidence that reflexive modernization is occurring in the industrial food system. While, in ANT terms, food production networks have become unstable and side effects have made life more difficult for food processors, we see no signs of a reflexive response. Consumers may be increasingly skeptical of industrial production; however, this level of consumer reflexivity (Chen 2008) has not transferred to food processors. Instead, we see continued reliance on technological fixes and a continued prioritization of economic efficiency over consumer well-being. While new quasi-objects have appeared (e.g., pathogens), we see no signs of new quasi-subjects (e.g., adaptive and reflexive leaders in the food industry). Although we do not see reflexive modernization happening, we agree that the concept remains a powerful narrative for bringing attention to problems and examining what reflexive responses might entail (Latour 2003).
Conclusion: A Call for Justice and Reverence in Food Production
Exploring reflexive responses to foodborne illness brings us back to the Greek myth of Prometheus. Responses relying on quick technological fixes illustrate a lack of respect for non-humans, overlooking the significant role of bacteria in food systems, and perpetuate injustice by prioritizing profitability over consumer well-being. Technological tools and system designs have been applied with the narrow goal of economic efficiency, ignoring how decisions may result in ecological changes and harm to consumers. Systems that maximize production and minimize costs while ignoring factors that amplify contamination perpetuate this injustice. Drawing from the myth of Prometheus, we argue that food processing companies have created production systems in the absence of reverence and justice. This has resulted in hubris, injustice, and negative impacts (e.g., foodborne illness). While responses to outbreaks avoid systemic causes, the story of Prometheus can help us imagine how greater reverence and justice might guide more reflexive responses to foodborne illness.
Regarding reverence, the complex relationships between humans and non-humans within food systems have been dangerously overlooked. Here, we have used ANT to highlight the role of non-humans and their relationships with technology and production systems. From our findings, we agree that through centralization and homogenization we are imposing unnatural patterns onto non-human actors (Waltner-Toews 1991). For example, scientific evidence increasingly supports the belief that E. coli O157:H7 evolved as a result of changes in industrial livestock production including high-density confinement, the widespread use of antibiotics, and long-term grain feeding (Diez-Gonzalez et al. 1998; Berg et al. 2004; Franz et al. 2005; Gilbert et al. 2008; Greger 2007; Kohler et al. 2000). A lack of attention towards the impacts of these changes may have fostered the emergence of the highly virulent pathogen now contaminating beef and produce. Food companies continue to operate while attempting to ignore important ecological relationships. Showing great signs of hubris, processors continue to overlook or underestimate how industrial production systems reshape relationships with bacteria resulting in large-scale outbreaks. As stated by Waltner-Toews (1991: 53): “We are not free of the ecological fabric in which we evolved. The more we crowd ourselves and our farm animals on this planet, the more we will continue to see, reflected in trends in food-borne illness… We may have lengthened the chain, but the chain is still there…”
A more reflexive approach to food production would entail acknowledgement of and reverence for the ecological complexity of food systems, and the adoption of a more holistic and adaptive view. Designers of food systems would need to be mindful of the changes that new technologies and production models might bring to relationships with non-humans (e.g., bacteria and livestock). According to Beck (1992) and Beck et al. (2003), reflexive modernization involves new ways of thinking and seeing. In this case, a holistic view would allow for greater acknowledgement of the non-human actors and relationships that ANT highlights. As discussed by Waltner-Toews (1991), we need to stop looking at the pieces and start looking at the whole: how social and ecological systems intertwine to create foodborne illness. Increasing awareness of and respect for ecological processes would foster more adaptive and reflexive responses. In this case, a reverent response would entail addressing the industrial livestock practices that have fostered the emergence of E. coli O157:H7, as well as the ways mechanized, high-volume food processing systems introduce, spread, and culture pathogens.
Regarding justice, industrial players continue to benefit while others, including consumers, farmworkers, and small producers, face negative impacts. The food industry continues to prioritize capital accumulation, and the state continues to support (or at least allow) this goal to drive policy despite harm to citizens. Players in the food industry maintain great influence over government (Casey 1998; Fortin 2003). The creation of new food safety standards for both ground beef and packaged salads involved powerful industrial actors promoting rules that rely on technological fixes and preserve production goals. We need to ask ourselves whether current food safety strategies are democratic and knowledge-based, or autocratic and driven by an unwavering loyalty to productionist goals and technological solutions (Waltner-Toews 1991). In both cases, rule making was far from transparent and included few participants with narrow goals. Industrial players continue to influence governing bodies (Worosz et al. 2008b) and hinder efforts to substantially overhaul food safety standards. Financial ties to agribusiness and food suppliers have weakened justice in food governance (Nestle 2003), and responses to health issues continue to be influenced by private interests (Barling 2007). As highlighted in this paper, whether government agencies or new neoliberal governing bodies create the rules, standards to address foodborne illness have been shaped by large-scale, capital-intensive industries and focus on their narrow interests. Therefore, system designs continue to reflect processors’ decisions to prioritize profitability over the well-being of others.
As illustrated by Beck et al. (1992 and 2003), a reflexive response entails restructuring institutions in transformative ways. To respond reflexively to the impacts of industrial food production, current injustices need to be addressed through our food governance institutions. We need to address inequality in our political system and foster rule making that takes a greater range of stakeholders, information, and values into account. The power of industry over state governing bodies needs to be exposed and addressed so that the narrow goal of industry profitability no longer dictates policy. In addition, rule making needs to be more transparent, with greater diversity in participation. Regarding food safety, shifting priorities and taking consumer protection more seriously might entail transitioning to a more decentralized system that avoids large-scale cross-contamination (DeLind and Howard 2008). Current food safety rules still prioritize economic efficiency; other interests and values need to guide decision-making.
While these prescriptions for increasing reflexivity in food production may be followed in certain rare cases, large-scale changes are needed to truly foster reflexivity. Bos and Grin (2008) illustrate the barriers to "doing reflexive modernization" within current food systems: small-scale producers may change their actions, but they remain confined by the larger institutional structures in which they are embedded. While we currently see an increasing number of small-scale alternatives to industrial food systems, existing food production regimes continue to hinder large-scale change. Bos and Grin (2008) suggest that small counter-movements will not result in reflexive change without support from larger institutions. Greater reverence and justice in food systems are therefore needed at the institutional level. Small-scale alternatives continue to provide more options for consumers and draw attention to the problems of industrialized food production, but reflexive social movements and increased activism may be the best means of pressuring elected officials and government agencies to be more transparent and to reprioritize the goals shaping food system governance.
This discussion takes on greater importance because the US Congress recently passed new food safety legislation. After numerous bills were debated (Worosz 2009), the Food Safety Modernization Act (FSMA) was signed into law in 2011. The FSMA focuses on coordination in food safety governance and calls for increased monitoring and enforcement, as well as the continued application of science and technology to address problems. Because it fails to address current high-efficiency systems, cross-contamination, and the weaknesses of technological fixes, the FSMA does not call for reflexive change. In addition, while regulations have yet to be drafted, they will likely include rules that impose financial hardship on small producers, thereby furthering industry consolidation and concentration. Government responses continue to support economically efficient production while failing to address systemic flaws. This illustrates an ongoing commitment to technological solutions that ignores how narrow productionist goals continue to cause harm.
Exposing the negative impacts associated with industrial food production systems reveals that the notion of efficiency remains an illusion (Weis 2010). As production networks become increasingly unstable and the myth of efficient food production is further exposed, it remains uncertain whether we will eventually witness a reflexive response to current problems. We contend that these problems have emerged from applying new technological tools and redesigning processing systems with a narrow focus on profitability while overlooking ecological relationships. Current systems need to be reevaluated: we may not need to change the specific technological tools used in food processing, merely how they are applied. For now, we continue to witness hubris and injustice in food processing and a continued reliance on technological fixes to industrial problems. The continued application of technology in this manner may prove even more problematic in the future. As Waltner-Toews (1991: 55) states: "An increase in infectious food-borne diseases may be Nature's rallying cry for a complete restructuring of our food-producing system."
The authors would like to acknowledge the Alabama and Michigan Agricultural Experiment Stations. Funding for research in California was made possible through an award from the Estuarine Reserves Division, Office of Ocean and Coastal Resource Management, National Ocean Service, United States National Oceanic and Atmospheric Administration. Additional funding for research in California was provided through a National Science Foundation Doctoral Dissertation Improvement Grant. Lastly, we greatly appreciate comments from our reviewers.