Journal of Commercial Biotechnology, Volume 16, Issue 4, pp 281–283

Federal bureaucrats slip on oil spill

  • Henry I Miller
Commentary

The record-breaking oil spill in the Gulf of Mexico, which began in April following an explosion on BP's Deepwater Horizon oil rig, fouled some of the most beautiful beaches and valuable fishing waters anywhere. There appears to be plenty of blame to go around, beginning with the oil company, formerly British Petroleum, which operated the rig whose well hemorrhaged vast amounts of oil. (The company used to run an advertising campaign called ‘Beyond Petroleum’ that touted its diverse approaches to producing energy; if it is held liable for the cleanup and economic losses, BP might come to mean ‘Below Profitability’. Within six weeks of the explosion, the markets had shaved off half of the company's capitalization.)

Whoever is responsible for causing the catastrophe, one US regulatory agency certainly deserves plenty of blame for preventing the development of a potentially effective high-tech method to mitigate the effects of oil spills. It is a technology that would have been particularly well suited to degrading the oil that washed onto the beaches along the Gulf Coast from Texas to Florida.

During the 1980s, microorganisms genetically engineered to feed on spilled oil were developed in laboratories, but Draconian federal regulations discouraged their testing and commercialization and ensured that the techniques available to respond to such disasters remain low-tech and marginally effective. They include deploying booms to contain the oil, spraying chemicals to disperse it, burning it and spreading absorbent mats.

At the time of the catastrophic 1989 Exxon Valdez spill in Alaska, there were great expectations for modern biotechnology applied to ‘bioremediation’, the biological cleanup of toxic wastes, including oil. William Reilly, who at that time headed the US Environmental Protection Agency (EPA), later recalled, ‘When I saw the full scale of the disaster in Prince William Sound in Alaska … my first thought was: Where are the exotic new technologies, the products of genetic engineering, that can help us clean this up?’

Reilly should have known: Innovation had been stymied by his own agency's hostile policies toward the most sophisticated new genetic engineering techniques. The regulations ensured that biotech researchers in several industrial sectors, including bioremediation, would continue to be intimidated and inhibited by regulatory barriers. Those policies remain in place today, and the EPA's anti-technology zealots show no signs of changing them.

The best and surest way to prevent such accidents is, of course, to obtain energy from sources other than fuel oil. Biofuels have been widely touted as a possibility, but solutions to technical difficulties, such as breaking down plant materials so that they can be fermented into ethanol, have thus far eluded scientists. Ironically, EPA regulation has also inhibited the development of the genetically engineered bacteria and fungi that are needed.

Thus, EPA's policies have for decades stymied safe energy production in two ways – by preventing innovation applied to industrial processes that could produce biofuels and by obstructing the development and commercialization of oil-eating organisms that could be used after a spill.

Characteristically, the EPA didn’t let science get in the way of policy. Its regulation focuses on any ‘new’ organism (strangely and unscientifically defined as one that contains combinations of DNA from unrelated sources) that might, for example, literally eat up oil spills. For the EPA, then and now, ‘newness’ is synonymous with risk, and because genetic engineering techniques can easily be used to create novel gene combinations with DNA from disparate sources, those techniques ‘have the greatest potential to pose risks to people or the environment’, according to the agency press release that accompanied the rule.

But science says otherwise. The genetic technique employed to construct new strains is irrelevant to risk, as is the origin of a snippet of DNA that may be moved from one organism to another: what matters is its function. Scientific principles and common sense dictate which questions are central to risk analysis for any new organism. How hazardous is the organism you started with? Is it a harmless, ubiquitous organism found in garden soil, or one that causes illness in humans or animals? Does the genetic change merely make the organism able to metabolize and degrade oil more efficiently, or does it have other effects, such as making it hardier and more resistant to antibiotics and therefore difficult to control?

The EPA ignored the widely held scientific consensus that modern genetic engineering technology is essentially an extension, or refinement, of earlier, cruder techniques of genetic modification. In fact, the US National Research Council observed in 1989 that the use of the newest genetic engineering techniques actually lowers the generally minimal risk associated with field testing. The reason is that the new technology makes it possible to introduce pieces of DNA that contain one or a few well-characterized genes, in contrast with older genetic techniques that transfer or modify a variable number of genes haphazardly.

All of this means that although no technology is risk-free, users of the new techniques can be more certain about the traits they introduce into the organisms. Nevertheless, organisms crafted with the newest, most sophisticated and precise genetic techniques are subject to discriminatory, extraordinary regulation. Research proposals for field trials must be reviewed repeatedly case by case, and companies face uncertainty about final commercial approvals of products down the road even if they prove safe and effective.

Ironically, two of the people most responsible for the EPA's longstanding unscientific and anti-innovative policies have been involved in the cleanup of the Deepwater Horizon spill: Carol Browner, who headed the EPA when its anti-biotech policy was finalized during the Clinton years, is now the Obama administration's environmental policy czar; and William Reilly, head of the EPA when the policies were first crafted, is co-chairman of Obama's oil spill investigation commission. They are living proof of the Washington DC adage that no bad deed goes unrewarded.

The US Department of Energy, which conducts research in a variety of applied science disciplines, including bioremediation, has also shied away from the use of genetically engineered organisms. According to Aristides Patrinos, associate director of DOE's Office of Biological and Environmental Research during the Clinton administration, the department was unwilling even to perform field testing of genetically engineered organisms for bioremediation – because of ‘public resistance’. (Residents of the US Gulf Coast might feel differently now.) No Profiles in Courage award for him.

Government policymakers seem oblivious to the power of regulatory roadblocks to impair resilience. Experiments using genetically engineered organisms confront massive red tape and politics and entail vast expense. The costs and uncertainty of performing such R&D have virtually eliminated these organisms as a tool for cleaning up oil spills and other pollution.

If individually and collectively we are to meet economic, environmental and public health challenges, we need plenty of options and opportunities for innovation. But in large and small ways, unimaginative, short-sighted bureaucrats, politicians and activists have conspired to limit our options, constrain economic growth and make real solutions elusive.

Copyright information

© Palgrave Macmillan, a division of Macmillan Publishers Ltd 2010

Authors and Affiliations

  • Henry I Miller
    1. Hoover Institution, Stanford University, Stanford, USA
