Flagging Down a Runaway Technology

Illustration by Brian Biggs

One day last May, a CSX freight train rolled across northwestern Ohio for two hours, hauling tank cars. Nothing particularly remarkable about that—except that no one was manning the engine, and two of the tank cars were filled with toxic chemicals. Fortunately, a railroad worker was able to jump aboard and stop the train before a catastrophe ensued.

For Dr. William Evan, emeritus professor of sociology and management, that runaway freight train is a perfect metaphor for the dangers posed by modern technology—and a wakeup call for society to do something about it.

“If we fail to mind the machines,” says Evan, “we run the risk of finding ourselves on a runaway train with no engineer at the helm, and that could lead to a disaster with untold fatalities.” In response to disasters both historical and potential, he and Dr. Mark Manion, assistant professor of philosophy and professional ethics at Drexel University, have written a book titled Minding the Machines: Preventing Technological Disasters, which has just been published by Prentice Hall.

Evan and Manion are not talking about “acts of God”—earthquakes, hurricanes, and the like. Nor are they concerned with deliberate acts of terrorism, though they do praise Johnson & Johnson for its quick and enlightened reaction to the Tylenol-cyanide poisonings in the 1980s. Their subject is the accidental dark side of technology, which has concerned Evan ever since the twin djinns of nuclear weapons and nuclear power began rearing their heads. That concern over nuclear power was realized in the 1986 Chernobyl nuclear-reactor catastrophe, the product of a deadly combination of human error, faulty reactor design, and a lack of preparedness programs.

Technological disasters are not “mere accidents” but rather “disasters directly linked to human action,” Evan and Manion stress. “More often than not, they result from explicit or implicit corporate or governmental policies,” which only fuels the outrage. While the most convenient scapegoat is usually an employee, the root causes are often more complex.

One root cause is the technical design of a component, such as the flawed gas-tank design of the Ford Pinto in the early 1970s, which had a tendency to burst into flames upon being rear-ended. Compounding the ignominy was the fact that Ford executives knew of the faulty design but chose to ignore it.

Another root cause is “organizational,” and often characterized by poor communication between top management and rank-and-file employees—thus depriving management of “information and knowledge necessary for rational and prudent decision-making.” Evan and Manion cite the 1981 collapse of suspended walkways over the atrium lobby of the Kansas City Hyatt-Regency hotel, which killed 114 and injured at least 200 others. The problem there stemmed from a design change in the walkways—and poor communication within the engineering firm that designed them.

Evan and Manion describe the final “root cause” as “socio-cultural,” referring to “attitudes and values that are widely accepted by people in a society,” and which “penetrate the attitudes and values of the corporate culture of various firms.” A prime example of that was the 1984 poison-gas release from a Union Carbide plant in Bhopal, India. In addition to lax governmental controls, incompetent operator training, and inadequate emergency preparedness and community education, they write, Union Carbide management’s “perception of the depreciated value of life in India resulted in negligent plant design, which did not include various fail-safe devices.” Had that plant been as carefully designed as a similar Union Carbide facility in West Virginia, the disaster would not have happened.

Although bad publicity and costly lawsuits have made corporate decision-makers more sensitive to the dangers of technological disaster, Evan is concerned that rapidly evolving technologies will spread from First World countries to Third World nations that “don’t have engineers, scientists, and executives that can keep up with changing technologies.”

Evan and Manion recommend that engineers and corporate managers build in a “set of redundant fail-safe devices” and devise special training programs to instill a “commitment to basic safety procedures in all levels of personnel.” That commitment should be spelled out in a corporate code of conduct, with which “all employees should be familiar.”

In addition to developing “effective mechanisms to ensure compliance with moral standards expressed in codes of conduct,” Evan and Manion note, organizations should also develop “corporate ‘social audits,’ which can identify negative social and environmental impacts of organizational actions.”

They also recommend that engineering schools emphasize responsible engineering design and ethics throughout their curricula, and suggest a similar emphasis on responsibility for business schools. 

“To avoid runaway engines leading to disaster,” says Evan, “you have to have engineers alert enough and courageous enough to intervene before disaster strikes.”
