From: Cornell University News Service, 1999
Reassuring Our Trust In Technology

ITHACA, N.Y. -- Much of the world of technology, says Trevor Pinch, is built on trust: trust that the engineers have done their job responsibly, trust that they have the right expertise to do the job properly. "We can never escape the need for trust," he says. "We trust computers. We trust flying in Boeing 747s."

Pinch's seemingly benign view of technology is based not on certainty but on fallibility: the notion that even the best intentions go awry, that even well-designed technology can occasionally go wrong. Trust, he says, rests on the curiously reassuring fact that because technology is fabricated by human hands, "it is messy and uncertain."

Pinch, a Cornell University professor of science and technology studies, has enlarged and illustrated his ideas -- which can hardly be a surprise to any engineer -- in his latest book, The Golem at Large (Cambridge University Press, 1998), co-authored by Harry Collins, professor of sociology at Cardiff University. The two collaborated on an earlier book, The Golem: What Everyone Should Know About Science (Cambridge University Press, 1993), which caused some resentment in the scientific community for its message that science is no guarantor of truth because it is subject to the same messy, human-directed process of experiment.

The choice of the Golem, the stumbling giant of East European Jewish mythology, as a metaphor is again particularly apt for the authors' latest book, because its seven case studies show exactly the often inchoate nature of technological progress. "We shouldn't over-mythologize technological certainty in the same way we shouldn't over-mythologize scientific certainty," says Pinch. "That message might have been shocking to some scientists, but I don't think any engineer would be shocked."

The examples range from the disputed effectiveness of Patriot missiles in the Gulf War and the assigning of blame for the Challenger shuttle disaster, to the role of lay experts in making "vital contributions to technical decisions" about the fallout from the Chernobyl nuclear disaster and about AIDS treatment.

In cases of technological failure, such as the space shuttle, Pinch says, humans are invariably accused of causing the failure rather than the failure being recognized as inherent in a complex technology. "We tend always to go for the more human explanation when technologies go wrong; that's our point," he says.

In the cases of the Chernobyl fallout, where British farmers quickly saw the effects on their animals, and of AIDS, where victims offered often-ignored advice, the book makes the powerful point that expertise lies where the local knowledge is. "Lay people can have expertise," says Pinch. "That doesn't mean that one voice is as good as another, but that people can contribute and gain expertise. Scientists shouldn't reject that."

The message, he says, is that no one should reject expertise, wherever it is gained. In particular, he believes, there should be trust in the work of scientists and technologists "because they are trustworthy people -- they are experts, just like your dentist, and you should trust them in the same way."

That message was too much for sociologist Barry Barnes of Exeter University to accept.
Writing in the journal Nature, he argued that "it is worth asking whether a book designed to attack claims of certainty and omniscience would not be better entitled 'what you should know about propaganda.'" However, reviewer Roy Porter noted in New Scientist that the authors' tone "is reassuring, calling for dialogues, not duels, between science and the humanities." And a review in Publishers Weekly commented: "Their book is worthy of note not only for its clear analysis of how science can come up short when applied outside of the laboratory but for its honest appraisal of technology's gatekeepers."