
"I hope this book sheds some light on this very popular idea that once we find out what went wrong in a technological accident, it will never happen again," says Greg Siegel. "The book tries to show the sort of folly in that way of thinking. The stories we tell ourselves about what went wrong are always incomplete. There is always something that is a little troublesome, some sort of residual uncertainty." (Credit: Unsplash)


Why we crave data when things go wrong

A new book considers our cultural and scientific fascination with accidents, particularly the need to explain what’s often unexplainable.

“The accident thwarts even the most technologically advanced attempts to tame it,” writes Greg Siegel, an associate professor of film and media studies at the University of California, Santa Barbara. His book is titled Forensic Media: Reconstructing Accidents in Accelerated Modernity (Duke University Press, 2014).


Broadly defined, forensic media are the graphic, photographic, electronic, and digital technologies used to record and reconstruct accidents, especially high-speed crashes or such catastrophes as a commercial airliner vanishing from the sky. The term “accelerated modernity” refers to modern society’s affinity for speed; fittingly, Siegel’s study focuses mainly on speed-related crashes.

“I had been thinking a lot about media technology and culture,” says Siegel, who also directs the graduate studies program in the film and media studies department and is co-director of UC Santa Barbara’s Center for the Interdisciplinary Study of Music.

“And I thought, ‘What if we were to think about the relationship between technology and culture not through the lens of progress and how things are getting better but by thinking about what happens when things go wrong?'”

‘A residue of fear’

Fittingly, the book opens with a recounting of the mid-20th-century origins of Murphy’s Law—the ubiquitous idea that “what can go wrong will go wrong.” The phrase was allegedly coined following groundbreaking experiments conducted at Edwards Air Force Base that tested the limits of human tolerance to rapid deceleration.

Forensic Media is a book about failure that roars through history, documenting the evolution of forensic engineering and the use of media to record and reconstruct accidents.


The drive to obtain a “purely scientific understanding of an accident is never quite successful,” says Siegel. “We always have a residue of fear, a certain unknowingness because we can never really get our arms around it.”

The accidents and related technology that unfold in Forensic Media largely involve planes, trains, and automobiles.

Siegel highlights, for example, mathematician-engineer Charles Babbage’s “self-registering apparatus,” which in 1839 allowed a railroad train to record such variables as its speed and force of traction. It is a “strange contraption,” Siegel writes, meant to aid the study of trains’ performance and their accidents, an early instance of forensic media that would not be widely adopted for more than 100 years.

Some 170 years later, the flight-data and cockpit voice recorders commonly known as “black boxes” provided crucial digital and acoustic evidence in helping to account for why an Air France flight disappeared in the middle of the night in 2009.

Siegel also covers the science of automobile crash testing and the development of crash-test cinematography. “Almost uniformly, early 20th-century complaints had to do with people’s misbehavior behind the wheel of the automobile,” Siegel says. “There was a sense that society could protect itself from car accidents by disciplining the so-called reckless driver.”

By the 1940s, that thinking had begun to change in earnest, Siegel says. It reflected a societal shift toward blaming machines—and looking at their failures as a means of explaining away accidents—that persists today.

Scary reminders

To conduct his research, Siegel combed through physical archives, plumbed government and scientific manuals and handbooks, and screened audio and visual documents. Some of the cockpit voice recordings he listened to have not been officially released, he notes.

The book is aimed at people interested in the general relationship between technology and culture, Siegel says, as well as students and professors interested in questions of design and technology, the history and philosophy of science, and film and media studies.

While conducting his research, he was repeatedly struck by the language that was often used to talk about technological accidents. “Across a lot of different fields—in pop culture as well as in government and industry documents—technological failure was often likened to human failure. We think about wrecks as corpses and failure as death,” Siegel says.

“When things fail, they remind us of the limits of human endeavor and human finitude, of death.”

In the modern industrial era, a theological or “almost mystical supernatural understanding” of accidents has been replaced by an overarching faith in scientific analysis, which has its own downside, Siegel continues.

“I hope this book sheds some light on this very popular idea that once we find out what went wrong in a technological accident, it will never happen again,” Siegel says.

“The book tries to show the sort of folly in that way of thinking. The stories we tell ourselves about what went wrong are always incomplete. There is always something that is a little troublesome, some sort of residual uncertainty.”

Source: UC Santa Barbara
