The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, Enlarged Edition
B**E
Doesn't really get to the bottom of things
Despite extensive ethnographic study, Vaughan's lack of familiarity with aerospace engineering and engineering organizations results in some very wrong understandings of what risk means in engineering. Her idea of Normalization of Deviance is insightful and useful, but it is a hand that she grossly overplays. And she sees that phenomenon as being more present in engineering judgement than in management, possibly because NASA managers succeeded in buffaloing her during interviews.

Her concept of Construction of Risk is a mess. Viewing engineering risk analysis/management only through the lens of post-Kuhnian social constructivism results in incoherent statements like: "It opens the 'black box' of engineering, following the negotiation of risk and the production of technical knowledge by working engineers making the hands-on risk assessments. At the same time it shows the intersection of the social and the technical in the construction of risk, it gives insight into how scientific paradigms are created, the sources of their obduracy, and the circumstances in which they are—and are not—overturned."

Vaughan never seems to grasp that risk, in engineering, comprises a probability component and a hazard-severity component. In many places where she concludes that deviance has been normalized, she incorrectly judges that engineers and managers had changed their beliefs about the severity of a hazard (erosion and blow-by), rather than that they, rightly or wrongly, concluded from the evidence that the probability of the unwanted occurrence was only negligibly changed by newly acquired data (i.e. defensible, rational Bayesian belief updates).

She likewise fails to understand what redundancy means, and how it is used, in the design of complex systems with catastrophic failure modes.
She views redundancy as a belief – an irrational, unfounded belief – of engineers: "the belief in redundancy was fundamental to their decision not to report." Again, her wrong understanding of redundancy in design seems to have its basis in poor explanations given to her by NASA in interviews, including some surprisingly ignorant perspectives from NASA management: "You don't build in redundancy and never expect to use the back-up. If you never use your back-up, you're wasting money" (George Hardy, NASA).

This book is definitely worth reading if you're interested in organizational dysfunction or systems engineering. If your background is engineering, beware of her narrow, Merton-esque sociology perspective. If your background is sociology or management science, beware of her limited grasp of engineering and risk science.
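To make the distinction this review draws concrete: a minimal sketch (not from the book, with purely illustrative numbers) of a conjugate Beta-Bernoulli update. It shows how engineers could rationally revise the *probability* of a joint failure after accumulating flight evidence, while their assessment of the hazard's *severity* stays fixed — the two components of risk the review says Vaughan conflates.

```python
# Illustrative sketch of a rational Bayesian belief update on failure
# probability. All numbers are hypothetical, not drawn from the Shuttle
# program's actual risk estimates.

def beta_update(alpha, beta, failures, successes):
    """Conjugate update of a Beta(alpha, beta) prior on failure probability."""
    return alpha + failures, beta + successes

# Hypothetical prior: roughly a 1-in-10 expected failure rate.
alpha, beta = 1.0, 9.0
prior_mean = alpha / (alpha + beta)

# Suppose 20 flights show erosion/blow-by but no joint failure.
alpha, beta = beta_update(alpha, beta, failures=0, successes=20)
posterior_mean = alpha / (alpha + beta)

# Severity is a separate axis of risk; this evidence does not change it.
severity = "catastrophic (loss of vehicle and crew)"

print(f"prior P(failure)     = {prior_mean:.3f}")      # 0.100
print(f"posterior P(failure) = {posterior_mean:.3f}")  # 0.033
print(f"severity             = {severity}")
```

The point of the sketch is that the posterior probability shrinks as benign flights accumulate, which can be a defensible inference rather than a normalized belief that the hazard itself had become less severe.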
J**L
Poor printing
The last letter of many characters on the "left" (back) pages is not completely printed. It's surprisingly distracting to interpret that partial letter so often.

The language (the literal words) in the book is unnecessarily dense and complicated, as if the author wrote it for her fellow sociologists instead of the general public.
R**K
A must resource for technical leadership researchers
Thanks to Diane Vaughan, her research assistants, NASA employees, and industry managers for helping operational managers in all disciplines to objectively assess the risks associated with pioneering technology. The book is written for a technical audience, but is formatted and resourced to the extent that its methods and conclusions are applicable to many disciplines.
A**S
Classic work studying culture, engineering, and organizations
Very detailed account. I think the main thing missing is a discussion of the different kinds of engineering tests, how they work, and the beliefs associated with each one. But otherwise it is quite exhaustive.

It's also a very useful methodological exemplar for people using archives to study institutional decision-making.
S**S
Dense but thorough
As a safety representative in a large organization with codified methods for submitting safety concerns, I found this essential reading for my continued education. This mishap was not about the O-ring; it was about the paradigm in which evaluations of the O-ring were made: a structured, seemingly thorough way of determining safety concerns that showed its flaws in one large explosion.
K**N
Any similarity to Boeing's greed is not just a coincidence, folks.
Great and disturbing book that reflects what is happening with Boeing's planes today.
G**E
Best but Weighty Analysis
The best book on the subject, but it is a sociology study and may be a bit dense for casual readers. That being said, it is an exceptional analysis of the many factors that contributed not only to the decision to launch, but also to the normalization of deviance that led a very intelligent and dedicated group of engineers to incrementally accept increasing risk, and to the organizational dynamics that prevented them from reaching a decision that seemed obvious to outside observers.
D**A
... exhaustively well-researched and footnoted and the author does an amazing job interviewing various subjects close to the Shuttle program
The book is obviously exhaustively well-researched and footnoted and the author does an amazing job interviewing various subjects close to the Shuttle program. But the issue is that it reads like a term-paper. The book is so dry I’m surprised it didn’t spontaneously combust in my hands. I left it in Mexico. Perhaps the next people in my room will be interested in reading what amounts to a term paper whilst sunning themselves in Tulum, Mexico.
J**.
Present.
I bought this as a present and it's gone down very well.
A**R
Fantastic analysis but heavy reading.
A huge amount of evidence has contributed to this book. It's an interesting study into how people, teams and organisations can fail - weaknesses to which we are all vulnerable. Heavy prose style won't suit everyone though.
L**B
Five Stars
One of the best books on its subject.
C**D
Excellent read
Brilliant book; really enjoyed reading such a factual view of the disaster and its consequences.
M**I
Fascinating analysis with broad applicability
Diane Vaughan developed her concept of the Normalization of Deviance in the course of her analysis of the genesis of the 1986 Challenger disaster. But processes of this kind also occur frequently in other organizations. Vaughan shows us how people and organizations slide into catastrophe by gradually watering down originally valid norms, thereby steadily increasing the error-proneness of the institution involved.