From The Field Guide to Understanding ‘Human Error’ to Drift into Failure, from Ten Questions About Human Error to Safety Differently, Sidney’s books are moving and pushing established paradigms of safety. With his ‘cri-de-coeur’ Just Culture, his heartfelt Second Victim and even his more technical Patient Safety, he reminds readers around the world of the human face of safety, both before and after things go wrong. Here you will find information on, among others, these recent titles by Sidney:
- The End of Heaven: Disaster and Suffering in a Scientific Age (2017)
- Just Culture: Restoring Trust and Accountability in Your Organization (2016)
- Safety Differently: Human Factors for A New Era (2015)
- The Field Guide to Understanding ‘Human Error’ (2014)
- Second Victim: Error, Guilt, Trauma and Resilience (2013)
- Just Culture: Balancing Safety and Accountability (2012)
- Drift into Failure: From Hunting Broken Components to Understanding Complex Systems (2011)
- Patient Safety: A Human Factors Approach (2011)
- Behind Human Error (2010)
Before I could even finish the Preface of your new book, one word kept turning over and over in my mind. That word was delicious. I haven’t had a similar reaction to other books I’ve read, although I’ve found the writings of C. S. Lewis to be particularly inviting. Your word usage makes each sentence a meaty delight. I’ve spent a great deal of time in the study of philosophy, so I was thrilled to read your perspective, using philosophical terms like a surgeon uses a scalpel.
Here is further encouragement. Your mind is unique. In many ways you have been blessed with perspectives that are beyond our time and space. You are an oversized billboard with the warning, ‘You’ve Settled for the Easy Solution.’ Not many have the grasp of philosophy and beyond-space-time critical thinking that you do. Despite their education, they have settled for the easy road, the easy methodology, the easy answer to difficult questions. Please, please stay on the course that seems to be leading you. You have presented brilliance to us in your book, but it might take some readers longer to see it.
I wish you well, Sidney. I am enriched by knowing you.
Dr. Todd Hubbard, Oklahoma State University
This book argues that in a scientific age, we have become ever better at explaining the complexity of disaster (at answering the epistemological question of what happened). But people, for a variety of reasons, desire simple answers to the age-old existential question of why we suffer.
When we confuse the question of what happened with the question of why we suffered, we either supply wrenchingly inadequate epistemological accounts (it was ‘human error’! Rule violations!), or callously emotionless explanations of disaster incubation, normalization of wrongs and other banal features of human and organizational life. The former doesn’t explain well, the latter doesn’t console well.
This creates a double bind that everyone involved in the investigation, prevention, and mitigation of disaster, as well as in dealing with its aftermath—in whatever field of human and technological endeavor—runs up against, whether consciously or not.
This book goes to the heart of this double bind by juxtaposing models and experiences of disaster and suffering, showing how the two often diverge, and how they might ultimately be reconciled. Click here for a short video briefing about The End of Heaven.
A just culture is a culture of trust, learning and accountability. It is particularly important when an incident has occurred; when something has gone wrong. How do you respond to the people involved? What do you do to minimize the negative impact, and maximize learning?

This third edition of Sidney Dekker’s extremely successful Just Culture offers new material on restorative justice and ideas about why your people may be breaking rules. Supported by extensive case material, you will learn about safety reporting and honest disclosure, about retributive just culture and about the criminalization of human error.

Some suspect a just culture means letting people off the hook. Yet they believe they need to remain able to hold people accountable for undesirable performance. In this new edition, Dekker asks you to look at ‘accountability’ in different ways. One is by asking which rule was broken, who did it, whether that behavior crossed some line, and what the appropriate consequences should be. In this retributive sense, an ‘account’ is something you get people to pay, or settle. But who will draw that line? And is the process fair?

Another way to approach accountability after an incident is to ask who was hurt. To ask what their needs are. And to explore whose obligation it is to meet those needs. People involved in causing the incident may well want to participate in meeting those needs. In this restorative sense, an ‘account’ is something you get people to tell, and others to listen to.

Learn to look at accountability in different ways and your impact on restoring trust, learning and a sense of humanity in your organization could be enormous. This book:
- Defines what a just culture is
- Offers new material on restorative justice and ideas about why your people may be breaking rules
- Is supported by extensive case material, through which you will learn about safety reporting and honest disclosure, and about retributive just culture and about the criminalization of human error
- Asks you to look at ‘accountability’ in different ways
- Shows how to minimize the negative impact of just culture interventions, and maximize learning
‘Readers interested in organizational ethics and decision-making will benefit from the case studies and examples. Summing Up: Recommended. Lower- and upper-level undergraduates; general readers.’ Choice, February 2013
‘…it is difficult to think of a more relevant and challenging book for health and safety practitioners, company managers and directors, regulators of all stripes, and members of parliament.’ Safeguard, New Zealand, Jan/Feb 2013
‘Sidney Dekker’s book is a thought-provoking exposition of the concept of a just society. Would that we could achieve it! The questions that the author raises need to be discussed at all levels of government, and by judges and lawyers, and by ministers of health. Dekker makes it clear that profound changes must be made in both the legal and the medical systems if we really wish to improve medical safety.’ John W. Senders, University of Toronto, Canada
‘A timely book about the current major safety dilemma – how do we resolve the apparent conflict between increasing demands for accountability and the creation of an open and reporting organisational culture? Thought-provoking, erudite, and analytical, but very readable, Sidney Dekker uses many practical examples from diverse safety-critical domains and provides a framework for managing this issue. A ‘must-read’ for anyone interested in safety improvement, but also, one hopes, for politicians, law-makers and the judiciary.’ Dr Tom Hugh. MDA National Insurance Ltd, Sydney, Australia
Safety has increasingly become a bureaucratic accountability. It is concerned with reducing negative events and sees the human factor as a problem to control. Despite the transformation brought about by human factors, which asked not who was responsible, but what was responsible for triggering errors and failures, safety thinking once again tends to target people with behavioral interventions, rather than the system, the technology, or the environment in which people work.
The unrelenting pace of technological change and growth of complexity call for a different kind of safety thinking today. A kind of thinking that sees the human factor as a source of diversity, resilience, insight, creativity and wisdom about safety—not as a source of risk that undermines otherwise safe systems. The key transitions for human factors in a new era are these:
- From seeing people as a problem to control, to seeing people as a solution to harness;
- From seeing safety as a bureaucratic accountability up, to seeing it as an ethical responsibility down;
- From seeing safety as an absence of negatives to seeing it as the presence of a positive capacity to make things go right. A focus on safety and risk should become a focus on resilience.
- From a Cartesian-Newtonian language of linear cause-effect relationships, of static metaphors and individual components, to a language of complexity, change and evolution, holism and relationships.
- From vocabularies of control, constraint and human deficit, to new vocabularies of empowerment, diversity and human opportunity.
Safety Differently invites the reader to see the human factor as a solution to harness, to turn safety back into an ethical responsibility for those who do our dangerous work. It concludes that these are concerns that are not only inextricably linked to everyday safety, but also to justice and progress in our workplaces and communities. Click here for a short video briefing on the book Safety Differently.
“Sidney Dekker has established himself as the foremost thought leader on accident causation and human error. He points out that we continue to follow linear thinking about accidents and look at the person and the choices they make as the problem. Thus, we develop ineffective interventions intended to “fix” workers through motivation, training, and discipline. … Through this book, Dekker calls on safety professionals to stop and think critically about the path forward. He calls for us to engage in a conversation about how we look at human error. The time has come for a new era that better understands human error in the context of work, and the overriding importance of improved work design; design that is tolerant of human error and allows humans who make mistakes or become confused to fail safely.”
—Richard A. Pollock, President, CLMI Safety Training and American Society of Safety Engineers
“As expected Sidney Dekker compels the next level of productive thinking. It is a challenge to think broader and react less. He tells the how and why of “old view” sociotechnical embeddedness and reveals why its usefulness has diminished … Sidney writes in such a way that the whole book becomes an example of applied “local rationality”. … He provides strong motivation to embrace the hard work of developing a holistic perspective mindset and break free of dualistic deconstructionist approaches and language.”
—Paul Nelson, MSc, Nelson HF Safety Consulting, LLC
“… an exciting exposé of the current system of safety management and how it came to be. … Professor Dekker asks us to look beyond the purely technical, and to reflect on our feelings about safety processes. Then he presents a clear story about why these feelings might be preventing us from producing the very changes that are needed to move to the next level of safe operations. He probes us to explore what fundamentally makes safety such an elusive challenge and what makes it different from other sciences. … provides the framework that will move us to a new level of practice and thinking that could be to this generation of safety practitioners, what technical “fixes” were to the safety managers of the 1970s.”
—Ivan Pupulidy, US Forest Service
“… After reading Sidney’s work you feel inspired to change the ‘way we’ve always done business’ and to look at safety management in a very different way. This book is very timely against the strengthening tide of criminalization of failure — it counters by providing a sound perspective on system complexity and foreseeability — it recognizes that ethics have taken a back seat to safety over the bureaucratic control it so often has become. This book is indeed a ‘stop and think’ — its content provides concepts for critical thinking and invites, challenges and persuades all those who care about safety to think and act differently.”
—Jenny Colman – Human Factor Specialist, Fatal and Serious Injury Investigation Dept. WorkSafeBC
“I believe this book will become a foundational reference for all students and practitioners and promoters of system safety initiatives and interventions in complex social organizations and work situations. The comprehensive nature of the approach adopted in this book is based on both a strong historical understanding of the topic as well as an impressive appreciation of the important philosophical underpinnings of system safety efforts. Dekker has laid a strong historical and philosophical foundation on which he builds operationally relevant guidance about sense-making in complex adaptive systems.”
—Dr. Robert Robson, Healthcare System Safety and Accountability
“… Here in one volume is an authoritative account that is rich in Prof Dekker’s unique experience of safety science and his experience of safety in vastly different domains. The result is challenging and surprising. And at last there is one book that brings the various strands of these influences into what we call today safety science.”
—Anthony Smoker, Manager Operational Safety Strategy NERL/NATS
“… easily accessible for practitioners and really inspiring and provocative for scientists. Dekker’s reasoning is amazingly easy to follow, especially when he is challenging various folk models, which are often strongly incorporated in our thinking. The history of safety science and of the role of the human in systems is pictured masterfully. But the main strength is that it offers a smooth intellectual ride from “stone-age” safety thinking to resilience engineering. Of course, smooth and comfortable for readers, for the world of safety is a struggle. But at least there is inspiration.”
—Hubert K. Adamczyk, Polish Air Traffic Controllers Union (Executive Vice President); Human Factors Specialist and Safety Investigator
“… easy to read and to understand. … written in such a way that also interested people from outside the safety field can understand … the first book that I’m aware of that challenges the dominating view/beliefs on the role of the human factor (based on modernist assumptions) within the safety domain. … Brilliantly written … a very interesting view on the way modern safety is shaped by the past and how it could be of influence on the future. … has the potential to unlock a more human approach of safety.”
—Ruud Plomp, ManageNet/Thin Green Line, The Netherlands
Please visit Taylor & Francis/CRC Press to order your copy of Safety Differently
When faced with a ‘human error’ problem, you may be tempted to ask ‘Why didn’t these people watch out better?’ Or, ‘How can I get my people more engaged in safety?’ You might think you can solve your safety problems by telling your people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure and demanding compliance. These are all expressions of ‘The Bad Apple Theory’ where you believe your system is basically safe if it were not for those few unreliable people in it.
Building on its successful predecessors, the 3rd edition of The Field Guide to Understanding ‘Human Error’ will help you understand a new way of dealing with a perceived ‘human error’ problem in your organization. It will help you trace how your organization juggles inherent trade-offs between safety and other pressures and expectations, suggesting that you are not the custodian of an already safe system. It will encourage you to start looking more closely at the performance that others may still call ‘human error’, allowing you to discover how your people create safety through practice, at all levels of your organization, mostly successfully, under the pressure of resource constraints and multiple conflicting goals.
The Field Guide to Understanding ‘Human Error’ will help you understand how to move beyond ‘human error’; how to understand accidents; how to do better investigations; how to understand and improve your safety work. You will be invited to think creatively and differently about the safety issues you and your organization face. In each, you will find possibilities for a new language, for different concepts, and for new leverage points to influence your own thinking and practice, as well as that of your colleagues and organization.
If you are faced with a ‘human error’ problem, abandon the fallacy of a quick fix. Read this Field Guide to understanding it instead. Click here for a short video briefing on The Field Guide to Understanding ‘Human Error.’
‘If you design equipment or operating procedures, if you investigate accidents or deal with safety, this is an essential book. Sidney Dekker, a leading world authority on “human error” has enhanced his already exceptional “Field Guide” to be a concise, readable guide to both design of equipment and procedures and also the analysis of mishaps. The label “human error” is misleading and its use prevents discovery and correction of the true underlying causes of incidents. So read about hindsight bias, about the difference between the view from inside the system rather than from outside, and about difference between the blunt end (where you should do your work) and the sharp end (where people tend to focus). Read, learn, and put these ideas into practice. The results will be fewer incidents, less damage, less injury.’—Don Norman, author of The Design of Everyday Things
‘It’s in the world’s best interest to read Dekker’s book. The Field Guide is nothing short of a paradigm shift in thinking about “human error”, and in my domain of software and Internet engineering, it should be considered required reading. This Third Edition is much better than the Second, and the layout of the material is far more accessible.’—John Allspaw, SVP, Infrastructure and Operations, Etsy
‘The Third Edition of Sidney Dekker’s Field Guide to Understanding “Human Error” provides a solid practical framework for anyone wanting to make sense of safety science, human factors analysis, and the New View approach to conducting investigations of incidents and accidents. The trademark direct and passionate style that is common in Dekker’s work focuses on the circumstances of frontline operators and managers working in complex systems, as well as the challenges of the safety investigator. Dekker does not mince his words (“Any human factors investigation that does not take goal conflicts seriously does not take human work seriously”) and is clearly supportive both of sharp end workers, who are tasked with creating safety in the face of resource constraints in complex systems, as well as the investigators, charged with making sense of events that often seem surprising and unpredictable. This will be an invaluable resource for any organization serious about understanding and improving the safety of their operations.’ —Dr Robert Robson, Principal Advisor, Healthcare System Safety and Accountability.
‘When things go wrong in organisations, one thing is almost always found in the post-mortem: “human error” (in various guises). But one only needs to scratch the surface of system failures to understand that things are not so straightforward. What seems to make sense as a causal catch-all for our everyday slips and blunders snaps when stretched; it fails to explain the context and complexity of our work and systems. There is a better way. In this important book, Sidney Dekker conveys a practical approach for life after “human error”. It is both humanistic and systemic; it treats people holistically and non-judgementally, while considering system conditions and dynamics in context. If you are prepared to suspend your own preconceptions and reactions to failure this book will repay you with a practical, highly readable and deeply humane approach to dealing with failure.’—Steven Shorrock, European Safety Culture Program Leader, Eurocontrol.
What if you are doing your job and you injure or kill someone? What if you injure or kill lots of people in an accident which you survive without much of a scratch? And what if it was your job specifically to not have that accident happen? If you are the one who should have noticed, should have acted, should have prevented. If it was your job, your duty to protect your passengers, your patients, your workers; your job to assure the quality, the integrity of your process, your operation.
Chances are, you will become the second victim of your incident or accident. The unique predicament of the second victim, however, has never before been examined systematically—neither its psychological, social nor ethical aspects. In some sense, the second victim is like surviving first victims: there can be trauma, shock, loss, anger, possibly injury. But then there is guilt. The guilt that comes from violating duty, violating trust, violating responsibility, and for causing the thing that should have been prevented. And there is blame—self-blame and blame by others. This can turn into lawsuits, and increasingly into criminal prosecution.
Together, these after-effects form a potent destructive package, which many individuals and organizations are ill-equipped to handle. Some second victims decide life is no longer worth living, and commit suicide. What are the possible psychological and emotional experiences of the second victim? What can an organization do to help? Where are the sources of hope, of resilience? And how can helping a second victim also help first victims?
Drawing on his unique background as psychologist, airline pilot, safety specialist, and his own experiences with helping second victims from a variety of backgrounds, Sidney Dekker has written a powerful, moving account of the experience of the second victim. It forms compelling reading for practitioners, risk managers, human resources managers, safety experts, mental health workers, regulators, the judiciary and many other professionals. Click here for a short video briefing on Second Victim.
“With Dekker’s books there is always a thought-provoking central theme. … Part of the effectiveness of this book is in its inclusion of the reader – I felt as if I was being guided through my own thoughts, rather than someone else’s words. … I cannot recommend this book enough, especially if you work in systems design or organisational culture. It is easy to read, but may leave you with more questions than it answers. I do not think that is a bad thing.”
—Ella-Mae Hubbard, Loughborough University, Loughborough, UK
A just culture protects people’s honest mistakes from being seen as culpable. But what is an honest mistake, or rather, when is a mistake no longer honest? It is too simple to assert that there should be consequences for those who ‘cross the line’. Lines don’t just exist out there, ready to be crossed or obeyed. We – people – construct those lines; and we draw them differently all the time, depending on the language we use to describe the mistake, on hindsight, history, tradition, and a host of other factors.
What matters is not where the line goes – but who gets to draw it. If we leave that to chance, or to prosecutors, or fail to tell operators honestly about who may end up drawing the line, then a just culture may be very difficult to achieve.
The absence of a just culture in an organization, in a country, in an industry, hurts both justice and safety. Responses to incidents and accidents that are seen as unjust can impede safety investigations, promote fear rather than mindfulness in people who do safety-critical work, make organizations more bureaucratic rather than more careful, and cultivate professional secrecy, evasion, and self-protection. A just culture is critical for the creation of a safety culture. Without reporting of failures and problems, without openness and information sharing, a safety culture cannot flourish.
Drawing on his experience with practitioners (in nursing, air traffic control and professional aviation) whose errors were turned into crimes, Dekker lays out a new view of just culture. This book will help you to create an environment where learning and accountability are fairly and constructively balanced.
‘With surgical precision Sidney Dekker lays bare the core elements of a just culture. He convincingly explains how this desired outcome arises from a combination of accountability and (organisational) learning. The real-life cases in the book serve to drive his arguments home in a way that will be easily recognised and understood by practitioners in safety-critical industries, and hopefully also by rule makers and lawyers.’ — Bert Ruitenberg, IFATCA Human Factors Specialist
‘The airline industry is under immense pressure and is full of sometimes serious contradictions. Staff are told never to break regulations, never to take a chance, yet they must get passengers to their destination on time. Staff are also implored to pamper passengers yet told not to waste money. The contradictions are at worst a recipe for disaster and at best a cause of low staff morale, leading to dishonesty as staff fear the consequences, and for good reason. Just Culture is essential reading for airline managers at all levels, both to understand the endless conflicts that staff face in trying to deliver the almost undeliverable, and to reconcile accountability for failure with learning from that failure. A soul-searching and compelling read.’ — Geoffrey Thomas, Air Transport World
The book “Just Culture” by Professor Sidney Dekker, presented to Capt. Sullenberger during a ceremony in New York.
During an award ceremony at New York’s City Hall on Monday, February 9, the US Airways crew that recently successfully landed the A320 aircraft on the Hudson River was awarded keys to the city of New York by Mayor Michael Bloomberg. Bloomberg also presented the book “Just Culture” to Capt. Chesley “Sully” Sullenberger, who had had to leave his copy on board the aircraft.
What does the collapse of sub-prime lending have in common with a broken jackscrew in an airliner’s tailplane? Or the oil spill disaster in the Gulf of Mexico with the burn-up of Space Shuttle Columbia? These were systems that drifted into failure. While pursuing success in a dynamic, complex environment with limited resources and multiple goal conflicts, a succession of small, everyday decisions eventually produced breakdowns on a massive scale.
We have trouble grasping the complexity and normality that gives rise to such large events. We hunt for broken parts, fixable properties, people we can hold accountable. Our analyses of complex system breakdowns remain depressingly linear, depressingly componential.
The growth of complexity in society has outpaced our understanding of how complex systems work and fail. Our technologies have gotten ahead of our theories. We are able to build things—deep-sea oil rigs, jackscrews, collateralized debt obligations—whose properties we understand in isolation. But in competitive, regulated societies, their connections proliferate, their interactions and interdependencies multiply, their complexities mushroom.
This book explores complexity theory and systems thinking to better understand how complex systems drift into failure. It studies sensitive dependence on initial conditions, unruly technology, tipping points, diversity—and finds that failure emerges opportunistically, non-randomly, from the very webs of relationships that breed success and that are supposed to protect organizations from disaster. It develops a vocabulary that allows us to harness complexity and find new ways of managing drift. Click here for a short video briefing on ‘Drift into Failure.’
‘…meticulously researched and engagingly written …explains complex system failures and offers practical recommendations for their investigation and prevention… A valuable source book for anyone responsible for, or interested in, organizational safety.’ – Steven P. Bezman, Aviation safety researcher
’… Dekker draws his inspiration from the science of complexity and theorises how seemingly reasonable actions at a local level may promulgate and proliferate in unseen (and unknowable) ways until finally some apparent system “failure” occurs. As with all Dekker’s books, the text walks a fine line between making a persuasive argument and provoking an argument. Love it or hate it, you can’t ignore it.’ – Don Harris, HFI Solutions Ltd
‘Both provocative and insightful, the author shines a powerful light on the severe limits of traditional linear approaches. His call for a diversity of voices and narratives, to deepen our understanding of accidents, will be welcomed in healthcare.’ – Rob Robson, Healthcare System Safety and Accountability, Canada
‘Professor Dekker explodes the myth that complex economic, technological and environmental failures can be investigated by approaches fossilized in linear, Newtonian-Cartesian logic. …Serious proponents of the next high reliability organizations would do well to absorb Drift into Failure.’ – Jerry Poje, Founding Board Member of the U.S. Chemical Safety and Hazard Investigation Board
’Today, catastrophic accidents resulting from failure of simple components confound industry. In Drift into Failure, Dekker shows how reductionist analysis … does not explain why accidents in complex systems occur. Dekker introduces the systems approach. Reductionism delivers an inventory of broken parts; Dekker’s book offers a genuine possibility of future prevention. The systems approach may allow us to Drift into Success.’ – John O’Meara, HAZOZ
‘This is an important and timely book which reflects a deep understanding of how and why systems fail, ranging across all domains from financial markets to medical practice to engineered systems. A key strength of the book is that the author, Sidney Dekker, is able to delve into the nitty gritty details of many failure case studies, while also deploying a strong background in philosophy of science and the developing science of complex systems.’ — Irfan A. Alvi, President and Chief Engineer, Alvi Associates.
‘This book will change the way you see the world and think about systems in general. Any industry that you work in will benefit from reading this book. It shows our arrogance and simple view of a complex world and how it catches us by surprise.’ — Amazon
‘A very interesting and important book in the field of safety science.’ — Amazon
‘This book is a noteworthy effort to provide new insights into how accidents and other bad outcomes occur in large organizations. Dekker begins by describing two competing world views, the essentially mechanical view of the world spawned by Newton and Descartes (among others), and a view based on complexity in socio-technical organizations and a systems approach. He shows how each world view biases the search for the “truth” behind how accidents and incidents occur.’ — Amazon
Risks to patients are many and diverse, and the complexity of the healthcare system that delivers them is huge. The creation of safety cannot be up to a few good doctors, or up to some exceptionally dedicated nurses or technicians. The human factors approach, taken in this book, refuses to just lay the responsibility for safety and risk at the feet of people at the sharp end. Instead, the human factors approach looks relentlessly for sources of safety and risk everywhere in the system—the designs of devices, the teamwork and coordination between different practitioners, their communication across hierarchical and gender boundaries, the cognitive processes of individuals, the organization that surrounds and constrains and empowers them, the economic and human resources offered, the technology available, the political landscape, even the culture of the place.
With coverage ranging from the influence of professional identity in medicine and the problematic nature of “human error”, to the psychological and social features that characterize healthcare work, to the safety-critical aspects of interfaces and automation, this book spans the width of the human factors field and its importance for patient safety today. In addition, the book discusses topics such as accountability, just culture, and secondary victimization in the aftermath of adverse events and takes readers to the leading edge of human factors research today: complexity, systems thinking and resilience. Click here for a short video briefing on Patient Safety: A Human Factors Approach.
‘User-friendly and well written, this book takes the complex nature of healthcare seriously and pulls no punches. It demonstrates what the human factors approach can and does do, providing excellent examples to tease out the subtleties of this fascinating subject.’
— The RoSPA Occupational Safety & Health Journal, June 2012
‘This book usefully applies the perspective Dekker develops in his other books specifically to patient safety. The basic idea is that the performance of any complex system, including the healthcare system, depends primarily on the overall behavior of the system, considering the interactions of system components (human and otherwise), rather than primarily the performance of the components (e.g., doctors) themselves. This means that when the system fails to perform and safety is compromised, we should emphasize examining and reforming the system, rather than trying to blame and root out individuals judged in hindsight to be bad apples. I generally agree with this view, and there’s abundant evidence to back it up, though we shouldn’t go to the extreme of claiming that there are never any bad apples – we need some balance here. For more details of this general view, read my review of Dekker’s ‘Drift into Failure’ book. For the specifics of applying this view to patient safety, do indeed read THIS book!’ — Amazon
‘This is a great book. A 5 star book’ — Amazon
‘This is a nice introduction to the study of human factors in patient safety. The author uses examples and narratives to make what some consider a dry topic very interesting. All healthcare providers should avail themselves of the information contained in this quick read.’ — Amazon
‘Good comprehensive treatment of the literature on systems thinking. If you are looking to learn about complexity theory and systems thinking, this book would serve you well. The author has original content, plus extensive reference to the literature.’ — Amazon
Behind Human Error (2010)
“Human error” is cited again and again as a major contributing factor or “cause” of incidents and accidents. Most people accept the term “human error” as one category of potential causes for unsatisfactory activities or outcomes. The result is a widespread perception of a “human error” problem, and solutions are thought to lie in changing the people or their role in the system.
For example, we should reduce the human role with more automation, or regiment human behavior by stricter monitoring, rules or procedures. But in practice, things have proved not to be this simple. The label “human error” is prejudicial and unspecific, and any serious examination of the human contribution to safety and to system failure indicates that the story of “human error” is markedly complex.
This book takes you behind the label “human error.” In a journey through four parts, it invites you to discover the most significant research results of recent decades. You get to explore how the changing understanding of accidents and an embrace of systems thinking have radically impacted ideas about “human error.” Then you learn about the role of normal cognitive system factors (knowledge, mindset, and goals) in operating safely at the sharp end. Then you study how the clumsy use of computer technology can increase the potential for erroneous actions and assessments in all kinds of fields of practice. And finally, you will learn how the hindsight bias always enters into attributions of error, and that “human error” is a mere label—the result of a social and psychological judgment process rather than a matter of objective fact that we can count, tabulate, punish or eliminate.
‘This is an excellent book on why failures occur and general approaches that can be used to reduce incidence of failures. I highly recommend it.’ — Irfan A. Alvi
‘This is an interesting textbook and while it is difficult in places, I think it is essential reading for those designing or operating complex systems.’ — Health and Safety at Work, December 2010
When faced with a human error problem, you may be tempted to ask, “Why didn’t they watch out better? How could they not have noticed?” You think you can solve your human error problem by telling people to be more careful, by reprimanding the miscreants, or by issuing a new rule or procedure. These are all expressions of “The Bad Apple Theory”, under which you believe your system would be basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere.
The new view, in contrast, understands that a human error problem is actually an organizational problem. Finding a “human error”, by any other name, or by any other human, is only the beginning of your journey, not a convenient conclusion. The new view recognizes that systems involve inherent trade-offs between safety and other pressures (for example, production). People need to create safety through practice, at all levels of an organization.
Building on its successful predecessor, The Field Guide to Understanding Human Error guides you through the traps and misconceptions of the old view. It tells you how to avoid the bias of hindsight, the temptation of counterfactual reasoning and judgmental language, and how to go beyond the people who were closest in time and place to the mishap. It then explains how you can apply the new view of human error to the analysis of safety problems and the construction of meaningful countermeasures. It even helps those who want to help their organizations adopt the new view and improve their learning from failure.
‘Dekker’s approach points the way.’ — Amazon
‘This should be required reading for folks conducting Root Cause analysis.’ — Darren Colin Bittick, Courtland, VA
‘Excellent resource for any accident investigator’ — Amazon
‘Insightful, useful, refreshing. A must-read for anyone tired of the “old view” of human error’ — Boyd Falconer, University of New South Wales, Australia
‘It is accessible, practical, eminently readable and will be of great use to safety practitioners whatever their background.’ — Health & Safety at Work, July 2007
‘This past year I read your book The Field Guide to Understanding Human Error based on a recommendation of a colleague. I must admit it is one of the best books that I have read on accident prevention and safety. I have been practicing as a construction safety professional for 17 years and have struggled to accurately and completely articulate the concepts you so eloquently describe in your book. Although it draws many examples from an aviation safety standpoint, your book stands up brilliantly as a framework for understanding human error and accident prevention in any industry. Subsequently, I am using it as the text for my course “Safety in the Construction Industry” here at Columbia this fall. The construction industry is so very stuck in the world of the “Old View.” Convincing construction management professionals that removing bad apples is not the answer is a tough sell. Your book is making my job quite a bit easier. Thank you.’ — Ray Master, Columbia University, USA
‘No matter if the reader is an upper-level executive in an aerospace company, a member of an accident investigation team, a safety engineer, or a university student, Sid’s Field Guide is equally useful. This book presents important ideas for those who regulate human factors investigation and research, making it an essential read for the academician, the research analyst, and the government regulator.’
— International Journal of Applied Aviation Studies
‘Essential reading for any safety investigator. An eye-opening way to transform your investigations by moving from the old view to the new view. I’ve used this book as a course book for a seminar of 25 safety professionals to great effect. Plus there is a good guide to the role of a safety department too.’ — Andrew Evans
‘This book is the perfect introduction to systems thinking when trying to understand accidents, improve safety, and make systems more resilient. The examples are great, and the author’s perspective comes through loud and clear. He puts in clear relief the “old way” and “new way” of thinking about error, lays out his case for transitioning to the new way, and does it all clearly and concisely. Great read! I’ll be buying extra copies to lend to colleagues.’ — M. Rayo
When you lose situation awareness, what replaces it? Why do safe systems fail? Should we hold people accountable for their mistakes? Why don’t people just follow the procedures? These and other questions make up the “Ten Questions About Human Error”, a remarkable mix of human factors, history, philosophy, sociology, ethics and organizational science.
Human factors and system safety are dominated by mechanistic, structuralist models that emphasize components and the linkages between them. The influences of Descartes and Newton can be seen everywhere: in the information-processing models of cognitive ergonomics, in the latent-failure metaphors of organizational safety, in the sequence-of-events (action-reaction) models of accidents, and in the preference for quantitative, experimental human factors research. Consistent with the individualist emphasis of Protestantism and the Enlightenment, human factors and system safety also keep taking the individual as their central focus. This leads to problematic interpretations of the human contribution to safety and accidents, and possibly counterproductive notions of control and culpability. While human factors and system safety can point to remarkable successes, the continued usefulness of their models can be understood only if we acknowledge their limits. Problems facing human factors and system safety today—for example, the drift into failure—show that we need new system models oriented towards organic relationships, transactions and constraints, not components and mechanical linkages.
The ten questions about human error are not just questions about human error as a phenomenon—if they are that at all. They are questions about human factors and system safety as disciplines. This book attempts to show where current vocabulary, models, methods, ideas and notions are hampering progress. In every chapter, the book tries to provide directions for new ideas and models that could perhaps better cope with the complexity of problems facing human factors and system safety now.
‘This is a superb book. I’m a physician – it should be part of our medical school curriculum’ — Paul H. Bearmon, MD
‘A superb book about philosophy. We’ve reached the limit of what we can achieve using this paradigm. The accidents we are seeing today no longer fit the “someone screwed up” model of failure. Instead we’re seeing normal people doing normal work in normal organizations, everyone doing what to them appears to be the right thing to do, and yet we still have these occasional enormous failures.’ — David Deley
‘The text is extremely straightforward… it will be a valuable read for anyone interested in system safety, no matter what their field. Interesting reading and thought-provoking discussions.’ — IEEE EMBS Journal
‘Excellent look at human error. I read it from a healthcare perspective and found the concepts powerful. I will look at healthcare error in a new way after reading this book.’ — Kenneth L. Naylor
‘Dekker is a real master about the subject. There is not a single issue he leaves unattended. The depth of analysis is impressive. Dekker reminds us of Rasmussen -another giant about safety issues- in his depth of analysis.’ — Jose Sanchez Alarcos
The Field Guide to Human Error Investigations (2002)
While human error may be the dominant contributor to incidents and accidents today, it is probably also the most misunderstood. Little or no guidance is available for those who need to reconstruct the human contribution to system failure. Human error investigations are often forced to follow a path of intuition or common sense, and are exposed to the distortions of hindsight.
Investigators in a number of operating worlds (aviation, shipping, rail) have expressed their desire for a guide that could help them in their investigative work by suggesting methods, reminders and pointers that are well-grounded in the theoretical underpinnings of understanding human performance in complex contexts. The Field Guide aims to offer practitioners exactly this, as well as ways to produce credible, well-documented findings.
The Field Guide’s premise is that human error is systematically connected to features of the tasks and tools that people work with, and to features of the environment in which that work is carried out. The premise, in other words, is the local, or bounded, rationality principle: people do reasonable things given their knowledge, their objectives and their limited resources. The key in human error investigations is to understand the situation as it surrounded people at the time, and to trace people’s unfolding mindset in parallel with how processes evolved around them. The key for an investigator is also not to confuse people’s understanding at the time with his or her own understanding of the situation now—in other words, to avoid the hindsight bias. The Field Guide is centered around a method for doing all of this.
In order to contrast two dominant approaches to human error (today known as “the old view” versus “the new view” of human error), the book is divided into two parts.
I accidentally encountered your “Field Guide” and bought a copy. Wonderful! You have done an excellent job of demonstrating the need for a systems approach to accidents, where “human error” is an unacceptable causal component. Well done. I hope the book has a large impact upon accident investigators in all industries. The book focuses upon aviation, but obviously the lesson applies to all.
Thanks for the book. I will soon post a glowing review and recommendation on my website.