Safety-critical industries: definitions, tensions and trade-offs

The purpose of this blog post is to define what is meant by the term safety-critical industry and to identify the tensions and trade-offs at play in the complex organisational settings in which my research is situated.

Falla defines safety-critical systems as ones “which need to possess the highest levels of safety integrity, where malfunction would lead to the most serious consequences” (Falla, 1997, p2). Wears describes safety-critical industries as “complex socio technical systems comprised of people in multiple roles and their societal and technical artifacts” (Wears, 2012, p4561). Combining these two definitions, a safety-critical industry can be said to be a system comprising individuals, technology and organisations in which safety is of paramount importance and where the consequences of failure or malfunction may be loss of life or serious injury, serious environmental damage, or harm to plant or property. A number of authors exemplify this broad definition of safety-critical industries by listing specific industry sectors which exhibit these characteristics. Commonly quoted examples of such industries are nuclear power plants, off-shore oil platforms, chemical plants, commercial aviation, and rail transport (Baron & Pate-Cornell, 1999; Amalberti, 2001; Kontogiannis, 2011; Wears, 2012).

Amalberti (2001) introduces the alternative term of an ultra-safe industry: an industry with a risk of accident below one accident in every 100,000 safety units, or even below one accident in 1 million safety units. Industries achieving this level of safety include the nuclear industry, scheduled civilian flights and railways (at least in Europe). Ultra-safe design and operation is a key aim of all safety-critical industries. This overarching focus on safety can, however, manifest itself in extensive procedures and regulations, the use of conservative and often aging technology, and rigid, bureaucratic hierarchies of control (Amalberti, 2001; Kettunen et al., 2007).
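Amalberti's thresholds can be expressed as a simple check on an accident rate per "safety unit" (e.g. per flight or per operating hour). The sketch below is purely illustrative: the band boundaries (1 in 100,000 and 1 in 1,000,000) come from the definition above, while the function name and band labels are my own assumptions.

```python
# Illustrative sketch of Amalberti's (2001) ultra-safe bands.
# The thresholds 1e-5 and 1e-6 are from the definition in the text;
# the function name and labels are hypothetical choices for this example.

def safety_band(accidents: int, safety_units: int) -> str:
    """Classify an accident rate against the ultra-safe thresholds."""
    rate = accidents / safety_units
    if rate < 1e-6:
        return "ultra-safe (beyond 1 in 1,000,000)"
    if rate < 1e-5:
        return "ultra-safe (beyond 1 in 100,000)"
    return "not ultra-safe"

# e.g. one accident over two million scheduled flights:
print(safety_band(1, 2_000_000))   # ultra-safe (beyond 1 in 1,000,000)
```

Note that the choice of safety unit (flights, passenger-kilometres, reactor-years) materially changes the rate, which is one reason cross-industry safety comparisons are contested.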

Safety-critical industries possess two key properties: those of mass and dread (Wears, 2012). Mass, in the sense that there is potential for large numbers of people to be killed or injured simultaneously in the event of an accident; and dread, in the sense that individuals perceive the dangers as outwith their locus of control and are unwilling to be exposed to them. An accident at a nuclear power plant has the potential to exhibit both these properties, as does a civil aviation disaster.

The civil nuclear and aerospace sectors are therefore two archetypal examples of highly-regulated safety-critical industries. The nuclear industry is concerned with the design, operation, maintenance and eventual decommissioning of nuclear power plants. Each of these activities is closely supervised by powerful external regulators whose mission is to ensure the safety of all nuclear operations. The civil aerospace industry operates along similar lines: a tight-knit community of engineering and commercial personnel deliver complex products to the aviation industry, where errors and failures can lead to huge financial and reputational damage.

The highly-regulated yet profit-driven nature of safety-critical industries is a source of a number of tensions and trade-offs which impact projects in these sectors. The most obvious tension at play is the challenge of balancing the economic imperative to generate profits against the overriding priority of maintaining a safe operation (Baron & Pate-Cornell, 1999; Perin, 2005; Kettunen et al., 2007; Reiman & Rollenhagen, 2012). Amalberti (2001) frames this tension as safety versus performance, but he is articulating the same thing: the challenge of delivering a high-performing, economically-efficient system such as a global airline or a nuclear power plant whilst maintaining safety at all times. In her seminal anthropological study of nuclear power plants in the United States, Perin (2005) calls this tension a trade-off quandary, centred on the pressure to keep a nuclear plant operating, and therefore generating revenue, against the maintenance activities required to keep that plant safe. She eloquently captures the challenge of the economics versus safety trade-off for all who are involved in the nuclear industry as follows: “to enter into the working life of a nuclear power generating station populated by those who shoulder its risks on behalf of us all is to enter into a world where electricity is a by-product: a [nuclear power] station’s primary product is a cultural commodity: civic and market trust in its managers and experts competencies” (Perin, 2005, p265). Somewhat alarmingly, Perin (2005) also draws on statements made by the World Association of Nuclear Operators (WANO) that the number of safety incidents or near incidents in nuclear power plants has increased in recent years. The reasons proffered for this disturbing trend are firstly, “negligence in cultivating safety culture due to severe pressure to reduce costs following the deregulation of the power market” (Perin, 2005, p10), and secondly, tensions in the perceived culture of control in the industry.

This leads us to the second key tension in safety-critical industries: that of centralisation versus decentralisation of decision-making authority. Kettunen et al. (2007) capture this tension in the context of nuclear power: “The challenge with nuclear power generation is that the process to be controlled (i.e. nuclear fission) may be characterised by both tight couplings and complex interactions. In terms of the distribution of decision-making authority, the theoretical paradox is as follows: centralisation is effective because it contributes to the management of tight couplings by providing the means for a fast and co-ordinated response, while decentralisation is effective because it makes it easier to cope with uncertain situations by empowering those who have the best possible knowledge, skills and position for their management” (Kettunen et al., 2007, p427).

In certain circumstances in safety-critical environments, for example steady-state flight conditions or regular nuclear plant maintenance operations, the need for systematic and routinised work schedules ensures that centralisation and standardisation gain the upper hand in this trade-off quandary. However, as uncertainty increases, perhaps in the presence of conflicting flight indicator signals during an otherwise routine journey, or the sounding of an equipment alarm in a nuclear plant, the situation may dictate a response that is decentralised, empowered and draws on the skills of those closest to the incident in question. Perin (2005) frames this trade-off quandary as one between a culture of “command and control” and one of “doubt and discovery”, and argues for more of the latter in the safety-critical, pressure-cooker environment of an operating nuclear power plant.

The third tension noted by a number of authors in safety-critical industries is that between redundancy and complexity (see for instance Rijpma, 1997; Kettunen et al., 2007). In order to keep nuclear plants and aircraft engines safe, their designers build in redundancy, but paradoxically this very redundancy may challenge safe operation as it adds complexity, makes systems less transparent, and can give a false sense of security in safety margins. Perrow (1984) argues in Normal Accident Theory that accidents will happen, and indeed are inevitable, in complex technological systems that are tightly coupled. This complexity “inevitably yields unexpected interactions between independent failures. By means of tight coupling, these initial interactions escalate rapidly and almost unobstructedly into a system breakdown” (Rijpma, 1997, p15). Additional redundancy may be beneficial, and indeed essential, in specific elements of safety-critical systems, but the challenge for designers and operators is to limit the increase in complexity and tight coupling caused by that redundancy.
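The arithmetic behind this trade-off can be made concrete. Under the textbook assumption of independent failures, parallel redundancy drives the probability of total failure down geometrically; but once some fraction of failures stems from a shared cause that defeats every copy at once (a simple beta-factor style model), the benefit is capped. The sketch below is not drawn from the cited sources; the failure probabilities and the 10% common-cause fraction are illustrative assumptions.

```python
# Illustrative sketch (not from the cited sources): how redundancy cuts the
# probability of total system failure under independence, and how a shared
# common-cause failure fraction (a simple beta-factor style model) erodes
# that benefit. All numbers are assumed for illustration.

def system_failure_prob(p_component: float, n_redundant: int, beta: float = 0.0) -> float:
    """Probability that all n parallel redundant components fail.

    p_component: failure probability of a single component (assumed value)
    n_redundant: number of parallel redundant components
    beta: fraction of component failures attributed to a shared common cause
    """
    independent = (1 - beta) * p_component   # failures unique to each component
    common = beta * p_component              # shared failures defeating all copies
    # System fails if the common cause strikes, or all copies fail independently.
    return common + (1 - common) * independent ** n_redundant

# With purely independent failures, triplicating a p=0.01 component is dramatic:
print(system_failure_prob(0.01, 3))              # ~1e-6
# A 10% common-cause fraction caps the achievable improvement near 1e-3,
# no matter how many further copies are added:
print(system_failure_prob(0.01, 3, beta=0.1))    # ~1e-3
```

This is one quantitative face of the paradox in the text: each extra redundant channel also adds interconnections, voting logic and maintenance burden, which is exactly the complexity and tight coupling that Perrow and Rijpma warn about.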

A final tension, particularly in the civil nuclear sector, is one between the licensee's and the regulator's divergent views on safety regulation and on the volume and appropriateness of required safety documentation. Safety cases in the United Kingdom are a good example: they are a mandatory regulatory requirement but can be viewed as adding project cost whilst delivering little additional safety (Kettunen et al., 2007). Regulators can also be perceived as unduly negative towards new technology, leading to the conservative use of tried-and-tested technology and preventing licensees from benefiting from technological advances and good industry practices developed elsewhere (Kettunen et al., 2007).

Whilst these tensions and trade-offs are very real and present to those individuals tasked with keeping the civil nuclear and aerospace industries safe, the question we need to address next is what the implications are for the project management professionals tasked with delivering safety-critical projects in these pressure-cooker environments. I am out of words here, so that will have to be addressed in a future post.

References

  1. Amalberti, R. (2001) The paradoxes of almost totally safe transportation systems. Safety Science, 37(2001), 109-126.
  2. Baron, M.M. & Pate-Cornell, M.E. (1999) Designing risk-management strategies for critical engineering systems. IEEE Transactions on Engineering Management, 46(1), 87-100.
  3. Falla, M. (1997) Advances in Safety Critical Systems – Results and Achievements from the DTI/EPSRC R&D Programme. Retrieved on 21st October, 2012 from http://www.comp.lancs.ac.uk/computing/resources/scs/
  4. Kettunen, J., Reiman, T. & Wahlström, B. (2007) Safety management challenges and tensions in the European nuclear power industry. Scandinavian Journal of Management, 23(2007), 424-444.
  5. Kontogiannis, T. (2011) A systems perspective of managing error recovery and tactical re-planning of operating teams in safety critical domains. Journal of Safety Research, 42(2011), 73-85.
  6. Perin, C. (2005) Shouldering Risks: The Culture of Control in the Nuclear Power Industry. Princeton University Press, New Jersey, USA.
  7. Perrow, C. (1984) Normal Accidents: Living with High Risk Technologies. Basic Books, New York.
  8. Reiman, T. & Rollenhagen, C. (2012) Competing values, tensions and trade-offs in the management of nuclear power plants. Work, 41(2012), 722-729.
  9. Rijpma, J.A. (1997) Complexity, tight-coupling and reliability: connecting normal accident theory and high reliability. Journal of Contingencies and Crisis Management, 5(1), 15-23.
  10. Wears, R.L. (2012) Rethinking healthcare as a safety-critical industry. Work, 41(2012), 4560-4563.
