SECTION I

INTRODUCTION

AS SOCIOTECHNICAL systems (e.g., power stations, refineries, airliners) become increasingly complex and automated, focus has increased on the human contribution to failure. The view of many is that the solution to problems that arise from human error is to increase automation. However, this viewpoint is indicative of a misunderstanding of the capabilities, strengths, and resilience of humans, as well as a further misunderstanding of the strengths and weaknesses of current and near-future automation systems.

Section II provides an introduction to culture, its effects on behavior and its measurement; Section III describes the effects of culture on safety-related behavior, and presents evidence from the literature. Section IV introduces some current views on thinking and decision making, the effects of culture on these and the implications for decision making under pressure. Section V presents several automation-related issues and their implications, Section VI suggests future approaches to complex sociotechnical system design, and Section VII summarizes the paper's main points and draws conclusions.

SECTION II

BRIEF INTRODUCTION TO CULTURE

A. What is Culture?

The term “culture” relates to the values, assumptions, preferences, beliefs, rituals, knowledge, skills, and behaviors that are shared at the level of the social group (or society). Culture, in particular ethnic culture (see later), is typically acquired unconsciously and as a result, most people are not aware of their own culture until they experience some form of “culture clash,” and begin to realize that their assumptions, their views of the world, are not shared by everyone.

B. Why do Human Groups Develop Culture?

The values, assumptions, etc., of a social group's culture evolve over multiple generations, and typically improve that group's fit with the environment. The acquisition of this “culture” provides individuals within that group with that “improved fit,” without requiring them to expend the time or take the risks that are associated with learning from direct experience.

Significant cultural changes can occur between generations, enabling adaptation at a much faster rate than can occur via genetic changes, which typically require millennia. As humans have changed their environments at an increasing pace, the ability to change their cultural “firmware” has become increasingly important. However, in periods of very rapid change, e.g., due to globalization, urbanization, or migration, a group's culture can become a source of disadvantage because it cannot adjust quickly enough. This may lead to group impoverishment, fragmentation, and conflict.

C. Forms of Culture

There are many forms of (and aspects to) culture, but the authors will concentrate in this section on those forms of most relevance to the issues covered in this paper.

Ethnic (or National) Culture

Ethnic culture is acquired from birth via immersion in the social environment, and is the most important form of culture due to its power and persistence. It has been identified as a key factor in many sociotechnical system accidents due to its effects on crew or operator communication and behavior. National culture is often used as an approximation to ethnic culture because the majority of published data are collected at the national level; this can introduce inaccuracies in countries where multiple distinct ethnic cultures coexist. In the remainder of this paper, the term national culture, rather than ethnic culture, will be used.

Occupational Culture

Occupational (or professional) culture emerges out of the occupation or profession that a person undertakes. Aspects of occupational culture that are important (e.g., for sociotechnical system safety) may require periodic reinforcement via training and realistic exercises in order to inculcate responses at a subconscious level (see Section IV-B). As discussed later in this paper, if desirable aspects of occupational culture are at variance with an individual's national culture, then major problems may arise despite regular training.

Organizational Culture

In businesses (private and public), organizational cultures reflect the business leaders, markets, customers, products, etc. Organizational cultures can be changed, but successful change requires prolonged effort and “organizational pain.”

Employees typically adopt surface-level aspects of their organization's culture, e.g., dress and behavior, but are very unlikely to change the values that were acquired as part of their national culture. As discussed in Section III-A.3, even when the safety and success of a company is at stake, managers may not be able to overcome their own national cultural traits in order to change the organizational culture.

A beneficial “hybrid culture” can arise in a stable team where, for the purposes of (and within the boundaries of) the team, members adopt a common culture that is based on a shared understanding of each other and the team purpose. However, detailed discussions of this, and related issues such as team fragmentation, are beyond the remit of this paper.

Safety Culture

Culture, in particular organizational culture, has been identified as a key factor in the causation of many major accidents, for example, the Chernobyl nuclear reactor [1], the Challenger [2] and Columbia [3] shuttles, the Herald of Free Enterprise passenger ferry [4], and the Nimrod aircraft [5]. Therefore, an effective safety culture is of key importance to high reliability sociotechnical organizations.

Safety culture is influenced primarily by the leadership provided by an organization's management; it is, therefore, heavily influenced by organizational culture and the national cultures of the managers. Guldenmund [6], following a detailed review of safety-related literature, defined safety culture as “those aspects of organizational culture which will impact on attitudes and behavior related to increasing or decreasing risk.” As a facet of organizational culture, safety culture requires the latter to be addressed in order to elicit long-term safety improvements. A key factor in the performance aspects of safety culture relates to the effectiveness of training, in particular the degree to which it transfers to the working environment. As discussed later, training effectiveness is also affected by national culture.

D. Measurement of Culture

In order to assess the effects of culture on performance, we need a means of measuring the cultures of the people who work in the organization or system. Such measures cannot capture detailed cultural knowledge and assumptions, but can capture certain culture-related traits. Many frameworks have been developed to capture the traits of cultural groups at the national level, for example, those of Gallagher [7], Schwartz [8], Triandis [9], Hofstede [10], Hampden-Turner and Trompenaars [11], and House et al. [12].

The selection of a national cultural framework for the authors’ research work was based on four factors: that the framework provided quantitative scales of reasonable precision, that it provided national cultural trait values for a wide range of countries, that it had been extensively evaluated, and that it had also been widely applied in published studies. On the aforementioned basis, Hofstede's cultural framework was chosen by the authors; see Table I for brief descriptions of the framework's four original dimensions. These dimensions have since been supplemented by two further dimensions—short-/long-term orientation (STO/LTO) and indulgence versus restraint [13]; because values for most countries were not available until recently, the authors of this paper have not, as yet, utilized these two additional dimensions.

TABLE I HOFSTEDE'S CULTURAL DIMENSIONS

Neither Hofstede's framework, nor any other, represents a comprehensive model of human culture; these frameworks are typically based on statistically derived correlations, rather than proven causal relationships, and individuals within cultures may vary widely. Nevertheless, cultural frameworks enable us to capture persistent trait values that have predictive value with regard to group or team behaviors.
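As a purely illustrative sketch (this is not part of the authors' published analysis), the fragment below shows how national trait scores of this kind can be held as simple data structures and screened for the “authoritarian” combination of above-average PDI with low IDV that is discussed in Section III; the country scores and thresholds used are approximate and for illustration only.

```python
# Illustrative only: storing Hofstede dimension scores and flagging the
# "authoritarian" combination (above-average PDI plus low IDV) discussed in
# Section III. Scores are approximate published values; thresholds are
# assumptions made for this sketch.
from dataclasses import dataclass

@dataclass
class CultureProfile:
    pdi: int  # power distance
    idv: int  # individualism
    mas: int  # masculinity
    uai: int  # uncertainty avoidance

PROFILES = {
    "South Korea": CultureProfile(pdi=60, idv=18, mas=39, uai=85),
    "United Kingdom": CultureProfile(pdi=35, idv=89, mas=66, uai=35),
    "United States": CultureProfile(pdi=40, idv=91, mas=62, uai=46),
}

PDI_ABOVE_AVERAGE, IDV_LOW = 55, 40  # illustrative cut-offs

def is_authoritarian(p: CultureProfile) -> bool:
    """Above-average power distance combined with low individualism."""
    return p.pdi > PDI_ABOVE_AVERAGE and p.idv < IDV_LOW

for country, profile in PROFILES.items():
    print(country, "-> authoritarian combination:", is_authoritarian(profile))
```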

E. Further Culture-Related Issues—Face Maintenance

The maintenance of face, both internally (self-image) and externally (public image), is important in all cultures. However, in collectivist [low individualism (IDV)] cultures, the maintenance of public face is vital as it affects the status of the individual and his/her social group. In a study of over 1000 Chinese respondents carried out by the China Daily Youth [14], 75% identified making a mistake in public as by far the most humiliating experience they could have. Pedersen [15, p. 149] commented that face can become more important than life itself in Asian communities, as one's identity is based on the community's evaluation. As a result, the threat of loss of face (to oneself or to an important other) can delay communications at critical times and potentially cause major accidents.

F. Summary

This section has presented four forms of culture. Of particular concern to the safety of sociotechnical systems is the pervasiveness of national culture that, in times of stress, may dominate communications, decisions and behavior.

This section has also introduced a framework for the quantitative measurement of national culture traits; such quantification enables meaningful relationships between culture and performance to be identified, as described in the next section.

SECTION III

INTERACTION OF CULTURE AND SAFETY PERFORMANCE

In this section, the authors utilize the cultural dimensions of Hofstede's framework as a basis for examining the literature on culture-mediated team performance. Because of space limitations, the authors have concentrated primarily on the aerospace domain, but would recommend readers also to examine relevant papers in the nuclear and other safety-critical industries, for example, those by authors such as Reiman, Lee and Harrison, Pidgeon, and Cox.

Strauch [16] identified two ways in which culture can affect the safety performance of teams—intrateam communication, and situation awareness and decision making. The authors discuss these and other factors below before presenting evidence of the effects of culture on safety performance.

A. Effects of Culture on Team Performance

Intrateam Communication

In high power distance (high PDI) societies, the senior person in a team or group is expected to possess all the knowledge relevant to his or her position, even though this is unlikely in situations where significant specialization occurs. Decisions are made autocratically and implemented quickly (due to lack of consultation) and levels of subordinate-initiated communication are low [17]. In low PDI societies, authority for most decisions is typically delegated to those with the relevant knowledge, and communication typically flows freely up and down the formal hierarchy. Most societies score somewhere between the two extremes.

Helmreich and Merritt's 23-country survey of commercial airline staff revealed that low PDI crews were willing to express their views to their captains, whereas high PDI crews found this difficult, even when there was a safety issue [18].

In high IDV societies, team members typically communicate in a direct, low context manner, where the intended meaning is in the message. In low IDV (high collectivism) societies, team members typically communicate in an indirect, high context manner, where only a small part of the meaning is in the message itself; the remainder is inferred from contextual references and preexisting knowledge [19]. If a low IDV subordinate detects an error on the part of his or her superior, he or she must enable the superior to make the discovery; this maintains harmony and avoids loss of face (see Sections II-E and IV-C). However, such an error-discovery process can take a considerable period of time, and can involve the attention of several members of the team (e.g., aircrew) at a critical time.

Cultural diversity amongst team members typically reduces intrateam communication further, and increases the potential for members to misunderstand each other.

To summarize the aforementioned, the combination of high PDI and low IDV typically results in a much-reduced rate of intrateam communication.

Situation Awareness and Decision Making

Joy and Kolb [20] investigated the effects of cultural dimension scores on preferred learning styles. They reported that both students and teachers in high uncertainty avoidance (UAI) societies preferred abstract conceptualization and reflective observation, whereas those in low UAI societies were comfortable with concrete experience and active experimentation, e.g., training exercises with realistic role play. Support for these results can be seen in research by Burke et al. [21].

When running complex sociotechnical systems, personnel from high UAI societies tend to follow standard operating procedures (SOPs) more closely than do personnel from low UAI societies, thus reducing the likelihood of errors; however, they tend to continue following these SOPs when they are no longer relevant to the situation. Vincent and Dubinsky [22] examined the responses of students from the USA (low-to-medium UAI) and France (high UAI) when faced with threat situations. They reported that the French students exhibited more maladaptive coping than did the USA students. Klein et al. [23] stated that tolerance for uncertainty influences the threshold for initial reaction to an anomaly; this is because detection of change takes place when the observer mentally reframes his/her understanding of a situation (see Section IV-A). Low UAI (high tolerance for uncertainty) personnel change to a new initial understanding (i.e., reframe) with less information than do high UAI personnel; however, this lower information threshold may lead to more false alarms.
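Klein et al.'s threshold argument can be pictured with a toy evidence-accumulation model. The sketch below is a hypothetical illustration of the trade-off only (a lower reframing threshold reacts to genuine anomalies sooner but also fires more often on noise); it is not a validated cognitive model, and all parameter values are assumptions.

```python
# Toy illustration (hypothetical model and parameters): anomaly evidence
# accumulates over time and the observer "reframes" once it crosses a
# threshold. Low UAI is represented by a lower threshold: earlier reaction
# to real anomalies, but more false alarms when only noise is present.
import random
from typing import Optional

def time_to_reframe(threshold: float, anomaly_rate: float, noise: float,
                    steps: int = 200, seed: int = 1) -> Optional[int]:
    rng = random.Random(seed)
    evidence = 0.0
    for t in range(steps):
        evidence = max(evidence + anomaly_rate + rng.gauss(0.0, noise), 0.0)
        if evidence >= threshold:
            return t          # step at which the observer reframes
    return None               # never reframed within the observation window

LOW_UAI_THRESHOLD, HIGH_UAI_THRESHOLD = 3.0, 8.0

# Genuine anomaly: the low-UAI observer reframes earlier.
print(time_to_reframe(LOW_UAI_THRESHOLD, anomaly_rate=0.5, noise=0.3))
print(time_to_reframe(HIGH_UAI_THRESHOLD, anomaly_rate=0.5, noise=0.3))

# Noise only: the lower threshold is more likely to produce a false alarm.
print(time_to_reframe(LOW_UAI_THRESHOLD, anomaly_rate=0.0, noise=0.6))
print(time_to_reframe(HIGH_UAI_THRESHOLD, anomaly_rate=0.0, noise=0.6))
```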

To summarize the aforementioned, high UAI encourages operators to follow SOPs, which is beneficial to sociotechnical system safety; however, high UAI can hamper decision making in response to rare emergencies due to delayed reactions (unwillingness to reframe) and reduced situation awareness resulting from lack of realistic training.

Sociotechnical System Safety Culture and Safety

In this section, we look beyond Strauch's team performance factors and consider the overall organization.

Reason [24] made it clear that an effective safety culture is dependent on the willing and active participation of the workforce. Participation is less likely in high PDI societies because workers are unwilling to challenge or question their superiors; in addition, low IDV workers will not raise safety issues unless supported by their whole group. However, surveys by Mearns and Yule [25] of six national groupings of workers in the oil and gas industry indicated that the greatest predictor of workforce attitudes to safety and risk taking was workers’ perceptions of management attitudes to safety.

It is, again, worth examining the air transport industry, as this has led the way in many aspects of safety improvement, and millions of hours of data have been gathered. Air transport accidents are relatively rare and most accidents happen due to a conjunction of factors, some of which may have been present for a significant period of time.

The U.S. Aviation Safety Reporting System (ASRS) [26] started operating in 1976. It takes inputs relating to deviations, errors, incidents, and accidents from all air transport staff, including aircrew, air traffic controllers and mechanics, and is relevant to both general aviation and commercial carriers. The ASRS affords a high degree of anonymity and, because of this, receives more than three thousand reports per month that form the basis of detailed analyses and recommendations. The program has been successful in reducing accidents and incidents, and in improving education and training. Following the success of the ASRS, the Aviation Safety Action Program (ASAP) was created specifically to address U.S. commercial air carrier safety [27]. Similar safety programs have been developed elsewhere, e.g., CHIRP [28] and BASIS [29]. Where they have been effectively implemented, these programs have made significant contributions to the reduction of serious aircraft accidents (in terms of losses per million flights), and to the improvement of training and SOPs.

The success of any voluntary safety reporting program depends on punishment of operators or crew being regarded as a last resort. The Taiwanese Aviation Safety Council (established in 1998) implemented the TACARE flight crew voluntary incident reporting system, which was based on CHIRP. However, a survey that was carried out by Lee and Weitzel in 2003 [30] revealed that participation in TACARE was low, and only limited data were generated from it due to the Chinese authoritarian, punishment culture.

To summarize the aforementioned: In authoritarian (typically above average PDI, low IDV) cultures [31], it is difficult to create the climate of trust that is a prerequisite to the collection of data on human performance deviations and nonfatal incidents in complex sociotechnical systems. As a result, safety improvements based on modifications to processes, systems, training and SOPs are more difficult to achieve. This is due primarily to the company organizational environment, rather than to the operator (or cockpit crew) team environment.

B. Evidence of the Relationships Between Culture and Sociotechnical System Safety Performance

Commercial Airlines

Jing et al. [31] reported in 2001 on a statistical analysis of aircraft crash data versus cultural traits. They identified the primary cultural variable (positively linked to crash rate) as authoritarianism (above average PDI, low IDV). For example, a high authoritarianism crew member will choose adherence to the captain's wishes over adherence to standard procedures; also, if the captain is uncertain as to how to do something, he cannot ask a crew member because that would expose his lack of knowledge. Jing et al. described a China Airlines crash that occurred due to an erroneous triggering of the takeoff/go-around lever; neither the first officer nor the captain would admit that they did not know how to disengage the lever, and the captain continued attempting to land the plane instead of aborting the landing. Jing et al. pointed out that the whole system of civil aviation (including commercial aircraft and SOPs) has been designed by westerners, and concluded that the effects of extreme authoritarianism are not understood by western designers.

During the period between 1989 and 1999, China Airlines of Taiwan suffered six plane crashes (from a fleet of modern Boeing, Airbus, and McDonnell Douglas aircraft) [32] and a hull loss rate of 11.74 per million departures—more than 30 times that of American Airlines during a similar period [33]. As stated earlier, the TACARE flight crew voluntary incident reporting system was not a success due to cultural reasons and, between 1999 and 2003, Taiwan's commercial transport aircraft industry suffered 21 accidents and 300 fatalities—an extremely high figure for a total fleet of approximately 200 commercial aircraft.

During the period from 1989 to 1999, Korean Air suffered nine plane crashes (from a fleet of modern Boeing, Airbus, and McDonnell Douglas aircraft) [34] and a hull loss rate of 4.79 per million departures—more than 12 times that of American Airlines during the same period. In 2000, David Greenberg, formerly of Delta Airlines, was appointed to take over flight operations. He carried out a detailed evaluation of Korean Air's flight crews and identified the Korean cultural trait of (extreme) deference to authority as a key threat to flight safety; this is the problem identified by Jing et al. in mainland China and Taiwan [31]. The transcript of the cockpit voice recording that was retrieved from the Korean Air crash at Guam in 1997 [35, Appendix B] provides an example of such deference; even as the captain made several serious errors of judgment, the first officer and flight engineer could only hint to him as they were becoming increasingly aware that a crash was imminent. D. Greenberg subsequently made it a condition of service in Korean Air that all flight crews communicated in English (which does not have the many hierarchical levels of address present in Korean); this, combined with additional training, the replacement of ex-military crew with civilian-trained crew and the introduction of a promotion policy based on merit, ensured that flight crews were able to communicate much more freely with each other. There were no further crashes between 2000 and 2010.

The aforementioned Chinese and Korean cases relate primarily to Strauch's intrateam communication issue.

Note that details of individual China Airlines and Korean Air plane crashes can be obtained via hyperlinks embedded in [34] and [35].

Military Aircraft

Soeters and Boer carried out a study of NATO aircraft losses in 14 countries between 1988 and 1995 [36]. This showed strong correlations between accident rates and culture dimension scores—high PDI, low IDV, high UAI countries suffered the highest crash rates. No significant correlation was found with masculinity (MAS). NATO air forces have much in common, including similar or identical aircraft, similar regulations and operating procedures, common training facilities, and regular personnel exchanges; therefore differences in national culture scores appear to contribute significantly to the variations in accident rates.

Taking Account of Per-Capita Gross National Product

Culture is only one of several potential reasons for differences in accident rates between countries; failing to examine alternative hypotheses can lead to misleading results. Such a situation arose with two research studies that demonstrated significant correlations between aircraft accident rates and national PDI scores [37], [38]; Hofstede reanalyzed the data with the addition of per-capita gross national product (GNP) [10, p. 115] and found per-capita GNP to be the dominant variable, rather than PDI. Helmreich and Merritt [18, pp. 104–105] similarly draw readers’ attention to the potential effects on accident rates of non-cultural factors, including facility quality and government regulation.

The authors have evaluated airline accident statistics over the period 1970–2009, based primarily on data that are obtained from AirSafe.com [39]. This covers similar data to that which Hofstede reanalyzed (see above), but also includes an additional 17 years. Per-capita GNP was found to be the largest single factor in airline accident rates, as Hofstede had found. PDI showed high negative correlation with per-capita GNP, whereas IDV showed high positive correlation; both PDI and IDV were removed from the authors’ analysis, as they contributed little extra to the explanation of accident rates. The resultant statistical model accounted for 45% of crash variability; per-capita GNP was negatively correlated with crash rate, whereas UAI was positively correlated.
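The style of analysis described above can be sketched as an ordinary least-squares regression. The fragment below uses synthetic placeholder data, not the authors' AirSafe.com dataset, purely to illustrate how per-capita GNP and UAI enter the model after the collinear PDI and IDV variables are dropped, and how the explained-variability figure is obtained.

```python
# Sketch only: regress accident rate on per-capita GNP and UAI, mirroring the
# analysis described in the text. The data below are synthetic placeholders,
# NOT the authors' AirSafe.com dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 40                                          # number of airline countries
gnp = rng.uniform(1_000, 45_000, n)             # per-capita GNP (US$, PPP)
uai = rng.uniform(20, 100, n)                   # uncertainty avoidance score
crash_rate = 6.0 - 0.0001 * gnp + 0.02 * uai + rng.normal(0, 0.8, n)

X = np.column_stack([np.ones(n), gnp, uai])     # intercept, GNP, UAI
beta, *_ = np.linalg.lstsq(X, crash_rate, rcond=None)

pred = X @ beta
r_squared = 1 - np.sum((crash_rate - pred) ** 2) / np.sum(
    (crash_rate - crash_rate.mean()) ** 2)

print("coefficients (intercept, GNP, UAI):", np.round(beta, 5))
print("fraction of crash variability explained (R^2):", round(r_squared, 3))
```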

Note that the authors’ analysis utilized cultural scores associated with the airlines’ countries of origin; these are not necessarily the scores associated with the flight crews because many non-western airlines employ western flight crews. Therefore, these results provide insights primarily about the effects of per-capita GNP and UAI scores on maintenance regimes, training, safety cultures, and airport facility qualities. The UAI score is of particular interest to the authors due to its implications for safety, but it has to be treated with caution.

Although Hofstede showed that PDI and IDV have significant correlations with low per-capita GNP, it is clear from the earlier-described Taiwanese and Korean airline cases that the combination of above-average PDI plus low IDV in the aircraft cockpit is an important direct contributor to accident rates; both Taiwan and South Korea have high per-capita GNPs of US$37 720 and US$31 714 (respectively), compared with US$36 090 for the United Kingdom [40]; all figures are based on purchasing power parity. User PDI and IDV scores (or authoritarianism scores) should, therefore, be taken into account when designing complex sociotechnical systems.

C. Cultural Effects—Summary and Further Comments

Different cultures impose differing constraints on communication, situation awareness, decision-making and behaviors of operators of complex sociotechnical systems. High PDI, low IDV, and high UAI scores all tend to adversely affect safety culture and lead to increased risks of accidents, in particular when combined. Fig. 1 summarizes the main associations between culture scores and sociotechnical system team performance under stress (e.g., due to an emergency).

Fig. 1. Summary of the key relationships between cultural dimension scores and sociotechnical team performance under stress.

The solid arrows indicate positive relationships (an increase in one factor leads to an increase in the other) and the dashed arrows indicate negative or inverse relationships.

The designs of most complex systems (e.g., nuclear power stations, refineries, airliners, nuclear submarines) are based on the concepts and cultures of low PDI, high IDV, low UAI Anglo and North European countries. As a result, the operational environments (e.g., aircraft cockpits, refinery control rooms), SOPs and training programs associated with these sociotechnical systems incorporate inbuilt cultural assumptions (e.g., the rapid flow of factual information between all team members); these clash with the national cultures of many operators, resulting in increased accident rates in those cultures.

SECTION IV

PSYCHOLOGICAL/ANTHROPOLOGICAL BACKGROUND

This section presents a brief overview of current theories of universal human cognition that provide insights into culture's effects on perception, communication, and decision making.

A. Categorization and Schemas

From a very early stage of development, humans unconsciously develop categories of concepts. Many researchers, e.g., Lakoff [41], believe that categorization is the basis of all thought processes, i.e., until we have categorized an object or concept, we cannot perceive it properly. Humans build high-level mental schemas (frames, scripts) populated with categories and/or subschemas and their relationships in order to represent the world based on beliefs and prior experiences [42]; schemas are also templates for behavior [43]. As individuals interact with familiar environments, they effortlessly project, combine and populate these schemas in order to perceive and react to objects and events around them; when excessive dissonance arises between schemas and reality, alternative schemas are substituted (‘reframing’ occurs). Education, training, and experience result in occupations whose members, for example, doctors, professional pilots and engineers, develop very detailed, specialized schemas.

Although cognitively efficient, schemas provide expectations that may bias information collection and may also result in false recall of past events [44].

B. Dual-Process Theories

Although there remains much controversy in the details, there is considerable evidence to support dual-process theories of human thinking and decision making [45].

“Type-1” thinking is largely unconscious, almost effortless, automatic, heuristic, schema-based, parallel, fast, capable of integrating large amounts of data, and weighting many alternative potential outcomes, but is relatively inflexible. Type-1 processing depends on the earlier-described schemas in order to develop procedural automaticity when carrying out complex tasks or decisions. “Type-2” thinking is analytic, conscious, effortful, deliberative, flexible, serial, slow, directly influenced by culture, limited by working memory and can utilize only small amounts of data; note that type-2 thinking may ‘call in’ some type-1 processes.

Evidence from the research of cultural psychologists identifies significant differences in type-2 thinking between low IDV Eastern Asians (holistic, contextually sensitive) and high-IDV North Americans and Europeans (analytic, contextually desensitized) [46]. These culturally learned differences also influence automatic type-1 thinking processes, for example, those processes associated with scanning scenes and interpreting nonverbal signals.

Frequent repetition leads type-2 processes to become type-1 processes (e.g., as in car driving). In the case of sociotechnical system crews or operators, regular, realistic team-based emergency response training exercises enable the development of appropriate, detailed schemas that are available to type-1 processes. When an emergency occurs, such appropriately trained operators or crew will react in type-1 mode, retrieving the relevant mental schemas, perhaps with minor adaptations, and applying the rehearsed procedures associated with these. While doing this, they will also have type-2 processing capacity available, enabling them to start to build up a mental picture of the causes of the emergency and to consciously plan further actions to take when they have stabilized the system. Inadequately trained operators will not have appropriate mental schemas, will typically suffer cognitive overload, and will be more likely to panic or to become passive and resign themselves to the consequences. As stated in Section III-A2, crew/operator training effectiveness is affected by cultural traits, in particular, UAI.

C. Emotions

Emotions play a major part in perception and decision making, providing a basis for evaluating outcomes of potential decisions. Emotions drive us to avoid negative affect (“bad feeling”) situations and to seek or maintain positive affect (“good feeling”) situations. This drive occurs at the level of automatic responses and, via emotion schemas (which involve higher order cognition) [47], also occurs at the level of complex decision making; both of these levels are affected by culture. Chronically accessible emotion-related schemas that are based on cultural norms [48], for example, those relating to face in Eastern cultures, can cause individuals to delay communicating essential information, with potentially disastrous results; this is probably a major contributory cause to the Korean air crash example described in Section III-B1. Fig. 2 presents a simplified flow diagram of perception and decision making at the individual level.

Fig. 2. Individual decision making under pressure.

D. Summary re Psychological Aspects

In highly stressful, time-critical situations, effective perception and appropriate initial responses rely on heuristic, schema-based type-1 mental processes; the relevant schemas for these situations can only be developed via regular realistic emergency-response training and/or simulation exercises; this training is less likely in high UAI societies. In addition, culturally modulated chronic emotion schemas may result in delayed communications during time-critical situations.

SECTION V

PROBLEMS ASSOCIATED WITH INCREASING AUTOMATION

Sociotechnical systems are becoming increasingly automated. However, where the consequences of failure are severe and the modes of failure cannot all be predicted, automation systems must be backed up by humans.

Automation brings changes to the activities, workloads, situation awareness, and skill levels of human operators or crew. Because of space limitations, these changes are considered below with particular reference to the aviation environment. However, automation is similarly affecting other complex sociotechnical systems including nuclear power stations, oil and gas refineries and platforms, ships, and the other system-of-system elements that are essential to ensure operational effectiveness and safety (e.g., air traffic control).

A. Transformation from Operator to Monitor

Current automation systems, e.g., flight deck automation, place a requirement on crews to perform a passive, “outside the loop” monitoring role, rather than an active, “doing” role. Problems that are associated with passive monitoring have been recognized since World War 2 (e.g., in the performance of radar operators); laboratory tests carried out more than 60 years ago confirmed that a major drop in vigilance typically occurred after approximately half an hour [49]. Since then, many further studies have confirmed this monitoring problem, in particular, where a system is highly reliable [50]. As loss of vigilance reflects a universal human limitation, accidents that have resulted from passive monitoring failures should not be regarded as due to human operator errors; rather, they are due to automation system design errors.

B. Workload

Automation typically reduces crew mental workload during low-load periods, but increases it during high-load periods [51]. This workload problem is exacerbated when problems arise that impose further mental workload during these high workload times, e.g., diversions or delays during an aircraft landing phase. At such times, the crew workload is typically higher than for a non-automated system, allowing any incipient failures of the automation system, or errors in its programming, to develop into a critical situation.

C. Situation Awareness

As out-of-the-loop monitors of automation, flight crews typically have reduced situation awareness of the current “flying state” of the aircraft, as well as of the detailed mode of the automation system, its constraints and its likely future behavior [52]. There are many examples of this automation-linked loss of situation awareness leading to accidents, for example, the crash of Air France flight AF447 [53].

D. Automation Complexity and Pilot Confusion

Increasing concerns were expressed during the 1990s because of the large number of incidents and accidents that resulted from pilot confusion with flight automation systems [54], in particular mode confusion [55]. In 1996, the U.S. Federal Aviation Administration (FAA) produced a report on the interfaces between flight crews and flight deck automation systems [56]. This report stated that there were significant vulnerabilities in the flight crew/automation interfaces across all transport category aircraft, and that these vulnerabilities adversely affected the crews’ situation awareness and management of automation. The FAA report also stated that the problems occurred at the system level, i.e., they could not be dismissed as isolated machine or human errors; examples given in the report included incidents and crashes due to the automation system changing flight modes without informing the pilots. Since the 1996 FAA report, the level and complexity of flight automation have increased, and clear evidence has emerged of increasing flight crew difficulties with automation. An updated FAA report is soon to be released; reports based on the draft version confirm that, in general, pilots are neither adequately trained for modern “glass cockpit” automation nor for competent manual flying when flight automation systems hand over control [57].

E. Operator/Crew Manual Skills and Training

Gillen [58] examined the basic instrument flying skills of a sample of U.S. airline pilots; the average grades assigned were significantly below the minimum requirements for U.S. Air Transport Pilot certification. Wood [59] reported on anecdotal evidence that was related to the loss of manual flying skills experienced by pilots of highly automated aircraft. Fanjoy and Young [60] reported that training and airline policies on automation could lead to pilot complacency and the deterioration of flying skills due to lack of opportunities to practice. Following a study of the performance of pilots during refresher training courses, Young et al. [61] reported that the flight crews who utilized the most flight deck automation also exhibited poorer manual flying skills than others.

Commercial pilots undertake regular proficiency checks on their flying skills, but have few opportunities to practice these skills. Ebbatson [62] reported on recent U.K. incidents and nonfatal accidents that typically occurred shortly before the pilot flying was due to undertake an operational proficiency check in a flight simulator. These incidents and accidents typically involved highly experienced crews who knowingly switched to manual control under good flying conditions. Such skill-related accidents and incidents have major implications for crew performance in abnormal situations where the automation system disengages without prior warning.

Manually operated sociotechnical systems require operators to possess high levels of process knowledge and system experience in order to operate them effectively. As the level of automation has increased, the levels of experience and skill necessary to recover successfully from a failure or unexpected situation that automation cannot handle have increased, rather than decreased. This is because there is typically a much lesser period of time during which the operator can become aware of, and react to, a developing process problem; the operator has to cope with automation-related problems as well as process-related problems.

In the cases of the ditching of a US Airways Airbus A320 on the River Hudson in 2009 [63] and the uncontained engine failure of a Qantas Airbus A380 over Batam Island, Indonesia in 2010 [64], it was only the exceptional experience, manual flying skills, and judgments of pilots and crews that enabled successful outcomes. In the case of the crash of an Air France A330 in the Atlantic in 2009 [53], the pilot flying had much less experience, in particular of manual stall recovery. As the “baby boomer” pilots who have extensive manual flying experience retire, they are being replaced by pilots who have little prospect of gaining such experience.

F. Automation–Induced Incidents and Accidents

Changes in workload, reductions in situation awareness, mode confusion, reduced operator skills, automation failures and unexpected behaviors have caused many incidents and accidents over the past three decades; see Table II for recent examples [65], [66], [67]. With the increasing complexity of modern flight decks, such events are likely to increase.

TABLE II EXAMPLES OF INCIDENTS DUE TO AUTOMATION FAILURE

The situation following handover from automation to manual control due to extreme conditions or failure is particularly problematical; in many cases, the automation system (e.g., flight management system, FMS) will have maintained stability until it is no longer possible to do so, resulting in the aircraft going out of control as the flight crew take over. If damage has occurred or instrumentation has failed, FMS advice is often misleading or incorrect, and flight crews are typically overloaded with excess information and alarms. For example, the crew of the earlier-described Qantas Airbus A380 was faced with more than 50 simultaneous FMS alarms, including recommendations to pump fuel into badly holed wing tanks.

Current aircraft automation systems are largely unintelligent. For example, fuel movements, trimming of flying surfaces and changes to engine thrust take place with little if any assessment of the larger picture beyond the immediate trigger. In particular, logical evaluations of situations (why am I doing this?) and predictions of the likely results (what will happen if this continues?) are not generally carried out.

G. Automation as an Inadequate Crew Member

For over two decades, researchers have expressed concerns about the direction that automation, in particular aircraft automation, is heading. They have highlighted the paucity and low quality of interaction between crews and automation systems [68], and the need for multisensory feedback to crews [69]. It is now increasingly recognized that the automation element of a sociotechnical system typically acts as a poorly trained, incommunicative member of the system's crew—largely defeating the purpose of crew resource management training. In order for a crew to achieve the level of shared situation awareness required for safe operation, the automated system must become, to an adequate degree, part of the crew.

If an operator or crew member of a sociotechnical system has to make a number of unusual adjustments, he or she will typically remark on this to others, thus maintaining shared situation awareness; current automated systems may indicate adjustments (or states) on one of many dials or screens, but they do not typically draw attention to them—in part because they lack situation awareness of the “bigger picture,” i.e., of the potential ramifications. Many incidents and crashes have occurred due to this lack of communication, e.g., the China Airlines Boeing 747 loss of control near San Francisco, CA, USA [70], the China Airlines Airbus A300 crash at Nagoya, Japan [71] and the Boeing 757 crash at Cali, Colombia [72]. In each of these cases, clear communication from the automation system and, where appropriate, desisting from counteracting the human pilots’ actions, would have prevented the subsequent incidents.

H. Automation and National Culture

Sherman et al. [73] surveyed the attitudes of 5879 airline pilots from 12 nations toward flight deck automation. They listed 15 automation-related statements (e.g., “I am concerned about losing skills,” “More automation is better,” “I prefer automation”), and for each country and for each statement, calculated the percentage of pilots who agreed [73, Table IV]. Sherman et al. reported that the influence of national culture on the pilots’ agreement (or otherwise) was far greater than that of organizational culture or pilot experience. The authors of this paper carried out a further statistical analysis of Sherman et al.'s survey results against default national cultural scores, which provided additional insights. For example, the “more automation is better” scores were positively correlated with national PDI scores, and the “I prefer automation” scores were positively correlated primarily with national PDI and UAI scores. The strongest correlation appeared to be between national cultural scores and “Should always use auto”; support for this statement was strongly negatively correlated with masculinity (MAS), and positively correlated with PDI. Of the remaining statements, five were positively or negatively correlated with UAI and seven were not significantly correlated with national cultural scores. It appears, from the aforementioned survey results, that high PDI pilots and high UAI pilots are generally more positive about automation than are their lower PDI and lower UAI colleagues. The aforementioned results should be treated with caution as they are based on only 12 countries, four of which are Anglo.
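The statement-by-statement screening described above can be illustrated as follows; the dimension scores and agreement percentages in the sketch are synthetic placeholders, not Sherman et al.'s survey data, and serve only to show the form of the analysis.

```python
# Illustration only: correlate, for each survey statement, the per-country
# percentage of pilots agreeing with that country's cultural dimension scores.
# All numbers are synthetic placeholders, NOT Sherman et al.'s data.
import numpy as np

pdi = np.array([30, 45, 60, 70, 80, 95])        # placeholder PDI scores
uai = np.array([25, 40, 55, 65, 85, 90])        # placeholder UAI scores

agreement = {                                    # placeholder % agreement
    "More automation is better": np.array([35, 42, 55, 60, 71, 80]),
    "I prefer automation":       np.array([40, 44, 50, 63, 70, 78]),
}

def pearson(x: np.ndarray, y: np.ndarray) -> float:
    return float(np.corrcoef(x, y)[0, 1])

for statement, agree in agreement.items():
    print(f"{statement}: r(PDI) = {pearson(pdi, agree):.2f}, "
          f"r(UAI) = {pearson(uai, agree):.2f}")
```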

I. Summary of Automation Safety Issues

To date, increased automation, in particular aircraft automation, has resulted in changes to primary crew functions (from adept “doers” to inadequate “monitors”), increases in crew workloads when unexpected events or emergencies occur, reductions in crew situation awareness and downgrading of crew “hands-on” skills. As a result, when the automation system fails, manual recovery is compromised. Developments in automation, combined with airline crew flight training policies, are resulting in an increasing gap between actual and required crew capability and situation awareness. In terms of Reason's “Safety Space” model [74], flight automation systems are moving rapidly in the direction of increasing vulnerability, as are other complex sociotechnical systems, for example, nuclear power stations and refineries.

Fig. 3 summarizes the relationships between current automation policies and performance.

Fig. 3. Relationships between current automation systems, training policies, and team performance in emergencies.

SECTION VI

FUTURE REQUIREMENTS OF COMPLEX SOCIOTECHNICAL SYSTEMS

It is important that future developments in sociotechnical system automation take into account the encultured human operator; some of the implications of this are discussed in the following sections.

A. Culturally Sensitive Designs

Designers of complex systems must include user (operator or crew) properties in their specifications, if they are to deliver reliable, safe systems. To date, far from designing culturally sensitive, or even culturally neutral systems, designers have assumed that their systems will be operated by culturally similar users.

It would be prohibitively costly to produce complex sociotechnical systems that are highly optimized for individual cultures; in addition, in many cases (e.g., commercial aircraft), a particular organization may employ operators or crews from several cultures. However, research has been applied to the design of interfaces to accommodate a wide range of cultures, and this shows promise. Ford and Kotze [75] carried out statistical tests of the accuracy, speed, and satisfaction of users from various cultural backgrounds when utilizing ten test interfaces based on Marcus’ design guidelines [76].

Ford and Kotze found that interface designs for high PDI, high MAS, high UAI, and STO users were equally effective for low PDI, low MAS, low UAI, and LTO users; the aforementioned options are gray highlighted in Table III. Whereas such designs do not tackle directly the problems that are associated with team member communication (e.g., in high PDI, low IDV crews), they could contribute to increased shared situation awareness.

TABLE III MATCHING INTERFACES TO CULTURE (BASED ON MARCUS’ GUIDELINES, SUMMARIZED TO INCLUDE MATERIAL OF RELEVANCE TO THIS PAPER)

The authors have developed tools that assess the degree of fit between operator culture and sociotechnical system [77]. These can provide guidance as to the aspects of the operator tasks that are at odds with their cultures.
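The authors' tools [77] are not specified in this paper. Purely as an illustration of the general idea, the sketch below shows one plausible shape such a culture/system fit assessment could take; the design-culture assumptions, tolerance, and advisory texts are hypothetical.

```python
# Hypothetical sketch of a culture/system fit check (not the authors' tool
# [77]): compare the cultural assumptions built into a system's design with
# the scores of the operator group, and warn where the gap lies in the
# risk-increasing direction (higher PDI/UAI, or lower IDV, than assumed).

DESIGN_ASSUMPTIONS = {"PDI": 35, "IDV": 85, "UAI": 40}   # assumed design culture
RISK_DIRECTION = {"PDI": +1, "IDV": -1, "UAI": +1}       # direction that adds risk
TOLERANCE = 25                                            # assumed acceptable gap

ADVICE = {
    "PDI": "subordinates may not challenge or question the senior operator",
    "IDV": "crew may rely on indirect, high-context communication",
    "UAI": "crew may lack realistic emergency-exercise experience",
}

def fit_report(operator_profile: dict) -> list:
    """Return culture/system mismatch warnings for one operator group."""
    warnings = []
    for dim, assumed in DESIGN_ASSUMPTIONS.items():
        gap = (operator_profile[dim] - assumed) * RISK_DIRECTION[dim]
        if gap > TOLERANCE:
            warnings.append(f"{dim} mismatch of {gap}: {ADVICE[dim]}")
    return warnings

# Example: an operator group with high PDI, low IDV, and high UAI scores.
for line in fit_report({"PDI": 75, "IDV": 25, "UAI": 80}):
    print(line)
```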

B. Automation as a Crew Member

Studies have demonstrated that crew performance under time pressure improves with more naturalistic automation-crew interactions. For example, Thomas [78] evaluated pilots’ abilities to deal with non-normal flight deck events when automation assisted to various levels with checklists and used voice messages to communicate with them. Pilot workload ratings were reduced when they interacted with the automation similarly to how they interacted with their copilots; pilots also preferred this level of automation, partly because it provided insights into automation activities.

Klein et al. [79] proposed ten challenges that automation must tackle in order to become a team player, including modeling crew members, collaborating and managing attention. It is clear from such challenges that a significantly higher level of intelligence than is currently practicable will be required to fully meet the goal of “automation as a team player.” However, as an interim step, the automation-crew member, when acting as “pilot-flying,” should be expected to keep the human crew informed of its activities and modes, to actively warn the crew before mode changes and to highlight any discrepancies or unusual occurrences. The automation system should also have sufficient intelligence to detect or predict problems and inform the human crew in advance of “throwing in the towel”; see Table IV for examples.

TABLE IV EXAMPLES OF AUTOMATION INTELLIGENCE

Verbally communicating problems such as those in Table IV to the crew would ensure shared situation awareness prior to the need to take corrective action, and would also change the crew members’ operating mode from passive monitoring. Such an interaction capability demands a higher order of computer “awareness” of flying-related activities and automation activities, more than is the case with current automation systems. However, a separate intelligent system that has “read access only” interfaces to FMS status, plans, and flight data could be developed and improved, without requiring complete redevelopment and recertification of the current automation technology.
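As a rough illustration of such a separate, read-only monitoring layer, the sketch below applies a few Table IV-style checks to a snapshot of aircraft and automation state and returns plain-language advisories; the data fields, thresholds, and rules are illustrative assumptions rather than a description of any real FMS interface.

```python
# Hypothetical sketch of a read-only monitoring layer that verbalizes
# developing problems to the crew before corrective action (or an automation
# handover) becomes urgent. Fields, thresholds, and rules are assumptions.
from dataclasses import dataclass

@dataclass
class Snapshot:
    fuel_transfer_active: bool
    target_tank_kg: float          # quantity in the tank being filled
    target_tank_kg_prev: float     # same quantity one sample earlier
    control_authority: float       # 0.0 (no margin left) .. 1.0 (full margin)
    planned_mode: str
    current_mode: str

def advisories(s: Snapshot) -> list:
    """Return plain-language advisories for the crew, one per detected issue."""
    msgs = []
    # Fuel is being pumped but the receiving tank is not filling: possible leak.
    if s.fuel_transfer_active and s.target_tank_kg <= s.target_tank_kg_prev:
        msgs.append("Fuel transfer is active but tank quantity is not rising; "
                    "possible leak, recommend stopping the transfer.")
    # Warn while control margin is shrinking, before the automation gives up.
    if s.control_authority < 0.2:
        msgs.append("Automation control margin is low; be prepared to take "
                    "manual control.")
    # Announce mode discrepancies rather than changing modes silently.
    if s.current_mode != s.planned_mode:
        msgs.append(f"Current mode is {s.current_mode}, planned mode was "
                    f"{s.planned_mode}; please confirm.")
    return msgs

sample = Snapshot(True, 5120.0, 5150.0, 0.15, "APPROACH", "GO-AROUND")
for msg in advisories(sample):
    print(msg)
```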

Culture further complicates the development of the “automation-as-a-crew-member” concept. Crews of different nationalities have differing automation-related preferences (see Section V-H), and may require differing interactions with automated crew members; for example, high PDI crews may prefer authoritative (command orientated) automated crew members, whereas low PDI crews may prefer them to adopt a more advisory role. A long-term goal would be to incorporate cognitive models of encultured human crews into the automation system in order to enable effective interaction.
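A simple illustration of such culture-adapted interaction follows: the same advisory is phrased in a command-oriented style for high PDI crews and in an advisory style for low PDI crews. The threshold and phrasings are assumptions made for this sketch only.

```python
# Illustrative only: adapt the phrasing of an automation message to crew PDI.
def phrase_advisory(issue: str, action: str, crew_pdi: int,
                    pdi_threshold: int = 60) -> str:
    if crew_pdi >= pdi_threshold:
        # Authoritative, command-oriented wording for high PDI crews.
        return f"{issue}. {action} now."
    # Advisory wording that invites crew judgment for low PDI crews.
    return f"{issue}. Suggest you {action.lower()} when able."

issue = "Fuel transfer is not raising tank quantity"
action = "Stop the transfer"
print(phrase_advisory(issue, action, crew_pdi=80))
print(phrase_advisory(issue, action, crew_pdi=30))
```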

C. Crew Training

Manually operated sociotechnical systems require high levels of operator process knowledge and system experience for safe and effective operation. As the level of automation has increased, the experience and knowledge necessary for a successful recovery from a failure or unexpected situation has increased, rather than decreased. The new multicrew pilot's license (MPL) training framework offers practical training improvements, as it specifies in detail realistic simulator-based training [80]; it also aligns with requirements for simulator-based refresher courses and “recovery from failure” exercises that inculcate the mental schemas for prompt corrective actions when faced with sociotechnical system failures.

SECTION VII

CONCLUSION

This paper presents evidence of the effects of cultural traits (as expressed in terms of Hofstede's framework) on intrateam communication, situational awareness, and decision making, and the consequent effects on sociotechnical system accident rates. The authors examined problems that are associated with current approaches to airliner automation, where there is often a dichotomy between system designer cultures and system user cultures, leading to errors, misunderstanding, and in the worst cases, system failures. This dichotomy is shown to be compounded by current design trends that ignore universal human limitations associated with the passive monitoring of reliable systems. This leads to reductions in operator/crew skills and reduced shared situation awareness, resulting in severely compromised abilities to recover from automation failures.

Pilots with high manual flying experience are now retiring, giving extra urgency to improved training (e.g., based on the MPL framework). Although some piecemeal developments can usefully be made to improve the intelligence of automated systems and move toward automation as a crew member, a long-term solution would be to incorporate cognitive models of encultured crew members in the automation system; this would require in-depth cross-disciplinary research. The aim is to make the automated system a cooperative, culturally sensitive, effective member of the sociotechnical system crew.

ACKNOWLEDGMENT

Research data for this paper is available on request from A. Hodgson.

Footnotes

This work was supported by the U.K. Engineering and Physical Sciences Research Council. This paper was recommended by Associate Editor J. Marquard.

The authors are with the School of Electronic, Electrical and Systems Engineering, Loughborough University, Loughborough LE11 3TU, U.K. (e-mail: a.hodgson@lboro.ac.uk; c.e.siemieniuch@lboro.ac.uk; e.hubbard@lboro.ac.uk).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.


Authors

Allan Hodgson

Allan Hodgson is currently working toward the Ph.D. degree with the School of Electronic, Electrical and Systems Engineering, Loughborough University, Loughborough, U.K.

His current research interests include the effects of culture on the performance of teams. His past research interests include technology road mapping, next generation manufacturing, systems integration, discrete simulation and production planning & control.

Carys E. Siemieniuch

Carys E. Siemieniuch received the B.A. degree in French Studies from Manchester, U.K., in 1974, the M.Sc. degree in information technology in 1987 and the Ph.D. degree in organizational systems design in 2005, both from Loughborough University, U.K.

She is currently a Professor of systems engineering with the School of Electronic, Electrical and Systems Engineering, Loughborough University, Loughborough, U.K., and has worked as a Systems Ergonomist for more than 25 years. She has developed new understandings about tacit knowledge, systems design, impact of cultural factors, enterprise system modeling, and emergent system behaviors.

Ella-Mae Hubbard

Ella-Mae Hubbard received the M.Eng. degree in systems engineering from Loughborough University, U.K., in 2005.

From 2005 to 2008, she worked as a research associate with the Department of Human Sciences at Loughborough University on the Knowledge and Information Management Through Life Grand Challenge. Alongside this post, she also completed her Ph.D., investigating the configuration of decision-making systems. She joined the School of Electronic, Electrical and Systems Engineering, Loughborough University, Loughborough, U.K., as a Lecturer, in 2008. Her research interests include organizational systems, decision-making, accident investigation, and the role of humans. She also has pedagogical research interests in the methods of teaching and training systems engineering and associated disciplines.
