A Just Culture in Safety Reporting


43RD ANNUAL CONFERENCE, Hong Kong, China (SAR), 22-26 March 2004

WP No. 168

Introduction

At the 42nd Annual Conference in Buenos Aires, SC4 was tasked with investigating Non-Blame Culture (NBC) and voluntary reporting. During the SC4 proceedings it became apparent that the title “non-blame culture” had to change to “a just culture in safety reporting”.

This paper makes the case for a just culture in safety reporting and formulates policy for its implementation.

Discussion

Our Opinion

IFATCA policy (IFATCA Manual, page 4423, paragraph 2.2, Air Safety Reporting Systems) states:

“Whereas IFATCA thinks a Voluntary Reporting System is essential, MAs should promote the creation of Air Safety reporting systems and Confidential Reporting Systems among their members”


Additionally,

“IFATCA shall not encourage MAs to join Voluntary Incident Reporting System unless there is a guaranteed immunity for the controller who is reporting.

Any voluntary incident reporting system shall be based on the following principles:

a)  In accordance and in co-operation with the pilots, air traffic controllers and ATC authorities;

b)  The whole procedure shall be confidential, which shall be guaranteed by law;

c)  Guaranteed immunity for those involved, executed by an independent body.”


At the 11th ICAO Air Navigation Conference, IFATCA presented a paper under agenda item 2.1, “Safety Management Systems and Programs”, titled “The need for a just culture in aviation safety management”. All recommendations of this paper, reproduced below, were accepted by ANC11. Point 1 gives a clear definition of what IFATCA understands by “just culture”.

“Action by the Conference:

1. The Conference is invited to task ICAO with developing guidelines promoting the concept of a “just” culture; i.e. a culture in which front line operators are not punished for actions or decisions that are commensurate with their experience and training, but also a culture in which violations and wilful destructive acts by front line operators or others are not tolerated.

2. The Conference is invited to task ICAO to ensure that judicial aspects are adequately addressed in ICAO guidance material on Safety Management Systems for ATS.

3. The Conference is invited to encourage ICAO Member States to:

a) review existing aviation laws with the aim to remove factors that could be deterrents to the collection and analysis of valuable safety-related information; and/or

b) develop legislation that adequately protects all persons involved in the reporting, collection and/or analysis of safety-related information in aviation.”

Accidents with serious consequences are difficult to hide, and reporting accuracy is likely to be good for such events. In general, the reporting of accidents with minor consequences is likely to be less complete, and air proximities or lesser incidents are often not reported at all. In the absence of conflict detection systems, whether ground-based (ATC radar) or airborne (TCAS), the responsibility to report an incident rests solely with the controller.

The past performance of an organisation with regard to accident and incident investigation could have a significant bearing on the controller’s input to the reporting program. It is generally acknowledged that the seriousness of the outcome of a dangerous event is often a matter of chance. Similar circumstances may in one case lead to multiple fatalities and in another case to an outcome of no consequence. Only by learning from the greatest variety and volume of data can safety performance be monitored and continually improved.

A safety program based solely on recorded accident data would be very limited. By its very nature, a near miss may well be an incident that can be concealed and not reported. Among the factors likely to discourage reporting is the perception that to do so may result in blame for the reporter or a colleague, with consequential sanctions. In a real sense, the difficulty of ensuring complete reporting is, at least in part, a function of an organisational or societal blame culture.

One key element in achieving an effective Safety Management System (SMS) is the cultivation of a “just culture” within a non-punitive environment by ATM service providers, ATM safety regulators and investigators. An effective reporting culture depends on how the organisation handles blame and punishment. A “no-blame” culture per se is neither feasible nor desirable. A small proportion of human unsafe acts is deliberate (e.g. criminal activity, abuse of controlled substances, reckless non-compliance, sabotage) and as such deserves sanctions of appropriate severity. A blanket amnesty on all unsafe acts would lack credibility in the eyes of the workforce and could be seen to oppose natural justice.

It must be remembered, though, that only a very small percentage of controller/pilot error can be attributed to a lack of the intellectual, physical or emotional ability to do the job required. The common reason that people fail to perform is that they work in a flawed system, so it is much more logical to assume that the systems and processes are problematic. Human factors in a critical circumstance can lead those involved into error (e.g. poor information in the form of inputs, lack of clarity about the expected outputs, lack of appropriate feedback, inadequate training and possible task interference). Investigating an incident from both a systems perspective and a human factors angle allows an organisation to collect sufficient credible data to make recommendations that guide its Safety Change Process. To simply rest blame with an individual is an expedient way of removing blame from the organisation and effectively masks a flaw.

What is needed is an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information. Even in the case of a deliberate violation, it is important to understand why it took place, whether the procedures are workable and correct, and whether there were any extraneous pressures that may have contributed to the incident. It is common for controllers to modify or create ad hoc procedures over time, and the consequences of their actions may not be apparent. Yet an incident can expose such a procedure as a wilful violation even though it was never intended as one.

With the widespread implementation of Safety Management Systems (SMS), one vital element is often omitted: safety and incident reporting by staff. How the system uses that information once it is submitted is equally vital. If the two NASA Space Shuttle accidents are used as an example, it could be argued that there was inadequate reporting by staff and equally inadequate implementation of safety enhancement programs in the light of the data that had been reported. Yet the NASA Administrator, Dan Goldin, stated in an address in 2001:

“There needs to be accountability with management. There should be no fear – management should not play the blame game by pointing a finger and saying, ‘you screwed up’.”

In hindsight, it would seem that management in this case failed to implement both a sound reporting system and a satisfactory safety enhancement program. How many organisations, at the management and SMS level, hold that same self-image when viewed from the top down?


Collective Aviation Opinion

The following are the expressed views of aviation organisations on the prosecution of employees and the resultant effect on safety. In a recent statement to the world’s media, the IFALPA President strongly denounced the growing trend of apportioning blame following aviation accidents. The threat of civil or criminal proceedings for violations of aviation safety laws and regulations is having a profound and damaging effect on the flow of precious aviation safety information, which is essential if lessons are to be learned from accident investigations. IFALPA is supported in its concern by many prominent international organisations, among which the following comments have been made:

ICAO – states that the accident investigation authority shall have independence in the conduct of the investigation and have unrestricted authority over its conduct. It is clear in its recommendation that any judicial or administrative proceedings to apportion blame or liability should be separate from the accident investigations (Annex 13, Chapter 5). Furthermore, the 11th Air Navigation Conference adopted recommendation 2/4 “The protection of sources of safety information” stating:

“That ICAO develop guidelines which will provide support to states in adopting adequate measures of national law, for the purpose of protecting the sources and free flow of safety information, while taking into account the public interest in the proper administration of justice.”

FAA – has asked ICAO to urge its Member States to review their laws and regulations to determine whether the possibility of criminal liability is blocking the collection and analysis of information that could provide statistics.

Flight Safety Foundation – Accidents will be prevented and further improvements in aviation safety will be gained if people, particularly pilots, are protected from punitive action. Legislation should be enacted to ensure that pilots and others in the aviation industry will be able to report voluntarily without fear of criminal charges.

Boeing – Flight and maintenance crews are often unduly exposed to blame because they are the last line of defence when unsafe conditions arise. We must overcome this “blame” culture and encourage all members of our operations to be forthcoming after any incident.

Airbus Industries – believes that data sharing will significantly increase if governments “decriminalize” aviation laws and pass legislation that adequately protects the persons involved. The statement concluded: “All the significant partners in aviation agree that we must encourage a proper safety culture. Such a culture would not have the apportioning of blame as a characteristic. Such a culture would not indulge itself in recrimination and speculation. Such a culture would be supportive of those who commit error and tell about it so that we can all learn. Such a culture must come into place worldwide if we are to achieve the safety record that the public will demand as air traffic increases.”


Current Legal Barriers

So how do we engineer a reporting culture that fosters this trust? The current domestic laws of many European countries provide for the possibility, and even an obligation, for administrative and/or criminal authorities to investigate occurrences that meet specified criteria and to punish improper behaviour under certain conditions. In these countries, certain categories of air navigation safety occurrences qualify as administrative or criminal offences.

A criminal investigation will, depending on the applicable law, take place:

a)  following an accident; and/or

b)  following an incident which qualifies as severe in terms of concrete danger, by nature a very subjective appreciation.

Criminal prosecution will follow if there appears to be evidence that the occurrence was caused by the behaviour of an ANS operator. A criminal sentence will be pronounced if the behaviour of the ANS operator qualifies as (qualified) negligence or wilful misconduct.

Although they are triggered by a different motivation, administrative investigation and prosecution follow a similar pattern. It should be noted that, although an investigation, and even a prosecution, does not imply guilt and/or punishment, the process in itself is a lengthy trauma for the person experiencing it and is likely to have a strong and lasting destabilising effect on that person.

Under the legal regimes considered here, individual Air Traffic Controllers, ANS Providers and any other concerned party are under the obligation to report safety occurrences to the competent authority, which retains full discretion as to the consequential measures to be undertaken. A Safety Occurrence Reporting System may require formal changes in the domestic legal framework of the states concerned. However, provided the concept of “non-punitive environment” is well understood, these changes could be much more limited than those occasionally envisaged, such as amendments to the national criminal laws which rank extremely high in the domestic law hierarchy and are consequently particularly difficult to change.


Constraints

It is neither an obvious nor an easy task to persuade people to file reports on safety occurrences, especially when it may entail divulging their own errors, because:

a)  human reaction to making mistakes does not lead to frank confessions;

b)  potential reporters cannot always see the added value of making reports, especially if they are sceptical about the likelihood of management acting upon the information;

c)  there exist trust problems (do I get myself or my colleagues into trouble?) and fear of reprisals;

d)  no incentives are provided to voluntarily report in a timely manner.

Qualities that have been defined as essential to successful reporting systems include:

– MOTIVATION and PROMOTION: Staff must be motivated to report, and that motivation must be maintained.

– EASE of REPORTING: Staff must not perceive voluntary reporting as an extra task.

– ACKNOWLEDGEMENT: Reporters like to know whether their report was received, what will happen to it, what to expect and when.

– INDEPENDENCE: Some degree of independence must be granted to the managers of the reporting system.

– FEEDBACK: Feedback must eventually come; otherwise the system will die out.

– TRUST: All of this can only happen if trust between reporters and the managers of the reporting system genuinely exists.


Confidentiality Aspects

The rationale for any reporting system is that valid feedback on the local and organisational factors promoting errors and incidents is far more important than assigning blame to individuals. To this end, it is essential to protect reporters and their colleagues, as far as practicable and legally acceptable, from disciplinary actions taken on the basis of their reports. But there have to be limits to this indemnity. Some examples of where the line can be drawn are to be found in:

  • Waiver of Disciplinary Action issued in relation to NASA’s Aviation Safety Reporting System (see FAA Advisory Circular AC No. 00-46D Aviation Safety Reporting Program);
  • FAA 14 CFR part 193 – Protection of Voluntarily Submitted Information.

One way of ensuring confidentiality protection and fulfilling the EUROCONTROL Confidentiality and Publication Policy is to be found in SRC WP.9.4, Safety Data Flow, a progress report submitted by the SDF-TF (not reproduced here). The experience gained over the last three years shows that the EUROCONTROL Confidentiality and Publication Policy is functioning and that States have started to gain trust in the SRU/SRC. This must be kept in mind, and the reporting chains should not be jeopardised or compromised by deviation from that policy.

Additionally, two things were made clear:

a) it would be unacceptable to punish all errors and unsafe acts regardless of their origin and circumstances;

b) it would be equally unacceptable to give a blanket immunity from sanctions to all personnel that could or did contribute to safety occurrences.

Limits must be set when gross negligence, criminal activity or intent on the part of an operator is established. In other cases, the operator(s) should not be subject to administrative or disciplinary sanction solely on the basis of a submitted report.

To engineer the required environment there is a need to agree upon a set of principles for drawing the line between acceptable and unacceptable actions. So where do we draw the line? How do we discriminate between the minority of “bad behaviour” and the vast majority of unsafe acts to which the attribution of blame is neither appropriate nor useful? In a complex working environment, there can never be a black-and-white solution.

The following is a protocol for dealing with procedure violations. There can be several categories of violation:

a)  Rule broken unintentionally – an action where the controller was not aware that he/she was breaking the rules;

b)  Routine violation – an action that was frequent, routine, or “standard practice” amongst controllers, although technically “against the rules”;

c)  Exceptional violation – a potentially dangerous course of action that was against the procedures but may have been considered necessary under the circumstances;

d)  General violation – a violation that occurred due to situational or undetermined factors and that was unnecessary.

The examples above are given to highlight the complexity of the situations that investigators might face when detecting and analysing procedure violations.


Concerns about information misuse

One of the major problems with systematically collecting and analysing large quantities of information is that information can be a very powerful tool; like any powerful tool, it can be used properly with great benefit, or it can be used improperly and cause considerable harm.


Punishment/enforcement

Potential information providers may be concerned that company management and/or regulatory authorities might use the information for punitive or enforcement purposes. For example, a controller may feel that an ATC procedure is ambiguous in certain circumstances, yet be reluctant to report his/her interpretation for fear that management may disagree and bring the controller’s technical ability into question. The more detrimental outcome would be if the controller did question the procedure and was chided for doing so, or worse still, faced some form of disciplinary measure.

Such treatment causes two problems:

First, the ambiguous procedure will remain in use in the system, potentially confusing other controllers.

Second, and far worse, such treatment in effect “shoots the messenger”. By shooting a messenger, management or the government effectively guarantees that they will never again hear from any other messengers. This, in turn, guarantees that the problems in the “unreported occurrences” part of the pyramid will remain unreported until, of course, they cause an accident or incident, whereupon the testimony at the accident hearing, once again, will be that “we all knew about that problem”.

One aviation regulator, the United Kingdom CAA, announced some years ago that, absent egregious behaviour (e.g. intentional or criminal wrongdoing), it would not shoot the messenger, and it encouraged airlines and other aviation industry employers to take the same approach.

That is a major reason why the UK has some of the world’s leading aviation safety information sharing programs, both government and private. The type of facilitating environment created by the UK is essential for the development of effective aviation safety information collection and sharing programs.

Similarly, British Airways gave assurances that it too would not “shoot the messenger” in order to obtain information from pilots, mechanics and others for BASIS (the British Airways Safety Information System). Many other airlines around the world are concluding that they must do the same in order to obtain the information they need to be proactive about safety.

Significant progress has also been made on this issue in the U.S. In October 2001, the FAA promulgated a regulation, modelled on the UK example, to the effect that information collected by airlines in FAA-approved flight data recorder information programs (commonly known as Flight Operational Quality Assurance (FOQA) programs) will not be used against the airlines or their pilots for enforcement purposes (FAA 14 CFR part 13.401, Flight Operational Quality Assurance Program: Prohibition against use of data for enforcement purposes).


Public Access

Another problem in some countries is public access, including media access, to information that is held by government agencies. This problem does not affect the ability of the aviation community to create programs such as the Global Aviation Information Network (GAIN), but it could affect the extent to which government agencies in some countries will be granted access to any information from GAIN.

Thus, in 1996 the FAA obtained legislation, Public Law 104-264, 49 U.S.C. Section 40123, which requires it to protect voluntarily provided aviation safety information from public disclosure.

This does not deprive the public of any information to which it would otherwise have access, because the agency would not otherwise receive the information. On the other hand, there is a significant public benefit in the FAA having the information, because the FAA can use it to help prevent accidents and incidents.


Confidential Reporting Schemes

Several MAs have established confidential reporting schemes. The following is an outline of the Australian model, Confidential Aviation Incident Reporting (CAIR):

The Confidential Aviation Incident Reporting program was introduced in July 1988 as a means of reporting incidents and safety deficiencies while preserving the confidentiality of the reporter. The CAIR program is open to anyone who wishes to submit an incident report or safety deficiency confidentially. It does not replace the mandatory aircraft accident and incident reporting requirements and should not be used when a notification falls under the requirements of the Air Navigation Act Part 2A s19BA and s19BC. Its focus is not on individuals, but on systems, procedures and equipment.

CAIR reports have helped to identify deficiencies or provide safety enhancement education in most areas of aviation, including flying operations, passenger and cargo handling, fuel, maintenance and servicing, air traffic services, airport security and other airport issues, communications and navigation aids, management concerns and airspace changes.


Criminal Sanctions

A major obstacle to the collection and sharing of aviation safety information in some countries is the concern about criminal prosecution for regulatory infractions.

Very few countries prohibit criminal prosecutions for aviation safety regulatory infractions. “Criminalisation” of accidents has not yet become a major problem in the U.S., but the trend from some recent accidents suggests the need for the aviation community to pay close attention and be ready to respond.


Civil Litigation

One of the most significant problems in the U.S. is the concern that collected information may be used against the source in civil accident litigation. Significantly, the thinking on this issue has changed dramatically in recent years because the potential benefits of proactive information programs are increasing more rapidly than the risks of such programs.

Until very recently, the concern was that collecting information could cause greater exposure to liability. The success stories from the first airlines to collect and use information, however, have caused an evolution toward a concern that not collecting information could result in increased exposure.

This evolution has occurred despite the fact that the confidentiality of information collection programs does not necessarily prevent discovery of the information in accident litigation. Two cases in the U.S. have addressed the confidentiality question in the context of aviation accidents, and they reached opposite results.

In one case, the judge recognised that the confidential information program would be undermined if the litigating parties were given access to the otherwise confidential information. Thus, he decided, preliminarily, that it was more important for the airline to have a confidential information program than it was for the litigating parties to have access to it. In the other case, the judge reached the opposite result and allowed the litigating parties access to the information.

As this issue is decided in future cases, in aviation and other contexts, it is hoped that the courts will favour exempting such programs from the usual, and normally desirable, broad scope of litigation discovery. However, present case law is inconsistent, and future case law may not adequately protect the confidentiality of such programs. Thus, given the possibility of discovery in accident litigation, aviation community members will have to weigh the potential benefits of proactive information programs against the risks of litigation discovery when deciding whether to establish them.


The Danish Solution

In 2001, after a number of safety occurrences had gone unreported and investigations had been conducted into this under-reporting, the Danish government proposed a law that would make non-punitive, strictly confidential reporting possible.

After the law was passed in May 2001, the Danish CAA, together with the ANSP, put in place a new safety management system based on a compulsory reporting scheme, guaranteeing full confidentiality as long as no accident, deliberate sabotage or negligence due to substance abuse is involved.

The four main pillars of the investigation system are:

a)  compulsory reporting, ensuring that information is received on any possible safety occurrence;

b)  confidentiality and trust, rather than anonymity, enabling additional information to be gathered if required and feedback to be given to reporters on the report and the analysis of the event;

c)  the non-punitive nature of the system, granting immunity from sanctions within well-defined limits;

d)  a correct legal framework enabling a just reporting system.

The result of this new non-punitive, confidential reporting system is a drastic increase in the number of reports received, enabling the Safety Management System to remedy shortcomings and increase the overall level of safety.

Conclusions

The aviation industry has accepted that humans cannot be changed but nonetheless are required to make the system work safely. The legal world holds the view that the system is inherently safe and that the humans are the main threat to that safety. Safety improvements in the aviation system will be achieved as a result of an open exchange of information. Human error cannot be avoided by “designing it out of the system” or disciplining operators. Error is a normal component of human performance. This fact must be incorporated into the design, implementation and operation of complex systems where safety is the expected outcome. Air traffic management (ATM) systems are a prime example of such a complex system.

IFATCA proposes that all incident reporting schemes should contain the following principles and criteria:

  1. Mutual Trust,
  2. Openness of communication,
  3. A demonstration of care and concern,
  4. A commitment to organisational learning,
  5. A transparent management commitment to safety,
  6. An accountable and open system of information flow, i.e. Staff → ATM Management → Staff.

Specifically, the “Incident Reporting System” must give the possibility to report in confidence rather than anonymously, and provide some level of protection/immunity for the controller who is reporting deficiencies or anomalies. There should also be some guarantee of a fair hearing and of an objective and impartial investigation, and an assurance that any such information given is not used for punitive or enforcement purposes. Reliable and documented follow-up should be provided, with a detailed database available to all involved parties.

The Member Associations of IFATCA should play, where necessary, an important role in the process of creating the correct legal framework enabling a “just culture” in Safety Reporting.

Recommendations

In conjunction with IFATCA’s ANC 11 working paper No. 92, “The Need For A Just Culture in ATM Management”, SC4 calls upon ICAO and others to develop guidelines promoting the concept of a “just” culture; i.e. a culture in which front line operators are not punished for actions or decisions that are commensurate with their experience and training, but also a culture in which violations and wilful destructive acts by front line operators or others are not tolerated.

That IFATCA ask ICAO to ensure that judicial aspects are adequately addressed in ICAO guidance material on Safety Management Systems for ATS, including a provision for ICAO Member States to review existing aviation laws with the aim of removing factors that could be deterrents to the collection and analysis of valuable safety-related information.

That IFATCA Policy contained in The IFATCA Manual page 4423 paragraph 2.2 Air Safety Reporting Systems be amended as follows:

2.2. Air Safety Reporting Systems

A just culture in Safety Reporting can be defined as follows: a culture in which front line operators are not punished for actions or decisions that are commensurate with their experience and training, but also a culture in which violations and wilful destructive acts by front line operators or others are not tolerated.

2.2.1  MAs should promote the creation of Air Safety Reporting systems based on Confidential reporting in a just culture among their service provider(s), Civil Aviation Administration and members.

2.2.2  IFATCA shall not encourage MAs to join Incident Reporting Systems unless provisions exist that adequately protect all persons involved in the reporting, collection and/or analysis of safety-related information in aviation.

2.2.3  Any incident reporting system shall be based on the following principles:

a)  in accordance and in co-operation with pilots, air traffic controllers and Air Navigation Service Providers;

b)  the whole procedure shall be confidential, which shall be guaranteed by law;

c)  adequate protection for those involved, the provision of which shall be within the remit of an independent body.

References

IFATCA Manual 2003.

Eurocontrol website at https://www.eurocontrol.int.

Government Support Team of the Global Aviation Information Network (GAIN).

Aviation Safety Action Programs (ASAP) USA FAA.

FAA 14 CFR part 13.401, Flight Operational Quality Assurance Program.

ATSB Australia website at https://www.atsb.gov.au/.

FAA/JAA Annual Conference 5th June 2002. Phoenix, Arizona.

Institution of Electrical Engineers UK, Safety Culture.

The creation of an aviation safety reporting culture in Danish ATC; Peter Majgård Nørbjerg.
