Chapter 4

SAFETY PERFORMANCE MANAGEMENT

4.1 INTRODUCTION

4.1.1 Safety performance management is central to the functioning of SSPs and SMSs. Properly implemented, it provides an organization with the means to determine whether its activities and processes are working effectively to achieve its safety objectives. This is accomplished through the identification of safety performance indicators (SPIs), which are used to monitor and measure safety performance. The information obtained through SPIs allows senior management to be aware of the current situation, supports decision-making, and indicates whether further action is required to mitigate safety risks so that the organization achieves its safety goals.

4.1.2 The generic safety performance management process, and how it is linked with safety data collection and processing systems (SDCPS) and safety analysis (discussed in Chapters 5 and 6, respectively), is shown in Figure 7 below. The link to safety promotion is shown to highlight the importance of communicating this information throughout the organization. More information on safety promotion, an important but often underappreciated component of SSP and SMS, can be found in Chapters 8 and 9, respectively.

Figure 7. Safety Performance Management Process

4.1.3 Safety performance management helps the organization to ask and to answer the four most important questions regarding safety management:

1. What are the organization's top safety risks?

2. What does the organization want to achieve in terms of safety, and what are the top safety risks that need to be addressed? These are the organization's safety objectives.

3. How will the organization know if it is making progress toward its safety objectives? Through SPIs, SPTs and, if practicable, safety triggers.

4. What safety data and safety information are needed to make informed safety decisions, including decisions on the allocation of the organization's resources? Through the SDCPS and safety analysis.

4.1.4 The safety performance management process can also be used to establish an acceptable level of safety performance (ALoSP). More details on the establishment of an ALoSP can be found in Chapter 8.

4.1.5 Relationship between States and service providers

4.1.5.1 There are similarities between States and service providers in the use and application of safety performance management techniques. While the guidance in this chapter has been developed for both States and service providers, some differences are identified in this section.
4.1.5.2 The development of State safety performance management should focus on what the State considers to be the most important aspects of managing safety. For the State, an effectively implemented SSP is used as a decision-making tool for the management of safety performance, which should include: the safety performance of its service providers; the State's oversight capability; and the support provided to service providers through the establishment of guidelines. States should consider measuring their ability to:

a) maintain their safety oversight system;

b) apply specific safety actions and introduce safety initiatives; and

c) adapt existing safety risk controls to ensure they remain effective.

4.1.5.3 For service providers, the primary function of safety performance management is to monitor and measure how well they are managing their safety risks. This is achieved through the effective implementation of an SMS that generates information used to make decisions regarding the management of safety, including the implementation of safety risk controls and the allocation of resources.

4.1.5.4 The success of safety management depends on the commitment between the State and its service providers. There may be benefits in the State identifying suitable SPIs that could be monitored by service providers and then shared with the State, in particular for the establishment of the ALoSP (see Chapter 8 for more information). The information received from service providers will assist the State with its assessment of the safety performance of its aviation industry and of its own ability to provide effective oversight and support to service providers. However, service providers should ensure their SPIs are appropriate to their operational context, performance history and expectations.

4.1.6 Safety performance management and interfaces

4.1.6.1 When States and service providers are considering implementing safety management, it is important to consider the safety risks introduced by interfacing entities. Interfaces can be internal (e.g. between operations and maintenance, or with finance, human resources or legal departments) or external (e.g. other States, service providers or contracted services). Hazards and related risks at the interface points are among the most common contributors to safety occurrences. States and service providers have greater control over interface-related risks when their interfaces are identified and managed. Interfaces should be defined in the organization's system description.

4.1.6.2 States and service providers are responsible for ongoing monitoring and management of their interfaces to ensure safe outcomes. The safety risk posed by each interface should, ideally, be assessed collaboratively by the interfacing entities. Collaboration is highly desirable because perceptions of safety risks and their consequences may vary between the interfacing organizations. Sharing interface risk management, through the establishment and monitoring of SPIs, encourages mutual awareness of safety risks rather than ignorance or potentially one-sided risk management. It also creates an opportunity for the transfer of knowledge and working practices that could improve the safety effectiveness of both organizations.

4.1.6.3 For this reason, SPIs should be agreed and established to monitor and measure the risks and the effectiveness of mitigating actions.
A formal interface management agreement between interfacing organizations, with clearly defined monitoring and management responsibilities, is an example of an effective approach.

4.2 SAFETY OBJECTIVES

4.2.1 Safety objectives are brief, high-level statements of safety achievements or desired outcomes to be accomplished. Safety objectives provide direction to the organization's activities and should therefore be consistent with the safety policy that sets out the organization's high-level safety commitment. They are also useful for communicating safety priorities to personnel and to the aviation community as a whole. Establishing safety objectives provides strategic direction for the safety performance management process and a sound basis for safety-related decision-making. The management of safety performance should be a primary consideration when amending policies or processes, or when allocating the organization's resources in pursuit of improved safety performance.

4.2.2 Safety objectives may be:

a) process-oriented: stated in terms of safe behaviours expected from operational personnel or the performance of actions implemented by the organization to manage safety risk; or

b) outcome-oriented: encompassing actions and trends regarding the containment of accidents or operational losses.

4.2.3 The suite of safety objectives should include a mix of both process-oriented and outcome-oriented objectives to provide enough coverage and direction for the SPIs and SPTs. Safety objectives on their own do not have to be specific, measurable, achievable, relevant and timely (SMART) (George T. Doran, 1981), provided the safety objectives and accompanying SPIs and SPTs form a package that allows an organization to demonstrate whether it is maintaining or improving its safety performance.

Table 6. Examples of safety objectives

  Process-oriented (State or service provider): Increase safety reporting levels.
  Outcome-oriented (service provider): Reduce the rate of adverse apron safety events (high-level), or reduce the annual number of adverse apron safety events from the previous year.
  Outcome-oriented (State): Reduce the annual number of safety events in sector X.

4.2.4 An organization may also choose to identify safety objectives at the tactical or operational level or apply them to specific projects, products and processes. A safety objective may also be expressed using other terms with a similar meaning (e.g. goal or target).

4.3 SAFETY PERFORMANCE INDICATORS AND SAFETY PERFORMANCE TARGETS

4.3.1 Types of safety performance indicators

Qualitative and quantitative indicators

4.3.1.1 SPIs are used to help senior management know whether or not the organization is likely to achieve its safety objectives; they can be qualitative or quantitative. Quantitative indicators measure in terms of quantity (a number or a rate), whereas qualitative indicators are descriptive and measure in terms of quality. Quantitative indicators are preferred over qualitative indicators because they are more easily counted and compared; in practice, the choice of indicator depends most of all on the availability of reliable data that can be measured quantitatively. Does the necessary evidence need to be in the form of comparable, generalizable data (quantitative), or a descriptive picture of the safety situation (qualitative)? Each option involves different kinds of SPIs, and the best result is usually achieved through a thoughtful SPI selection process.
A combination of approaches is useful in many situations and can solve many of the problems that may arise from adopting a single approach. An example of a qualitative indicator for a State could be the maturity of its service providers' SMS in a particular sector; for a service provider, an example could be an assessment of its safety culture.

4.3.1.2 Quantitative indicators can be expressed as a number (x incursions) or as a rate (x incursions per n movements). In some cases, a numerical expression will be sufficient. However, using numbers alone may create a distorted impression of the actual safety situation if the level of activity fluctuates. For example, if air traffic control records three altitude busts in July and six in August, there may be great concern about a significant deterioration in safety performance. But August may have seen more than double the movements of July, meaning that the rate of altitude busts per movement has actually decreased, not increased. This may or may not change the level of scrutiny, but it provides another valuable piece of information that may be vital to data-driven safety decision-making.

4.3.1.3 For this reason, where appropriate, SPIs should be expressed as a relative rate so that the performance level is measured regardless of the level of activity. This provides a normalized measure of performance whether the activity increases or decreases. As another example, an SPI could measure the number of runway incursions. But if there were fewer departures in the monitored period, the result could be misleading. A more accurate and valuable performance measure would be the number of runway incursions relative to the number of movements, e.g. x incursions per 1 000 movements. A minimal calculation sketch of this normalization is shown after 4.3.1.6.

Lagging and leading indicators

4.3.1.4 The two most common categories used by States and service providers to classify their SPIs are lagging and leading. Lagging SPIs measure events that have already occurred. They are also referred to as "outcome-based SPIs" and are normally (but not always) the negative outcomes the organization is aiming to avoid. Leading SPIs measure processes and inputs being implemented to improve or maintain safety. These are also known as "activity or process SPIs" as they monitor and measure conditions that have the potential to become, or to contribute to, a specific outcome.

4.3.1.5 Lagging SPIs help the organization understand what has happened in the past and are useful for long-term trending. They can be used as high-level indicators or as indicators of specific occurrence types or locations, such as "types of accidents per aircraft type" or "specific incident types by region". Because lagging SPIs measure safety outcomes, they can measure the effectiveness of safety mitigations and are effective at validating the overall safety performance of the system. For example, monitoring the "number of ramp collisions between vehicles per number of movements following a redesign of ramp markings" provides a measure of the effectiveness of the new markings (assuming nothing else has changed). A reduction in collisions validates an improvement in the overall safety performance of the ramp system, which may be attributable to the change in question.

4.3.1.6 Trends in lagging SPIs can be analysed to determine conditions existing in the system that should be addressed. Using the previous example, an increasing trend in ramp collisions per number of movements may have been what led to the identification of the sub-standard ramp markings and to their redesign as a mitigation.
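The normalization described in 4.3.1.2 and 4.3.1.3 reduces to a simple calculation. The minimal Python sketch below applies it to the altitude-bust example in 4.3.1.2; the monthly movement counts are illustrative assumptions chosen for demonstration, not figures from this manual.

```python
def rate_per_1000(events: int, movements: int) -> float:
    """Normalize an event count to a rate per 1 000 movements (see 4.3.1.3)."""
    if movements <= 0:
        raise ValueError("movements must be positive")
    return events / movements * 1000


# Illustrative figures for the altitude-bust example in 4.3.1.2.
# The movement counts are assumptions made only for this demonstration.
monthly = {
    "July": {"altitude_busts": 3, "movements": 10_000},
    "August": {"altitude_busts": 6, "movements": 22_000},  # more than double the traffic
}

for month, data in monthly.items():
    rate = rate_per_1000(data["altitude_busts"], data["movements"])
    print(f"{month}: {data['altitude_busts']} altitude busts, "
          f"{rate:.2f} per 1 000 movements")

# July:   3 altitude busts, 0.30 per 1 000 movements
# August: 6 altitude busts, 0.27 per 1 000 movements
# The raw count doubled, but the normalized rate decreased.
```

The same helper could equally normalize the ramp-collision or runway-incursion counts mentioned above to whatever denominator is convenient (e.g. per 1 000 or per million movements).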
4.3.1.7 Lagging SPIs are divided into two types:

• Low probability/high severity: outcomes such as accidents or serious incidents. The low frequency of high-severity outcomes means that aggregation of data (at industry segment level or regional level) may result in more meaningful analyses. An example of this type of lagging SPI would be "aircraft and/or engine damage due to bird strike".

• High probability/low severity: outcomes that did not necessarily manifest themselves in a serious accident or incident; these are sometimes also referred to as precursor indicators. SPIs for high probability/low severity outcomes are primarily used to monitor specific safety issues and measure the effectiveness of existing safety risk mitigations. An example of this type of precursor SPI would be "bird radar detections", which indicates the level of bird activity rather than the number of actual bird strikes.

4.3.1.8 Aviation safety measures have historically been biased towards SPIs that reflect "low probability/high severity" outcomes. This is understandable in that accidents and serious incidents are high-profile events and are easy to count. However, from a safety performance management perspective, there are drawbacks to an over-reliance on accidents and serious incidents as indicators of safety performance. For instance, accidents and serious incidents are infrequent (there may be only one accident in a year, or none), making it difficult to perform statistical analysis to identify trends; and their absence does not necessarily indicate that the system is safe. A consequence of relying on this sort of data is a potential false sense of confidence that an organization's or system's safety performance is effective, when it may in fact be perilously close to an accident.

Figure 8. Leading vs Lagging indicator concept phases

4.3.1.9 Leading indicators are measures that focus on processes and inputs being implemented to improve or maintain safety. These are also known as "activity or process SPIs" as they monitor and measure conditions that have the potential to become, or to contribute to, a specific outcome.

4.3.1.10 Examples of leading SPIs that drive the development of organizational capabilities for proactive safety performance management include "percentage of staff who have successfully completed safety training on time" or "frequency of bird scaring activities".

4.3.1.11 Leading SPIs may also inform the organization about how its operation copes with change, including changes in its operating environment. The focus will be either on anticipating weaknesses and vulnerabilities as a result of the change or on monitoring performance after a change. An example of an SPI to monitor a change in operations would be "percentage of sites that have implemented procedure X".

4.3.1.12 For a more accurate and useful indication of safety performance, lagging SPIs measuring both "low probability/high severity" and "high probability/low severity" events should be combined with leading SPIs. Figure 8 illustrates how this combination of leading and lagging indicators provides a more comprehensive and realistic picture of the organization's safety performance.
4.3.2 Selecting and defining SPIs

4.3.2.1 SPIs are the parameters that provide the organization with a view of its safety performance: where it has been, where it is now and where it is headed in relation to safety. This picture acts as a solid and defensible foundation upon which the organization's data-driven safety decisions are made. These decisions, in turn, positively affect the organization's safety performance. The identification of SPIs should therefore be realistic, relevant and linked to the safety objectives, regardless of their simplicity or complexity.

4.3.2.2 It is likely that the initial selection of SPIs will be limited to the monitoring and measurement of parameters representing events or processes that are easy or convenient to capture (safety data that may be readily available). Ideally, however, SPIs should focus on parameters that are important indicators of safety performance, rather than on those that are simply easy to obtain.

4.3.2.3 SPIs should be:

a) related to the safety objective they aim to indicate;

b) selected or developed based on available data and reliable measurement;

c) appropriately specific and quantifiable; and

d) realistic, taking into account the possibilities and constraints of the organization.

4.3.2.4 A combination of SPIs is usually required to provide a clear indication of safety performance. There should be a clear link between lagging and leading SPIs, and ideally the lagging SPIs should be defined before the leading SPIs are determined. Defining a precursor SPI linked to a more serious event or condition (the lagging SPI) ensures there is a clear correlation between the two. All of the SPIs, lagging and leading, are equally valid and valuable. An example of these linkages is illustrated in Figure 9.

Figure 9. Examples of links between lagging and leading indicators

4.3.2.5 It is important to select SPIs that relate to the organization's safety objectives. Having SPIs that are well defined and aligned makes it easier to identify SPTs, which show the progress being made towards the attainment of safety objectives. This allows the organization to assign resources for the greatest safety effect by knowing precisely what is required, and when and how to act, to achieve the planned safety performance. For example, a State has a safety objective of "reduce the number of runway excursions by fifty per cent in three years" and an associated, well-aligned SPI of "number of runway excursions per million departures across all aerodromes". If the number of excursions drops initially when monitoring commences but starts to climb again after twelve months, the State could choose to reallocate resources away from an area where, according to the SPIs, the safety objective is being easily achieved, and towards the reduction of runway excursions, to alleviate the undesirable trend.

Defining SPIs

4.3.2.6 The definition of each SPI should include:

a) a description of what the SPI measures;

b) the purpose of the SPI (what it is intended to manage and who it is intended to inform);

c) the units of measurement and any requirements for its calculation;

d) who is responsible for collecting, validating, monitoring, reporting and acting on the SPI (these may be staff from different parts of the organization);

e) where or how the data should be collected; and

f) the frequency of reporting, collecting, monitoring and analysis of the SPI data.
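As one way of making the definition contents in 4.3.2.6 concrete, the minimal Python sketch below records an SPI definition as a simple data structure. The class, field names and example values (loosely based on the runway excursion SPI in 4.3.2.5 and the lagging/leading linkage in 4.3.2.4) are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class SPIDefinition:
    """Illustrative record of an SPI definition, mirroring items a) to f) of 4.3.2.6."""
    description: str                     # a) what the SPI measures
    purpose: str                         # b) what it is intended to manage and who it informs
    units: str                           # c) units of measurement and calculation requirements
    responsible: str                     # d) who collects, validates, monitors, reports and acts
    data_source: str                     # e) where or how the data are collected
    frequency: str                       # f) reporting, collection, monitoring and analysis cycle
    indicator_type: str = "lagging"      # lagging or leading (see 4.3.1.4)
    linked_leading_spis: List[str] = field(default_factory=list)  # linkage per 4.3.2.4 / Figure 9


# Illustrative instance loosely based on the runway excursion SPI discussed in 4.3.2.5.
runway_excursions = SPIDefinition(
    description="Number of runway excursions per million departures across all aerodromes",
    purpose="Track progress toward the runway excursion safety objective; informs senior management",
    units="excursions per 1 000 000 departures",
    responsible="Safety office (collection, validation, analysis); accountable executive (action)",
    data_source="Mandatory occurrence reports combined with movement statistics",
    frequency="Collected monthly; reviewed and reported quarterly",
    indicator_type="lagging",
    linked_leading_spis=["Unstable approaches per 1 000 arrivals"],  # hypothetical linked precursor SPI
)
```

Recording each SPI in a consistent structure of this kind makes it easier to check that every element listed in 4.3.2.6 has actually been defined before monitoring begins.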
SPIs and safety reporting

4.3.2.7 Changes in operational practices may lead to under-reporting until their impact is fully accepted by potential reporters. This is known as "reporting bias". Changes in the provisions related to the protection of safety information and related sources could also lead to over-reporting. In both cases, reporting bias may distort the intent and accuracy of the data used for the SPI. Employed judiciously, safety reporting may nevertheless provide valuable data for the management of safety performance.

4.3.3 Setting safety performance targets

4.3.3.1 Safety performance targets (SPTs) define the short-term and medium-term achievements desired of safety performance management. They act as "milestones" that provide confidence that the organization is on track to achieving its safety objectives, and they provide a measurable way of verifying the effectiveness of safety performance management activities. SPT setting should take into consideration factors such as the prevailing level of safety risk, safety risk tolerability, and expectations regarding the safety of the particular aviation sector. The setting of SPTs should be determined after considering what is realistically achievable for the associated aviation sector and the recent performance of the particular SPI, where historical trend data are available.

4.3.3.2 If the combination of safety objectives, SPIs and SPTs working together is SMART, the organization can more effectively demonstrate its safety performance. There are multiple approaches to achieving the goals of safety performance management, and especially to setting SPTs. One approach involves establishing general high-level safety objectives with aligned SPIs and then identifying reasonable levels of improvement once a baseline safety performance has been established. These levels of improvement may be based on specific targets (e.g. a percentage decrease) or on the achievement of a positive trend. Another approach, which can be used when the safety objectives are SMART, is to have the safety targets act as milestones towards achieving the safety objectives. Either of these approaches is valid, and there may be others that an organization finds effective in demonstrating its safety performance. Different approaches can be used in combination as appropriate to the specific circumstances.

Setting targets with high-level safety objectives

4.3.3.3 Targets are established with senior management agreeing on high-level safety objectives. The organization then identifies appropriate SPIs that will show improvement of safety performance towards the agreed safety objective(s). The SPIs will be measured using existing data sources but may also require the collection of additional data. The organization then starts gathering, analysing and presenting the SPIs. Trends will start to emerge, providing an overview of the organization's safety performance and whether it is steering towards or away from its safety objectives. At this point the organization can identify reasonable and achievable SPTs for each SPI.

Setting targets with SMART safety objectives

4.3.3.4 Safety objectives can be difficult to communicate and may seem challenging to achieve; by breaking them down into smaller, concrete safety targets, the process of delivering them becomes easier to manage. In this way, targets form a crucial link between strategy and day-to-day operations.
Organizations should identify the key areas that drive safety performance and establish a way to measure them. Once an organization has established its current level of performance through a safety performance baseline, it can start setting SPTs to give everyone in the organization a clear sense of what they should be aiming to achieve. The organization may also use benchmarking to support the setting of performance targets. This involves using performance information from similar organizations that have already been measuring their performance, to get a sense of how others in the community are doing.

4.3.3.5 An example of the relationship between safety objectives, SPIs and SPTs is illustrated below. In this example, the organization recorded 100 runway excursions per million movements in 2018. It has been determined that this is too many, and an objective to reduce the number of runway excursions by fifty per cent by 2022 has been set. Specific targeted actions and associated timelines have been defined to meet these targets. To monitor, measure and report progress, the organization has chosen "runway excursions per million movements per year" as the SPI. The organization is aware that progress will be more immediate and effective if specific targets are set that align with the safety objective. It has therefore set safety targets that equate to an average reduction of 12.5 excursions per million movements per year over the reporting period (four years). As shown in the graphical representation, the progress is expected to be greater in the first years and less so in the later years; this is represented by the curved projection towards the objective. In the example:

• the SMART safety objective is "50 per cent reduction in the runway excursion rate by 2022";

• the SPI selected is the "number of runway excursions per million movements per year"; and

• the safety targets related to this objective represent milestones for reaching the SMART safety objective and equate to an average reduction of 12.5 excursions per million movements per year until 2022:

  o SPT 1a is "fewer than 78 runway excursions per million movements in 2019";

  o SPT 1b is "fewer than 64 runway excursions per million movements in 2020";

  o SPT 1c is "fewer than 55 runway excursions per million movements in 2021".

A minimal calculation sketch based on this example is included at the end of this section.

Figure 10. Example SPTs with SMART safety objectives

Additional considerations for SPI and SPT selection

4.3.3.6 When selecting SPIs and SPTs, the following should also be considered:

• Workload management. Keeping the number of SPIs workable helps personnel manage their monitoring and reporting workload. The same is true of the complexity of the SPIs and the availability of the necessary data. It is better to agree on what is feasible and then prioritize the selection of SPIs on this basis. If an SPI is no longer informing safety performance, or has been given a lower priority, consider discontinuing it in favour of a more useful or higher-priority indicator.

• Optimal spread of SPIs. A combination of SPIs that encompasses the focus areas will help gain insight into the organization's overall safety performance and enable data-driven decision-making.

• Clarity of SPIs. When selecting an SPI, it should be clear what is being measured and how often. SPIs with clear definitions aid understanding of results, avoid misinterpretation, and allow meaningful comparisons over time.
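The minimal Python sketch below ties together the runway excursion example in 4.3.3.5 by checking yearly SPI measurements against the milestone SPTs. The baseline, objective and SPT values are taken from that example; the "measured" figures are hypothetical assumptions included only to show how the comparison works.

```python
# Baseline and SMART objective from the example in 4.3.3.5.
BASELINE_2018 = 100                       # runway excursions per million movements in 2018
OBJECTIVE_2022 = BASELINE_2018 * 0.5      # 50 per cent reduction by 2022

# Milestone SPTs from the example (runway excursions per million movements).
spts = {2019: 78, 2020: 64, 2021: 55, 2022: OBJECTIVE_2022}

# Hypothetical measured values, assumed here only for demonstration.
measured = {2019: 80, 2020: 63, 2021: 57}

for year, target in spts.items():
    rate = measured.get(year)
    if rate is None:
        print(f"{year}: target fewer than {target:.0f} per million movements (no data yet)")
        continue
    status = "on track" if rate < target else "behind target"
    print(f"{year}: measured {rate} vs target {target:.0f} -> {status}")
```

A comparison of this kind, repeated each reporting period, is one simple way to show whether the curved projection towards the 2022 objective is being followed or whether additional safety actions are needed.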