5.2.3 Accident and incident investigations

Annex 13 requires States to establish and maintain an accident and incident database to facilitate the effective analysis of information on actual or potential safety deficiencies and to determine any preventive actions required. State authorities responsible for the implementation of the SSP should have access to the State accident and incident database to support their safety responsibilities. Additional information on which to base preventive actions may be contained in the Final Reports on accidents and incidents that have been investigated.

5.2.4 Safety investigations by State authorities or aviation service providers

5.2.4.1 According to the provisions of Annex 13, States are required to investigate accidents, as well as serious incidents involving aircraft with a maximum mass of over 2 250 kg, which have occurred in their territory. These investigations are conducted by the State's accident investigation authority (AIA) in compliance with Annex 13. The conduct of such investigations may be delegated to another State or to a regional accident and incident investigation organization (RAIO) by mutual arrangement and consent.

5.2.4.2 Safety investigations beyond those mandated by Annex 13 are encouraged, as they provide useful safety information to support safety performance improvement. Additional information on service provider safety investigations can be found in Chapter 9.

5.2.5 Mandatory safety reporting systems

5.2.5.1 Annex 19 requires States to establish a mandatory safety reporting system that includes, but is not limited to, the reporting of incidents. The reporting systems developed by States and service providers should make it as simple as possible to access, generate and submit mandatory reports. Mandatory safety reporting systems should aim to capture all of the valuable information about an occurrence, including what happened, where and when it happened, and who was involved. In addition, mandatory safety reporting systems should provide for the capture of specific hazards which are known to contribute to accidents and whose timely identification and communication is considered valuable (e.g. routine meteorological conditions, volcanic activity, etc.).

5.2.5.2 Regardless of the scope of the mandatory reporting system(s), it is recommended that all mandatorily collected reports be protected as per the principles detailed in Chapter 7.

5.2.5.3 Mandatory occurrence reporting systems tend to collect more technical information (e.g. hardware failures) than human performance aspects. To address the need for a greater range of safety reporting, States should also implement a voluntary safety reporting system, which aims to acquire more information, such as human factors related aspects, and enhance aviation safety.

Reporting of accidents and incidents

5.2.5.4 Accident and incident reporting is relevant to every stakeholder in aviation. Operational personnel are required to report accidents and certain types of incidents as soon as possible and by the quickest means available to the State's AIA. Serious incidents must be reported; a list of examples of incidents that are likely to be serious incidents may be found in Attachment C of Annex 13.

5.2.5.5 There are two main aspects to consider when deciding whether an incident should be classified as a serious incident:

a) Were there circumstances indicating that there was a high probability of an accident?
b) Was an accident avoided only due to providence?

5.2.6 Voluntary safety reporting systems

5.2.6.1 Voluntary safety reporting systems should be established to collect safety data and safety information not captured by the mandatory safety reporting system. These reports go beyond typical incident reporting. Voluntary reports tend to illuminate latent conditions, such as inappropriate safety procedures or regulations, human error, etc. Voluntary reporting is therefore one way to identify hazards.

5.2.6.2 States should accord protection to safety data captured by, and safety information derived from, voluntary safety reporting systems and related sources. States and service providers are advised to refer to Chapter 7 for guidance on how to apply the protection to safety data, safety information and related sources. Appropriate application of the protection will ensure the continued availability of safety data and safety information. States may also need to consider means to promote voluntary reporting.

5.2.7 Sector-specific safety reporting provisions

Provisions for safety reporting systems continue to evolve. New sector-specific reporting requirements, such as those for fatigue and remotely piloted aircraft systems (RPAS), have been introduced more recently to address specific safety concerns and emerging aviation activities. Table 7 provides some examples of sector-specific reporting systems included in various Annexes, PANS and documents.

Reporting system | Reference | For State / Service provider | Year of initial adoption / approval
Aircraft accident and incident investigation reporting | Annex 13 — Aircraft Accident and Incident Investigation | State | 1951
Air traffic incident reporting | PANS-ATM (Doc 4444), Procedures for Air Navigation Services — Air Traffic Management | State and service provider | 1970
Dangerous goods accident and incident reporting | Annex 18 — The Safe Transport of Dangerous Goods by Air | State | 1981
Service difficulty reporting | Annex 8 — Airworthiness of Aircraft | State | 1982
Air traffic incident reporting | Doc 9426, Air Traffic Services Planning Manual, Part 2 | Service provider | 1984
Wildlife/bird strike reporting | Doc 9332, Manual on the ICAO Bird Strike Information System (IBIS) | Service provider | 1989
Wildlife/bird strike reporting | Annex 14 — Aerodromes, Volume I — Aerodrome Design and Operations | State and service provider | 1990
Wildlife/bird strike reporting | Doc 9137, Airport Services Manual, Part 3 — Bird Control and Reduction | State and service provider | 1991
Laser emission reporting | Doc 9815, Manual on Laser Emitters and Flight Safety | State | 2003
Fatigue reporting | Annex 6 — Operation of Aircraft, Part I — International Commercial Air Transport — Aeroplanes | Service provider | 2011
Fatigue reporting | Doc 9966, Manual for the Oversight of Fatigue Management Approaches | Service provider | 2012
Service difficulty reporting | Doc 9760, Airworthiness Manual | State | 2014
Aerodrome safety reporting | Doc 9981, Procedures for Air Navigation Services (PANS) — Aerodromes | Service provider | 2014
Remotely piloted aircraft systems (RPAS) reporting | Doc 10019, Manual on Remotely Piloted Aircraft Systems (RPAS) | Service provider | 2015
In-flight incapacitation events and medical assessment findings | Annex 1 — Personnel Licensing | State | 2016
Dangerous goods accident and incident reporting | Doc 9284, Technical Instructions for the Safe Transport of Dangerous Goods by Air | State and service provider | 2017

Table 7. Examples of sector-specific reporting systems in various Annexes, PANS and documents
5.2.8 Self-disclosure reporting systems

Service providers' self-disclosure reporting systems include automatic data capture programmes, such as flight data analysis (FDA) or flight operations quality assurance (FOQA) programmes, as well as programmes such as the aviation safety action programme (ASAP). The line operations safety audit (LOSA) and the normal operations safety survey (NOSS) are examples of programmes that capture safety data through direct observation of flight crews and air traffic controllers, respectively. All of these systems permit the recording of successful system and human performance. Please see Chapter 7 for information regarding the protection of safety data and safety information captured by self-disclosure reporting systems and their sources.

5.2.9 Results of inspections, audits or surveys

Results of interactions between State representatives and service providers, such as inspections, audits or surveys, can also be a useful input to the pool of safety data and safety information. The safety data and safety information from these interactions can also be used as evidence of the efficacy of the surveillance programme itself.

5.2.10 Optimal safety data and safety information collection

Much of the safety data and safety information used as the basis for data-driven decision-making comes from routine, everyday operations and is available from within the organization. The organization should first identify what specific question the safety data and safety information aims to answer or what problem needs to be addressed. This will help determine the appropriate resources and clarify the amount of data or information needed.

5.3 TAXONOMIES

5.3.1 Safety data should ideally be categorized using taxonomies and supporting definitions so that the data can be captured and stored using meaningful terms. Common taxonomies and definitions establish a standard language, improving the quality of information and communication. The aviation community's capacity to focus on safety issues is greatly enhanced by sharing a common language. Taxonomies enable analysis and facilitate information sharing and exchange. Some examples of taxonomies include:

• Aircraft model: the organization can build a database with all models certified to operate.

• Airport: the organization may use ICAO or International Air Transport Association (IATA) codes to identify airports.

• Type of occurrence: an organization may use taxonomies developed by ICAO and other international organizations to classify occurrences.

5.3.2 There are a number of common aviation industry taxonomies. Some examples include:

• ADREP: an occurrence category taxonomy that is part of ICAO's accident/incident data reporting (ADREP) system. It is a compilation of attributes and related values that allows safety trend analysis on these categories.

• Commercial Aviation Safety Team (CAST)/International Civil Aviation Organization (ICAO) Common Taxonomy Team (CICTT): tasked with developing common taxonomies and definitions for aircraft accident and incident reporting systems.

• Safety Performance Indicators Task Force (SPI-TF): tasked with developing globally harmonized metrics for service providers' SPIs as part of their SMS, to ensure uniformity in the collection of information and comparison of analysis results.

5.3.3 An excerpt of a taxonomy from the CICTT is provided in Table 8 as an example only.

Type of operation: Aerodrome, Air Navigation Service Provider, Air Operation, Maintenance Organization, Design & Manufacturing Organization

Activity/infrastructure/system: Regulator
Value:
• Lack of, poor or ineffective legislation and/or regulations
• Lack of or ineffective accident investigation capability
• Inadequate oversight capability

Activity/infrastructure/system: Management
Value:
• Limited or lack of management commitment (management does not demonstrate support for the activity)
• Lack of or incomplete description of roles, accountabilities and responsibilities
• Limited or lack of resource availability or planning, including staffing
• Lack of or ineffective policies
• Incorrect or incomplete procedures, including instructions
• Lack of or poor management and labour relationships
• Lack of or ineffective organizational structure
• Poor organizational safety culture
• Lack of or ineffective audit procedures
• Lack of or limited resource allocation

Table 8. Example of a typical taxonomy
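As an illustration only, the following Python sketch shows how a common occurrence taxonomy can be encoded so that free-text report entries are captured against standard terms, which is what later makes aggregation, analysis and information exchange straightforward. The category codes, aliases and function names are illustrative assumptions for this example, not an ICAO-defined schema.

```python
from enum import Enum
from typing import Optional


class OccurrenceCategory(Enum):
    """Illustrative occurrence categories; the codes shown are examples only."""
    RUNWAY_EXCURSION = "RE"
    LOSS_OF_CONTROL_IN_FLIGHT = "LOC-I"
    CONTROLLED_FLIGHT_INTO_TERRAIN = "CFIT"
    BIRD_STRIKE = "BIRD"


# Aliases map the free-text terms reporters actually use onto the common taxonomy,
# so that each concept is stored under a single meaningful term.
ALIASES = {
    "runway overrun": OccurrenceCategory.RUNWAY_EXCURSION,
    "veer-off": OccurrenceCategory.RUNWAY_EXCURSION,
    "bird hit": OccurrenceCategory.BIRD_STRIKE,
}


def classify(raw_term: str) -> Optional[OccurrenceCategory]:
    """Return the standard category for a reported term, or None if it is unknown."""
    term = raw_term.strip().lower()
    if term in ALIASES:
        return ALIASES[term]
    for category in OccurrenceCategory:
        if term == category.value.lower():
            return category
    return None  # unknown terms should be reviewed and, if appropriate, added to the taxonomy


print(classify("Runway overrun"))  # OccurrenceCategory.RUNWAY_EXCURSION
print(classify("loc-i"))           # OccurrenceCategory.LOSS_OF_CONTROL_IN_FLIGHT
print(classify("meteor impact"))   # None -> candidate for taxonomy review
```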
5.3.4 Hazard taxonomies are especially important. Identification of a hazard is often the first step in the risk management process. Commencing with a commonly recognized language makes the safety data more meaningful, easier to classify and simpler to process. The structure of a hazard taxonomy may include a generic and a specific component.

5.3.5 The generic component allows users to capture the nature of a hazard with a view to aiding its identification, analysis and coding. A high-level taxonomy of hazards has been developed by the CICTT which classifies hazards into families of hazard types (Environmental, Technical, Organizational and Human).

5.3.6 The specific component adds precision to the hazard definition and context. This enables more detailed risk management processing. The following criteria may be helpful when formulating hazard definitions. When naming a hazard, it should:

a) be clearly identifiable;

b) be described in the desired (controlled) state;

c) use accepted names;

d) use non-judgmental adjectives (avoiding terms like "poor" or "deficient"); and

e) avoid negative meanings or descriptions of absence (e.g. "lack of").

5.3.7 Common taxonomies may not always be available between databases. In such cases, data mapping should be used to allow the standardization of safety data and safety information based on equivalency. Using an aircraft type example, a mapping of the data could show that a "Boeing 787-8" in one database is equivalent to a "788" in another. This may not be a straightforward process, as the level of detail during safety data and safety information capture may differ. Most SDCPS will be configured to assist with the standardization of data capture, easing the burden of data mapping.

5.4 SAFETY DATA PROCESSING

Safety data processing refers to the manipulation of safety data to produce meaningful safety information in useful forms such as diagrams, reports or tables. There are a number of important considerations related to safety data processing, including data quality, aggregation, fusion and filtering.

5.4.1 Data quality

5.4.1.1 Data quality relates to data that is clean and fit for purpose. Data quality involves the following aspects:

a) cleanliness;

b) relevance;

c) timeliness; and

d) accuracy and correctness.
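As a brief illustration of the quality aspects listed above, the following Python sketch screens a hypothetical occurrence record for completeness, timeliness and accuracy, and normalizes its aircraft type field using an equivalence map of the kind described in 5.3.7. The field names, dates and equivalence values are illustrative assumptions rather than prescribed content.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative equivalence map between two databases' aircraft-type values
# (cf. the "Boeing 787-8" / "788" example in 5.3.7).
AIRCRAFT_TYPE_MAP = {"Boeing 787-8": "788", "Airbus A320-214": "320"}

REQUIRED_FIELDS = ("occurrence_date", "location", "aircraft_type", "narrative")


def normalize_aircraft_type(value: str) -> str:
    """Map a source-database aircraft type onto the equivalent term used in the target database."""
    return AIRCRAFT_TYPE_MAP.get(value, value)


def quality_issues(record: dict, today: Optional[date] = None) -> list[str]:
    """Return a list of data quality findings for a single record (an empty list means clean)."""
    today = today or date.today()
    issues = []
    # Cleanliness/completeness: every required field must be present and non-empty.
    for field_name in REQUIRED_FIELDS:
        if not record.get(field_name):
            issues.append(f"missing field: {field_name}")
    occurred = record.get("occurrence_date")
    # Accuracy: a date in the future is clearly an entry error.
    if occurred and occurred > today:
        issues.append("occurrence_date is in the future")
    # Timeliness: flag data that may be too old to reflect the current situation.
    if occurred and occurred < today - timedelta(days=2 * 365):
        issues.append("record older than two years; review relevance")
    return issues


record = {
    "occurrence_date": date(2018, 3, 2),
    "location": "YUDO",
    "aircraft_type": "Boeing 787-8",
    "narrative": "Rejected take-off due to bird strike.",
}
record["aircraft_type"] = normalize_aircraft_type(record["aircraft_type"])
print(record["aircraft_type"])                          # "788"
print(quality_issues(record, today=date(2018, 6, 1)))   # []
```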
5.4.1.2 Data cleansing is the process of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table or database. It involves identifying incomplete, incorrect, inaccurate or irrelevant parts of the data and then replacing, modifying or deleting the dirty or coarse data.

5.4.1.3 Relevant data is data which meets the organization's needs and represents its most important issues. An organization should assess the relevance of data based on its needs and activities.

5.4.1.4 The timeliness of safety data and safety information is a function of its currency. Data used for decisions should reflect what is happening as close to real time as possible. Judgement is often required, based on the volatility of the situation. For example, data collected two years ago on an aircraft type still operating the same route, with no significant changes, may provide a timely reflection of the situation, whereas data collected one week ago on an aircraft type no longer in service may not provide a meaningful, timely reflection of the current reality.

5.4.1.5 Data accuracy refers to values that are correct and reflect the given scenario as described. Data inaccuracy commonly occurs when users enter the wrong value or make a typographical error. This problem can be mitigated by having skilled and trained data entry personnel or by application features such as spell checking. Data values can also become inaccurate over time, a phenomenon known as "data decay". Movement is another cause of inaccurate data: as data is extracted, transformed and moved from one database to another, it may be altered to some extent, especially if the software is not robust.

5.4.2 Aggregation of safety data and safety information

Data aggregation is when safety data and safety information are gathered and stored in the organization's SDCPS and expressed in a summary form for analysis. To aggregate safety data and safety information is to collect them together, resulting in a larger data set. In the case of an SDCPS, individual items of safety data are aggregated into a database without giving one piece of safety data precedence over another. A common purpose of aggregation is to obtain information about a particular group or type of activity based on specific variables such as location, fleet type or professional group. Aggregation across multiple organizations or regions can also be helpful when individual data sets are not large enough to ensure proper de-identification, protecting the sources of the safety data and safety information, and to support analysis.

5.4.3 Data fusion

Data fusion is the process of merging multiple safety data sets to produce safety data that is more coherent, linked and useful than that provided by any individual data set. The integration of safety data sets, followed by their reduction or replacement, improves the reliability and usability of the data. For example, data from the FDA systems of air operators could be merged with meteorological data and radar data to obtain a more useful data set for further processing.

5.4.4 Filtering of safety data and safety information

Safety data filtering refers to a wide range of strategies or solutions for refining safety data sets. This means the data sets are refined to contain only what the decision maker needs, without including other data that may be repetitive, irrelevant or even sensitive. Different types of data filters can be used to generate reports, query results or other means of communicating results.
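The sketch below, using the pandas library with hypothetical column names and fabricated sample values, illustrates the three processing steps just described: fusing FDA-style event data with meteorological data, aggregating event counts by aerodrome and fleet, and filtering the result down to the specific question a decision maker has asked.

```python
import pandas as pd

# Hypothetical FDA events and meteorological observations (fabricated sample values).
fda_events = pd.DataFrame({
    "flight_id": ["A1", "A2", "A3", "A4"],
    "aerodrome": ["YUDO", "YUDO", "YBBB", "YBBB"],
    "fleet": ["788", "788", "320", "788"],
    "event": ["unstable_approach", "hard_landing", "unstable_approach", "unstable_approach"],
})
weather = pd.DataFrame({
    "flight_id": ["A1", "A2", "A3", "A4"],
    "crosswind_kt": [18, 5, 22, 7],
})

# Data fusion: merge the two data sets on a common key to obtain a richer, linked data set.
fused = fda_events.merge(weather, on="flight_id", how="left")

# Aggregation: summarize event counts by aerodrome, fleet and event type.
summary = (
    fused.groupby(["aerodrome", "fleet", "event"])
         .size()
         .reset_index(name="count")
)

# Filtering: keep only what the decision maker needs, e.g. unstable approaches
# flown in significant crosswind conditions.
filtered = fused[(fused["event"] == "unstable_approach") & (fused["crosswind_kt"] >= 15)]

print(summary)
print(filtered[["flight_id", "aerodrome", "crosswind_kt"]])
```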
5.5 SAFETY DATA AND SAFETY INFORMATION MANAGEMENT

5.5.1 Safety data and safety information management can be defined as the development, execution and supervision of plans, policies, programmes and practices that ensure the overall integrity, availability, usability and protection of the safety data and safety information used by the organization.

5.5.2 Safety data and safety information management that addresses the necessary functions will ensure that the organization's safety data and safety information are collected, stored, analysed, retained and archived, and that they are governed, protected and shared, as intended. Specifically, it should identify:

a) what data will be collected;

b) data definitions, taxonomy and formats;

c) how the data will be collected, collated and integrated with other safety data and safety information resources;

d) how the safety data and safety information will be stored, archived and backed up (for example, the database structure and, if an IT system, the supporting architecture);

e) how the safety data and safety information will be used;

f) how the information is to be shared and exchanged with other parties;

g) how the safety data and safety information will be protected, specific to the safety data and safety information type and source; and

h) how quality will be measured and maintained.

5.5.3 Without clearly defined processes to produce safety information, an organization cannot achieve the defensible, reliable and consistent information upon which data-driven decisions can be confidently made.

5.5.4 Data governance

Data governance is the authority, control and decision-making over the processes and procedures that support an organization's data management activities. It dictates how safety data and safety information are collected, analysed, used, shared and protected. Data governance ensures that the data management system(s) has the desired effect through the key characteristics of integrity, availability, usability and protection, as described below.

Integrity: Data integrity refers to the reliability of the data and of the resources, information and events it describes. It includes the maintenance and assurance of the accuracy and consistency of data over its entire life cycle. This is a critical aspect of the design, implementation and use of an SDCPS when storing, processing or retrieving data.

Availability: It should be clear who has permission to use or share the stored safety data and safety information, taking into account the agreement between the data/information owner and the custodian. For the entities that are allowed to use the data, it should be clear how to gain access to it and how to process it. A variety of techniques exist to maximize data availability, including redundancy of storage locations and of data access methods and tools.

Usability: To maximize the return on safety data and safety information, it is important to also consider usability standards. Humans continuously interact and engage with safety data and safety information as they are acquired. Organizations should minimize human error as automation applications are applied. Tools which can increase usability include data dictionaries and metadata repositories. As human interaction evolves towards big data applications and machine learning, it will become increasingly important to understand how usability applies to these tools in order to minimize errors in the handling of safety data and safety information in the future.

Protection: States should ensure that safety data, safety information and related sources are afforded appropriate protection. For more information, refer to Chapter 7.
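As an illustration only, the sketch below records the elements listed in 5.5.2 as a simple Python dictionary and checks that each one has been addressed. The keys and example values are assumptions describing a hypothetical voluntary reporting database, not prescribed content.

```python
# Elements a) to h) from 5.5.2, expressed as keys a data management plan should address.
REQUIRED_ELEMENTS = (
    "data_collected", "definitions_and_taxonomy", "collection_and_integration",
    "storage_and_backup", "use", "sharing_and_exchange", "protection", "quality_assurance",
)

# A hypothetical plan for a voluntary safety reporting database (example values only).
plan = {
    "data_collected": "voluntary safety reports",
    "definitions_and_taxonomy": "internal occurrence taxonomy, version 2",
    "collection_and_integration": "web form, merged nightly with the occurrence database",
    "storage_and_backup": "relational database, daily off-site backup, five-year retention",
    "use": "hazard identification and trend monitoring",
    "sharing_and_exchange": "de-identified summaries shared with the regulator",
    "protection": "de-identified at intake; access restricted to the safety office",
    "quality_assurance": "monthly completeness and accuracy checks",
}


def missing_elements(p: dict) -> list[str]:
    """Return the plan elements that are absent or left empty."""
    return [key for key in REQUIRED_ELEMENTS if not p.get(key)]


print(missing_elements(plan))  # an empty list means every element of 5.5.2 is addressed
```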
5.5.5 Metadata management

5.5.5.1 Metadata is defined as a set of data that describes and gives information about other data; in other words, data about data. Using metadata standards provides a common meaning or definition of the data. It ensures proper use and interpretation by owners and users, and that data can easily be retrieved for analysis.

5.5.5.2 It is important that organizations catalogue their data based on its properties, including but not limited to:

a) what the data is;

b) where it comes from (the original source);

c) who created it;

d) when it was created;

e) who has used it;

f) what it was used for;

g) the frequency of collection; and

h) any processing or transformation applied.

5.5.5.3 Metadata provides a common understanding of what the data is and ensures correct use and interpretation by its owners and users. It can also identify errors in data collection, which leads to continuous improvement of the programme.
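A minimal sketch of how the catalogue properties in 5.5.5.2 might be captured for each data set follows; the class and field names are illustrative assumptions rather than a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class DatasetMetadata:
    """Metadata record describing a safety data set ("data about data")."""
    name: str                      # a) what the data is
    source: str                    # b) where it comes from
    created_by: str                # c) who created it
    created_on: date               # d) when it was created
    used_by: list[str]             # e) who has used it
    purpose: str                   # f) what it was used for
    collection_frequency: str      # g) frequency of collection
    transformations: list[str] = field(default_factory=list)  # h) processing or transformation applied


# Hypothetical catalogue entry for a mandatory occurrence reporting database.
catalogue_entry = DatasetMetadata(
    name="Mandatory occurrence reports 2018",
    source="national occurrence reporting portal",
    created_by="safety data unit",
    created_on=date(2018, 1, 1),
    used_by=["safety analysis team"],
    purpose="annual safety trend review",
    collection_frequency="continuous",
    transformations=["de-identified", "mapped to common occurrence taxonomy"],
)
print(catalogue_entry.name, catalogue_entry.collection_frequency)
```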
Chapter 6

SAFETY ANALYSIS

6.1 INTRODUCTION

6.1.1 Safety analysis is the process of applying statistical or other analytical techniques to check, examine, describe, transform, condense, evaluate and visualize safety data and safety information in order to discover useful information, suggest conclusions and support data-driven decision-making. Analysis helps organizations to generate actionable safety information in the form of statistics, graphs, maps, dashboards and presentations. Safety analysis is especially valuable for large and/or mature organizations with rich safety data. Safety analysis relies on the simultaneous application of statistics, computing and operations research. The result of a safety analysis should present the safety situation in ways that enable decision makers to make data-driven safety decisions.

6.1.2 States are required to establish and maintain a process to analyse the safety data and safety information from the SDCPS and associated safety databases. One of the objectives of safety data and safety information analysis at the State level is the identification of systemic and cross-cutting hazards that might not otherwise be identified by the safety data analysis processes of individual service providers.

6.1.3 Safety analysis may be a new function that the State or service provider needs to establish. It should be noted that the competencies required to conduct effective safety analysis might be outside the purview of a traditional safety inspector. States and service providers should consider the skills necessary to analyse safety information and decide whether this role, with appropriate training, should be an extension of an existing position, whether it would be more efficient to engage the skills separately or outsource the role altogether, or whether a hybrid of these approaches is preferable. The decision will be driven by the plans and circumstances of each State or service provider.

6.1.4 In parallel with the human resourcing considerations, there should be an analysis of the existing software and of the business and decision-making policies and processes. To be effective, safety analysis should be integrated with the organization's existing core tools, policies and processes. Once amalgamated, the ongoing development of safety intelligence should be seamless and part of the organization's usual business practice.

6.1.5 Safety data and safety information analysis can be conducted in many ways, some requiring more robust data and analytic capabilities than others. The use of suitable tools for the analysis of safety data and safety information provides a more accurate understanding of the overall situation by examining the data in ways that reveal the relationships, connections, patterns and trends that exist within it.

6.1.6 An organization with a mature analysis capability is better able to:

a) establish effective safety metrics;
b) establish safety presentation capabilities (e.g. a safety dashboard) for ready interpretation of safety information by decision makers;

c) monitor the safety performance of a given sector, organization, system or process;

d) highlight safety trends and safety targets;

e) alert safety decision makers based on safety triggers;

f) identify factors that cause change;