Association of State Dam Safety Officials / Federal Emergency Management Agency
In Association with the US Society on Dams Working Group on Dam Safety Risk Assessment, Committee on Dam Safety

Specialty Workshop on Risk Assessment for Dams
June 2001

Hosted and Organized by the Institute for Dam Safety Risk Management, Utah State University

Summary of Workshop Findings [1]

Failure Modes Identification (FMI) Approaches:
1) Failure Modes Identification, which is an early step in performing a risk assessment, should also be standard practice for traditional standards-based approaches to dam safety evaluation and design.
2) Failure Modes Identification provides a more comprehensive safety evaluation of a dam and a basis for strengthening many aspects of a dam safety program (e.g. instrumented and visual monitoring, emergency preparedness planning, O&M, etc.).
3) Guidance is urgently needed for performing Failure Modes Identification.
4) Users must recognize that Failure Modes Identification is a qualitative approach and not a decision tool.

Portfolio Risk Assessment (PRA) Approaches:
1) PRA is a valuable and increasingly accepted approach for cost effectively prioritizing dam safety remedial measures and further investigations for a group of dams.
2) It provides insights that can better inform owners about the business and liability implications of dam ownership.
3) PRA outcomes must be used with regard for the limitations of the approach and should be periodically updated.

Index Prioritization Approaches:
1) Index approaches are a valuable and increasingly utilized approach for prioritizing dam safety issues and investigations, but should be calibrated and must incorporate a risk metric to be considered risk-based.
2) They are generally less costly to use than PRA, but are more limited in the scope of their outcomes.

Detailed Quantitative Risk Assessment (QRA) Approaches:
1) Detailed QRA approaches are valuable for providing insights and understanding of failure modes and associated risks (probability and consequences) for stakeholders.
2) Uncertainties in inputs and outcomes must be taken into account.
3) Improved approaches to estimation of probabilities and consequences are needed.
4) Acceptable/tolerable risk criteria need development and are yet to gain widespread acceptance.
5) Stakeholders must decide on issues of appropriate use and defensibility.

[1] Developed in Consolidation Session of Workshop and revised by USSD Working Group, July 10, 2000.
Table of Contents

1.0 Background and Purpose of Workshop
1.1 Sponsorship
1.2 Purpose
1.3 Use of the Term "Risk Assessment"
1.4 Workshop Format
1.5 Report Purpose and Outline
2.0 Outline of Workshop Methodology
2.1 Introduction
2.2 Workshop Activities
2.2.1 Introductory session
2.2.2 Review of the state-of-the-practice of dam safety risk assessment
2.2.3 Identification of research needs
2.2.4 Recommendation of approaches for addressing needs
2.2.5 Summary of findings
2.3 Strategic Planning Process
2.3.1 The input phase
2.3.2 The research category identification phase
2.3.3 The research category prioritization phase
2.3.4 The research proposal development phase
3.0 Information Needs for Dam Safety Evaluation and Management
3.1 Introduction
3.2 Government Owner Information Needs - John Smart, USBR, Denver, Colorado
3.3 Large Private Owner – David Bowles, Utah State University/RAC Engineers & Economists
3.3.1 Regulatory environment
3.3.2 Commercial context for dam safety decisions
3.3.3 Risk treatment options
3.3.4 Outcome targeting
3.3.5 Investment drivers – Information needs
3.4 Small Private Owner Information Needs – Jim Doane, Bureau of Water Works, Portland, Oregon
3.4.1 Discussion
3.4.2 What are the information needs of the small dam owner?
3.4.3 Presentation notes
3.5 Federal Regulator Information Needs - Dan Mahoney, FERC, Washington, D.C.
3.6 State Regulator Information Needs
3.6.1 A state dam safety regulator's perspective - Stephen Verigin, California Division of Safety of Dams, Sacramento, California
3.6.2 Another state dam safety regulator's perspective – Doug Johnson, State of Washington, Olympia, Washington
3.7 Consulting Engineer Information Needs - John W. France, URS Corporation, Denver, Colorado
4.0 Assessment of State of the Practice
4.1 Introduction
4.2 Failure Modes Identification Approaches (Qualitative Approaches)
4.2.1 Strengths
4.2.2 Limitations
4.3 Index Prioritization Approaches
4.3.1 Strengths
4.3.2 Limitations
4.4 Portfolio Risk Assessment Approaches
4.4.1 Strengths
4.4.2 Limitations
4.5 Detailed Quantitative Risk Assessment Approaches
4.5.1 Strengths
4.5.2 Limitations
5.0 Technology Transfer and Training Needs
6.0 Research and Development Needs
6.1 Introduction
6.2 Low Hanging Fruit - Easy and Important
6.2.1 Priority 1 – (7, 18, 19) Prioritization and portfolio tools (F)
6.2.2 Priority 2 – (13) Database of failure case histories (K)
6.3 Strategic Plan - Hard and Important
6.3.1 Priority 3 – (2, 6) Tolerable risk/criteria (B)
6.3.2 Priority 4 – (15) Flood loading (M)
6.3.3 Priority 5 – (8) Earthquake response (G)
6.3.4 Priority 6 – (10, 21) Improve loss of life estimates (I)
6.3.5 Priority 7 – (12) Risk communication (J)
6.3.6 Priority 8 – (3) Subjective probability (C)
6.4 Do Later - Easy but Less Important
6.4.1 Priority 9 – (5) Uncertainty (E)
6.4.2 Priority 10 – (16) Risk process (N)
6.4.3 Priority 11 – (4) Skills to identify failure modes (D)
6.4.4 Priority 12 – (1) Standards (A)
6.4.5 Priority 13 – (9) Static response (H)
6.4.6 Priority 14 - Portfolio - Learn to improve (S)
6.5 Consider - Hard and Less Important
6.5.1 Priority 15 - Earthquake loading (L)
6.6 Research Proposals
7.0 Integrated Approach to Meeting Research Needs
8.0 References

APPENDICES
Appendix A. Workshop Agenda
Appendix B. List of Participants
Appendix C. List of Handouts
Appendix D. Participants Expectations and Issues
Appendix E. Participant Input on Information Needs for Dam Safety Evaluation and Management
Appendix F. Participant Input on Failure Modes Identification (Qualitative Approaches)
Appendix G. Participant Input on Portfolio and Index Approaches (Prioritization and Portfolio Approaches)
Appendix H. Participant Input on Quantitative Approaches
Appendix I. Sorted Participant Input on Strengths and Limitations of the State of the Practice
Appendix J. Participant Voting on Technology Transfer and Training Needs
Appendix K. Participant Input on Research and Development Needs Categories

1.0 Background and Purpose of Workshop

1.1 Sponsorship

The ASDSO/FEMA Specialty Workshop on Risk Assessment for Dams was held March 7 – 9, 2000 at Utah State University (USU), Logan, Utah. The workshop was one of a series of Dam Safety Research Workshops, which are funded by the FEMA National Dam Safety Program Act (NDSPA, P.L. 104-303). ASDSO was the contractor to FEMA. Through the Institute for Dam Safety Risk Management, USU subcontracted to the ASDSO to host and organize the workshop. The ASDSO established a Steering Committee chaired by Doug Johnson, Supervisor, Dam Safety, State of Washington and an ASDSO Board Member. The workshop was linked to the Working Group on Risk Assessment of the USSD (formerly USCOLD) Committee on Dam Safety. This linkage was through the participation of Working Group members in the workshop, and through the use of the workshop to develop the basis for a USSD White Paper on Dam Safety Risk Assessment.

1.2 Purpose

The purpose of the workshop was as follows: To conduct a review of the state-of-the-practice of dam safety risk assessment, to identify research needs, and to recommend an approach for addressing these needs. For the purposes of the workshop, we interpreted "state-of-the-practice" to include only approaches that are currently being used (i.e. in practice) by dam owners and their engineers to provide inputs for dam safety decisions. We did not limit the types of decisions to only the selection of a target level of safety for an existing dam or a proposed remedial measure. Instead, we included any type of decision that affects any aspect of dam safety, including monitoring and instrumentation, reservoir operating level, investigations, and emergency preparedness planning. By "research needs" we understood the interest of the National Dam Safety Program to encompass both short-term (i.e. immediate) and long-term research and development needs, including technology transfer needs. These may include such areas as the following: a vision for the future of applications of risk assessment to dam safety, training in its application, and tools to facilitate its application by practitioners. Identified research needs were to be passed on to the ICODS Research Subcommittee for their consideration in recommending the use of FEMA National Dam Safety Program Act funds for research projects.

A group of experienced dam safety professionals was invited to participate in the workshop. The group was drawn from a broad cross-section of employment affiliations and included a mixture of those with and without risk assessment experience. The workshop was not intended to be a gathering of only those with expertise in dam safety risk assessment. Nor was it intended to be an opportunity to cross-fertilize risk assessment practice from other fields into the dam safety field, as some have suggested. While these are worthwhile objectives, it was not possible to combine them with the objectives established by FEMA.
Future workshops should be considered to pursue these purposes.

At the outset of the workshop, we recognized that different information needs can exist for different stakeholders in any given dam safety decision. Thus, information that may play an essential role in an owner's decision-making process may not be needed at all by a regulator who oversees the owner's decision outcomes. Since the information needs of different organizations can vary widely, we recognized that it would be unrealistic to expect that any single approach to risk assessment would meet the needs of all organizations. Therefore, an introductory workshop session was devoted to identifying "Information needs for dam safety evaluation and management" for the following six types of organizations: the government owner, the large private owner, the small private owner, the federal regulator, the state regulator, and the consulting engineer. The outcomes of this session were used to form a broad basis for evaluating the strengths and limitations of a range of risk assessment approaches and for identifying research needs. Thus, the workshop did not recommend one particular method of risk assessment for all dam safety organizations.

In addition to this report to FEMA, major products from the workshop have included a ring binder containing copies of all presentations and other handouts provided to participants (listed in Appendix C), a bibliography, and the USSD White Paper on Dam Safety Risk Assessment. A draft of the Summary of Findings was distributed at the USCOLD Annual Lecture in Seattle in June 2000 and was presented at the ICOLD 2000 Congress in Beijing. A summary document containing the Summary of Findings and the priorities for technology transfer and research and development was provided to the ICODS Research Subcommittee for its July 2000 meeting. A panel presentation of workshop findings was included at the USCOLD 2000 Annual Lecture and the ASDSO 2001 Annual Conference.

1.3 Use of the Term "Risk Assessment"

The term "Risk Assessment" appears in the title of this workshop. It is a term that does not have a universally accepted meaning and is frequently misused. Below we define this term and several others that are needed to appreciate the format of the workshop. Most of these definitions are taken from a draft of the ICOLD Bulletin on risk assessment (Version 10, August 2000). Their use does not imply any endorsement of the draft bulletin by the workshop participants, organizers, or sponsors. Their interrelationship is illustrated in Figure 1.1.

[Figure 1.1 (not reproduced here) is a diagram showing how Failure Modes Identification, Risk Analysis, Risk Estimation, Risk Evaluation, Risk Assessment, and Risk Control (structural measures, recurrent activities, reassessment) fit together as components of Dam Safety Risk Management.]
Figure 1.1. Interrelationship between components of risk assessment and risk management (Bowles et al 1999)

Failure Modes Identification: A procedure by which potential failure modes in a technical system are identified.

Risk: A measure of the likelihood and severity of adverse consequences (National Research Council 1983). Risk is estimated by the mathematical expectation of the consequences of an adverse event occurring (i.e. the product of the probability of occurrence and the consequence) or, alternatively, by the triplet of scenario, probability of occurrence and the consequence (ICOLD 2000).
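The ICOLD definition of risk above can be restated compactly. The notation below is illustrative only and is not taken from the workshop handouts; it simply writes the "mathematical expectation of the consequences" and the scenario triplet in symbols, for a set of n failure scenarios with probabilities of occurrence p_i and consequences c_i.

```latex
% Risk as the expectation of consequences over failure scenarios s_i,
% each with probability of occurrence p_i and consequence c_i:
R = \sum_{i=1}^{n} p_i \, c_i
% or, keeping the scenarios explicit, as the set of triplets:
\text{Risk} = \{\, \langle s_i,\; p_i,\; c_i \rangle : i = 1, \dots, n \,\}
```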
Risk Analysis: The use of available information to estimate the risk to individuals or populations, property or the environment, from hazards. Risk analyses generally contain the following steps: scope definition, hazard identification, and risk estimation (ICOLD 2000).

Risk Assessment: The process of deciding whether existing risks are tolerable and present risk control measures are adequate and, if not, whether alternative risk control measures are justified. Risk assessment incorporates the risk analysis and risk evaluation phases (ICOLD 2000).

Risk Control: The implementation and enforcement of actions to control risk, and the periodic re-evaluation of the effectiveness of these actions (ICOLD 2000).

Risk Estimation: The process of quantifying the probability and consequences components of risk.

Risk Evaluation: The process of examining and judging the significance of risk (ICOLD 2000).

Risk Identification: The process of determining what can go wrong, why and how (ICOLD 2000).

Risk Management: The systematic application of management policies, procedures and practices to the tasks of identifying, analyzing, assessing, treating and monitoring risk (ICOLD 2000).

When we use the term "risk assessment" in this report, it refers to a process that includes at least one of the components that make up the overall process of risk assessment (see Figure 1.1). For example, in the next section we mention that approximately one half of the workshop participants were known to have some experience with applying risk assessment to dams. That does not mean that each experienced participant has used all the component processes that comprise risk assessment in Figure 1.1. Some may only have experience with one component process, such as failure modes identification.

1.4 Workshop Format

Workshop participants came mainly from the US dam engineering community, but included two representatives from Australia and four from Canada. The 32 participants included four state regulators, two federal regulators, five large private owners, one local government owner, four federal government owners, three industry association representatives, eight consulting engineers, and five academics with significant consulting experience. Just over one half of the participants were known to have some level of experience with applying risk assessment to dam safety problems. The workshop organizing group comprised the following: David Achterberg (USBR and ICODS Research Subcommittee), Doug Johnson (State of Washington and ASDSO), Dan Mahoney (FERC and ASCE Task Committee on Risk Assessment of Dams and Hydroelectric Facilities), Lori Spragens (ASDSO), and David Bowles, Chair (Utah State University/RAC Engineers & Economists).

In preparing the workshop agenda, the organizing group recognized that although the primary purpose of the workshop was not training, it would be necessary to provide some presentations of the current state-of-the-practice, especially for the benefit of those with limited or no risk assessment experience. This review also provided an important basis for identifying those areas in which research and development is needed to strengthen the current state-of-the-practice. The workshop agenda is presented in Appendix A.
It included presentations and facilitated consensus building sessions for the following three areas of risk assessment applications: • Failure Modes Identification (referred to as “Qualitative Approaches” in the agenda) • Portfolio Risk Assessment and Index Prioritization Approaches (referred to as “Prioritization and Portfolio Approaches” in the agenda) • Detailed Quantitative Approaches (referred to as “Quantitative Approaches” in the agenda) The organizing group divided applications into these three areas based on the observation that the degree of acceptance of risk assessment approaches seemed to be markedly different in each area. In the consolidation session, at the end of the workshop, it was agreed to further divide Portfolio Risk Assessment and Index Prioritization into two approaches because it was recognized that although they shared some common attributes they had significantly different scopes and some differing strengths and limitations. Thus, this report presents the assessment of the state-of-the-practice and research needs for four risk assessment applications areas. Dr. David Harris of the USBR served as the Workshop Facilitator. Overall outcomes of the workshop were consolidated into prioritized technology transfer and training needs and research and development needs to be provided to FEMA and the ICODS Research Subcommittee. An additional consolidation session was held to discuss the use of workshop outcomes in the USSD White Paper. Most participants were provided electronic or hard copies of the following documents: • Guidelines for Dam Safety Risk Management, Dam Safety Interest Group of the Canadian Electricity Association, Interim issue of Part 1 of a four part document. • A Guide to Risk Management for UK Reservoirs, Construction Industry Research Information Association (CIRIA), Draft 3, October 1999. • Dam Safety Risk Analysis Methodology, Technical Services Center, USBR, Version 3.3, September 1999. • Reducing Risks, Protecting People, UK Health and Safety Executive, 1999 Draft Version. • Risk Assessment as an Aid to Dam Safety Management, Draft ICOLD Bulletin, 1999. 4 In addition, a bibliography was developed by USU and distributed at the workshop. 1.5 Report Purpose and Outline The purpose of this report is to document the purpose, methodology and outcomes of the Specialty Workshop. This report is not intended to include any commentary on the findings reached. The USSD White Paper will be the forum for such commentary. This report is divided into seven chapters and eleven appendices. Section 2.0 contains a summary of the methodology that was used to achieve the workshop outcomes specified in the workshop purpose. Section 3.0 summarizes the information needs that were identified by speakers and participants. Workshop outcomes are summarized in Sections 4.0 – 6.0. The assessment of the strengths and weaknesses of the four major areas of current practice is presented in Section 4.0. Prioritized technology transfer and training needs are presented in Section 5.0. Prioritized research needs are presented in Section 6.0. Section 7.0 proposes an integrated approach comprising twelve overall research projects that address both the technology transfer and training and the research and development needs. Appendices A, B and C contain the workshop agenda, list of participants, and list of handouts, respectively. Appendix D contains participant input on expectations and issues for the workshop. 
Appendix E contains participant input on information needs for dam safety evaluation and management. Appendices F, G and H contain participant input on failure modes identification (qualitative approaches), portfolio and index approaches (prioritization and portfolio approaches), and quantitative approaches, respectively. Appendix I contains sorted participant input on strengths and limitations for each risk assessment application area. Appendix J contains participant voting on technology transfer and training needs and Appendix K contains participant input on research and development needs categories. 5 2.0 Outline of Workshop Methodology 2.1 Introduction The stated workshop purpose (see Section 1.1) can be divided into three parts, as follows: 1) To conduct a review of the state-of-the-practice of dam safety risk assessment 2) To identify research needs 3) To recommend an approach for addressing these needs Workshop products in each of these areas were developed through a coordinated set of workshop activities, which are summarized in Section 2.2. These activities included presentations, discussions, obtaining participant inputs, consensus categorization of inputs into Research and Development (R&D) needs and Technology Transfer & Training (T3) needs, voting on the importance and difficulty of each category, and development of research proposals. Underlying the workshop activities was a strategic planning process, which is summarized in Section 2.3. The interrelationship between workshop activities and the strategic planning process is represented schematically in Figure 2.1. Presentations Participant Inputs 1) Introductory Session Workshop Objectives 1.2 & 2.1 Expectations App. D Issues App. D Information Needs 3.0 Information Needs App. E 2) - 4) State of the Practice Failure Modes Identification Index Prioritization Approaches Portfolio Risk Assessment Detailed Quantitative Approaches Risk Assessment Application Areas: 1) Strengths 2) Limitations 3) T3 Needs 4) R&D Needs App. F, G & H Workshop Outcomes Assessment of Strengths & Limitations 4.0 & App. I 5) Consolidation of Outcomes Technology Transfer & Training Needs 5.0, 7.0 & App. J Research & Development Needs 6.0, 7.0 & App. K Figure 2.1. Overall interrelationship between workshop activities and strategic planning process. 6 2.2 Workshop Activities 2.2.1 Introductory session At the outset of the workshop, statements on the Workshop Objectives were made by Doug Johnson, representing the ASDSO, Gus Tjoumas, for USCOLD, and David Bowles for the Organizing Group. Participants were then asked to state both their expectations for the workshop and issues that they would like to see addressed during the workshop. Input was collected from participants on index cards, read aloud by the facilitator, Dr David Harris, and displayed on a board at the front of the room. Participant input on expectations and issues is listed in Appendix D. No attempt was made to collate this input. However, items from both lists were incorporated into research needs categories at the end of the workshop. 
2.2.2 Review of the state-of-the-practice of dam safety risk assessment Presentations on the state-of-the-practice were made in the following three applications areas: 1) Failure Modes Identification (referred to as “Qualitative Approaches” in the agenda) 2) Portfolio Risk Assessment and Index Prioritization Approaches (referred to as “Prioritization and Portfolio Approaches” in the agenda) 3) Detailed Quantitative Approaches (referred to as “Quantitative Approaches” in the agenda) At the completion of presentations for each of these areas, input was collected on index cards from participants to address the following questions applied to each application area: 1) What are its strengths? 2) What are its limitations? 3) What are its Technology Transfer & Training Needs? 4) What are its Research and Development needs? Responses to Questions 1 and 2 formed the basis for the evaluation of the current state-of-the-practice in each application area. A preliminary categorization of strengths and weaknesses by the Organizing Group Chair was reviewed and revised at a meeting of the USSD Working Group on Dam Safety Risk Assessment at the June 2000 USSD Annual Lecture. The Working Group also divided inputs between the Index Prioritization and Portfolio Risk Assessment application areas. The results of the review of the state-of-the-practice in the four risk assessment application areas are summarized in Section 4.0. Detailed inputs are presented in Appendices E. 2.2.3 Identification of research needs Research needs were divided into two types as follows: Research and Development (R&D) needs and Technology Transfer & Training (T3) needs. Inputs for identifying research needs were obtained from the responses to Questions 1 – 4 (see Section 2.2.2) for each of the application areas, the participant’s inputs on expectations and issues, and other inputs, which were made at various other times, such as during question and answer sessions following presentations. All inputs were categorized, as described in Section 2.3.2. In reviewing T3 needs at a meeting of the USSD Working Group on Dam Safety Risk Assessment at the June 2000 USSD Annual Lecture, the Working Group suggested some additional T3 approaches, which were incorporated into workshop recommendations. 7 The identified T3 and R&D needs are summarized in Sections 5.0 and 6.0, respectively. Detailed inputs are presented in Appendices F, G, and H. An integrated research plan, which combined both T3 and R&D needs, was developed by the Organizing Group Chair and is presented in Section 7.0. This was also provided to the ICODS Research Subcommittee for consideration at its July 2000 meeting. 2.2.4 Recommendation of approaches for addressing needs Categorized research needs were prioritized following a process described in Section 2.3.3. These prioritizations are also presented in Sections 5.0 and 6.0. Small groups of participants provided suggestions for the ICODS Research Subcommittee to use in deciding how to follow-up on several priority research needs using a format presented in Section 2.3.4. The notes prepared by each group are presented in Section 6.6. 2.2.5 Summary of findings A consolidation session was held at the end of the Workshop to prepare a draft of the Summary of Workshop Findings. This draft was reviewed and revised at a meeting of the USSD Working Group on Dam Safety Risk Assessment at the June 2000 USSD Annual Lecture. 
A table of contents for this report was drafted during the consolidation session and the draft outline for the USSD Working Paper was reviewed and revised. Both the report and working paper outlines were further reviewed and revised at the June 2000 meeting of the USSD Working Group. 2.3 Strategic Planning Process Dr. David W. Harris from the U.S. Bureau of Reclamation Laboratories served as the facilitator for the Workshop. Dr Harris has served in this capacity for other FEMA Research Workshops. In all cases he has used a Strategic Planning Process, based on the IBM “MetaPlan” approach. The following description of the four phases of this planning process is adapted from a general description prepared by Dr. Harris. 2.3.1 The input phase Input from participants was collected on index cards, a few words per card. All participants did this simultaneously. The intent of this step was to collect as many ideas as possible from a fairly large group in a time efficient manner. The cards were collected by the facilitator as completed, or at any time during the session. The cards were read aloud by the facilitator and displayed on a board, sorted into columns of similar topics, at the front of the room. All participants were encouraged to take part in the interaction to determine which column to place each card in, although perfect distinctions were not necessary in this phase. 2.3.2 The research category identification phase With all cards sorted into columns, the test of distinction was to see if a heading could be established for each column. Some movement of initial cards was necessary during this process. New cards were added 8 at any time as participants thought of new ideas, wanted to clarify their previously submitted ideas, or found items that may belong in more than one category. The continued intention was to collect as much information as is possible in a limited time. The heading for any given column became a research category with different aspects or possible tasks detailed within the column. 2.3.3 The research category prioritization phase Participants were next asked to cast a total of ten votes for the importance that they associated with each category. Votes were recorded using ten glued dots that were placed by each participant on the board next to each column heading. Each participant was permitted to distribute their voting dots across all the categories. It was permitted to use as many as three dots for any one category to represent increased importance of that category to the participant. All votes were counted for each research category. The votes were used to create bar charts for the research categories as shown in Figures 5.1 in Section 5.0 and Figures 6.2 and 6.3 in Section 6.1. The larger the number of votes, the greater the importance that was assigned to a particular research category. A second vote took place based on the perceived difficulty of each research category. Difficulty could be interpreted to mean expensive, technically challenging, complex, or some other measure of difficulty, which the participant chose for any given category. In this case each participant assigned each and every research category a score between 0 and 10, with 0 being easy and 10 being really hard. Participant scores were averaged. These data provided a second dimension for prioritizing research categories. When plotted this produced a decision quad of the research categories. 
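As a rough illustration of how the two votes could be tallied, the short sketch below (in Python, with invented category names and votes rather than the workshop data) totals the importance dots and averages the difficulty scores for each research category; plotting the two scores against each other gives the decision quad described next, and the workshop's actual results are shown in the figures referenced in Sections 5.0 and 6.1.

```python
# Illustrative tally of the two-vote prioritization described above.
# Importance: each participant distributes ten dots (up to three per category).
# Difficulty: each participant scores every category from 0 (easy) to 10 (hard).
# Category names and votes below are hypothetical, not the workshop results.

importance_dots = {
    "Tolerable risk criteria": [3, 2, 3, 1],   # dots cast by four participants
    "Loss of life estimation": [2, 3, 1, 2],
    "Failure case histories":  [1, 0, 2, 3],
}

difficulty_scores = {
    "Tolerable risk criteria": [9, 8, 7, 9],
    "Loss of life estimation": [6, 7, 5, 6],
    "Failure case histories":  [3, 2, 4, 3],
}

for category in importance_dots:
    importance = sum(importance_dots[category])     # total dots received
    scores = difficulty_scores[category]
    difficulty = sum(scores) / len(scores)           # mean 0-10 difficulty score
    print(f"{category}: importance = {importance} votes, "
          f"difficulty = {difficulty:.1f}")
```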
The decision quad was formed by four quadrants of the "difficulty-importance" votes, each of which was given a descriptive name, as follows:
• Low Hanging Fruit - Easy and important
• Strategic Items - Hard and important
• Do Later - Easy but less important
• Consider - Hard and less important
The resulting decision quad is presented in Figure 6.3 in Section 6.1.

2.3.4 The research proposal development phase

Workshop participants chose a research category and then worked with others in small groups to further develop each research idea. This provided additional input for use by the ICODS Research Subcommittee. The suggested form of the input was to address six "W" questions, as follows: Who, What, Why, Where, When, and hoW. An example of the work sheet provided for this purpose is contained in Figure 2.2.

Topic developed for Research
Title: (describe the research item in 10 words or less)
Description:
a. Why is this a priority research item?
b. What is the expected outcome?
Project Tasks and Needs (What (tasks) is to be done and How (needs) is this problem to be solved?)
Project Lead and Contract:
a. Who is working in this area?
b. Who might be able to lead the project?
c. Who are good candidates to complete the work?
Figure 2.2. Example of work sheet provided for the Research Proposal Development Phase

3.0 Information Needs for Dam Safety Evaluation and Management

3.1 Introduction

As mentioned in Section 1.2, the information needs of different organizations can vary widely. It was therefore recognized by the workshop organizing group that it would be unrealistic to expect that any single approach to risk assessment would meet the needs of all organizations. An introductory workshop session was devoted to identifying "Information needs for dam safety evaluation and management" for the following six types of organizations: the government owner, the large private owner, the small private owner, the federal regulator, the state regulator, and the consulting engineer. The presentations made in this session are summarized below in Sections 3.2 – 3.7. No attempt has been made to adapt the presentations to fit a common format.

The facilitator led the participants in an exercise to summarize information needs. The result was a list, which is presented in Appendix E.1. Each of the major topics in the list was expanded into some notes following the format of Table 3.1. These notes are presented in Appendix E.2. Identified information needs were intended to be used by participants to form a broad basis for evaluating the strengths and limitations of a range of risk assessment approaches and for identifying research needs.
Table 3.1. Format for Notes on Information Needs
Information needs for dam safety evaluation and management
What: (Name of a need)
Who: (Needs this)
Why/When: (Do they need it)
Where will it be used: (In-house, public meetings)
How will it be used:

3.2 Government Owner Information Needs - John Smart, USBR, Denver, Colorado

• Risks associated with all dams owned
• Risks that should be reduced
• Risks that should be reduced in the short-term
• Risk management options that make most effective use of available resources in the risk identification and risk reduction processes
• Credibility in all of the above
• Uncertainties associated with all of the above
• Legal and political constraints that may affect the implementation of risk management actions

3.3 Large Private Owner – David Bowles [1], Utah State University/RAC Engineers & Economists

[1] An employee of a large private owner had been assigned the task of providing the perspective of a large private owner, but unfortunately he had to withdraw shortly before the workshop. Other participants who are associated with large private owners did not feel that they could address this topic at short notice, and so David Bowles provided this perspective. He based his contribution on the information needs that have been identified to him by large private owners for whom he has worked as a dam safety management consultant.

3.3.1 Regulatory environment

The regulatory environment in which a private dam owner operates can have a significant influence on the approach to dam safety management. Cases of hard, soft and no dam safety regulator are contrasted below:
• Hard – FERC, California, New South Wales Dam Safety Committee, Australia
- Regulatory requirements may completely determine dam safety program and fixes
• Soft – Utah, Victoria, Australia
- Less influence of regulatory requirements
- Greater flexibility in rate and extent of fixes
- BUT, what are the drivers?
• None – US Bureau of Reclamation, US Army Corps of Engineers, Tasmania, Australia
- No regulatory requirements
- AGAIN, what are the drivers?

3.3.2 Commercial context for dam safety decisions

A private dam owner must find a feasible approach to dam safety management within the various constraints and goals that determine the commercial context within which it exists, such as the following:
• Rate of return target
• Safety goal
• Pricing constraint
• Borrowings limit
This is illustrated in Figure 3.1.

[Figure 3.1 (not reproduced here) shows feasible and infeasible dam safety management alternatives bounded by the rate of return target, safety goal, pricing constraint, and borrowings limit.]
Figure 3.1. Illustration of the commercial context for identifying a feasible dam safety management program

3.3.3 Risk treatment options

From a business or management perspective, risk treatment options can be grouped into the following categories, although they are "not necessarily mutually exclusive or appropriate in all circumstances" [AS/NZS 1995]:
• Avoid the risk—this is a choice that can be made before a dam is built, or perhaps through decommissioning an existing dam.
• Reduce (prevent) the probability of occurrence—typically through structural measures, or dam safety management activities such as monitoring, surveillance, and periodic inspections.
• Reduce (mitigate) the consequences—for example, by effective emergency evacuation planning or by relocating exposed populations at risk.
• Transfer the risk—for example, by contractual arrangements or title transfer of an asset.
• Retain (accept) the risk—after risks have been reduced or transferred, residual risks are retained and may require risk financing (e.g., insurance).
Figure 3.2 illustrates these categories of risk treatment.

Figure 3.2. Categories of risk treatment

3.3.4 Outcome targeting

The right side of Figure 3.3 represents the information or outcome "targets" that can benefit a private owner's dam safety program and related business processes in addition to other stakeholders in dam safety decisions. Some dam owners focus only on externally imposed requirements such as those of a regulator or engineering standards or guidelines without giving adequate consideration to internal considerations such as business criticality or alternatives for replacing project functionality (e.g. dam decommissioning), which might be less costly than dam safety rehabilitation. It is important that an effective outcome targeting process be accomplished, for example, at the outset of the portfolio risk assessment (PRA) process. It is also important that the PRA process is adapted to meet the specific information needs associated with each portfolio of dams rather than develop a standard set of outcomes. Figure 3.3 also depicts the flow of information inputs into a PRA from activities that already exist in most dam safety programs (e.g. inspections, design reviews, etc.). It also shows the addition of specialized information, which may be needed to complete a PRA (e.g. inundation modeling and consequences estimation).

[Figure 3.3 (not reproduced here) is a flow diagram with three columns (existing dam safety information flows, additional risk information needs, and business information needs, the target) feeding into and out of the portfolio risk assessment. The boxes include inspections, design reviews (FMEA), the O&M program, emergency preparedness and response planning, monitoring and surveillance, inundation modeling, incremental consequence categories and consequences, the owner's dam safety assessment and improvement programs, and business processes such as capital budgeting/financing, risk management/insurance, due diligence and legal liability assessment, contingency planning and contractual obligations, and public relations and consultation.]
Figure 3.3. Capturing PRA inputs and targeting and integrating PRA outcomes into the owner's dam safety program and business processes.
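To make the idea of targeted PRA outcomes more concrete, the sketch below ranks a hypothetical portfolio both by estimated annualized life-loss risk and by the cost-effectiveness of candidate fixes. The dam names, probabilities, consequences, costs, and the 50-year horizon are invented for illustration and are not drawn from the workshop material; a real PRA would develop such inputs from the information flows shown in Figure 3.3 and would tailor the outputs to the owner's needs.

```python
# Illustrative only: a minimal portfolio ranking of the kind a PRA might target.
# All numbers are hypothetical.

dams = [
    # (name, annual failure probability, est. life loss, remediation cost $M,
    #  assumed fraction of risk removed if remediated)
    ("Dam A", 1e-4, 20, 12.0, 0.9),
    ("Dam B", 5e-6, 300, 30.0, 0.8),
    ("Dam C", 2e-3, 2, 4.0, 0.95),
]

results = []
for name, p_f, life_loss, cost_m, reduction in dams:
    annualized_life_risk = p_f * life_loss            # expected lives lost per year
    risk_reduced = annualized_life_risk * reduction   # lives/year saved by the fix
    # Rough cost per statistical life saved over an assumed 50-year horizon
    cpsl_m = cost_m / (risk_reduced * 50) if risk_reduced > 0 else float("inf")
    results.append((name, annualized_life_risk, cpsl_m))

# Rank dams two ways: by existing risk, and by cost-effectiveness of the fix
by_risk = sorted(results, key=lambda r: r[1], reverse=True)
by_cost_effectiveness = sorted(results, key=lambda r: r[2])

print("Highest annualized life-loss risk first:")
for name, risk, _ in by_risk:
    print(f"  {name}: {risk:.2e} lives/year")

print("Most cost-effective remediation first:")
for name, _, cpsl in by_cost_effectiveness:
    print(f"  {name}: ${cpsl:.1f}M per statistical life saved")
```

Rankings of this kind are only one possible target; as noted above, the outcome set should be adapted to each portfolio of dams and to the owner's business processes.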
3.3.5 Investment drivers – Information needs In summary, the drivers that can influence private owners’ dam safety decisions can include the following: • Regulatory Considerations - Breaches of regulations, legal requirements and licenses • Public Safety - Engineering Standards/Guidelines and Current Practice - Benchmarking - Risk-based guidelines - Benchmarking - As low as reasonably practicable (ALARP) principle - Extent of potential life loss - Community and political expectations • Legal Liability - Duty of care, due diligence - “Reasonable person” – benchmarking (timing and extent) -Negligence of owner - Engineer’s liability position 15 • Retention of Insurance Cover • Business Viability/Financial - Third party liability and cost of lawsuits - Organizational breakup, public enquiries, restrictive legislation - Effects on key business results areas -Loss of revenue generation - Competitive position, dividends - Opportunities forgone/postponed • Public Trust and Reputation - Customers - Extent of adverse impact on internal and external customers - External Perceptions - Extent of adverse community or political response on owner - Public consultation • Additional Factors: - Cost effectiveness of fix(es)/staging - Priority relative to other dams/assets - Opportunity for increased capacity - Effects of delays/staging -Non-structural options 3.4 Small Private Owner Information Needs – Jim Doane, Bureau of Water Works, Portland, Oregon 3.4.1 Discussion In order to gain the perspective of the small private dam owner, we need to determine what separates the small private dam owner from other dam owners. For the purpose of this discussion, I'll define the small private dam owner as a person or non-federal organization that owns no more than ten dams -- the dams can be small or large (of any hazard classification). For the purpose of this discussion, I am limiting it to the issues of small private dam owners who, if asked, would say and believe they are responsible dam owners. What separates small private dam owners from other dam owners is that the operation and maintenance of the dam is not the core business of the organization but a way for providing water for the core business. For the small private dam owner, the storage of water is a way of providing for the core business be it water supply for irrigation, water supply for municipal purposes, flood control, water for flow augmentation, or water for industrial purposes. Small private dam owners also store water for hydroelectric production, recreation, cooling, etc. One other characteristic of the small private dam owner is that their focus on the core business may lead to a situation where they do not understand the business and societal risks associated with the ownership, operation and maintenance of dams. They may look upon the risks much more casually than they should -- they may try to deal with them as underestimated normal business risks without even factoring in the societal risks. The lack of understanding of the risks and the nature of these owners (they tend to have small technical staffs) frequently leads to the situation where their staff is too small to have a resident dam expert available. It is even less likely that if there is a corporate risk manager, that risk manager will understand the risks associated with dams. It is unlikely that the majority of small private dam owner will have anyone with much knowledge of the concepts we are talking about in Logan today. 
16 I found that the power companies seem to have in-house staffs that have a very good handle on the technical issues and many of the risk issues that come with dam ownership. This may be the result of the power companies having sufficient technical staff with knowledge of dams, dealing successfully with other business risks (and regulators), and a basic understanding of risks of dams. Other small dam owners might have a person responsible for dams on their technical staffs but that person frequently had other work as their primary focus. Most small dam owners were dependent on the work of consultants to actually deal with the technical issues associated with dam ownership. Most small dam owners did not have an understanding of the societal risk issues or of the risk concepts we are talking about here. Fortunately, the federal and state regulators did a good job of bringing the potential problems to the attention of the small dam owner. What are the issues that tend to keep the owners of small private dams from understanding the issues inherent in having dams? I found that the owners, managers or boards are focused on their core business. They don't view their core business as having much to do with dam ownership. These owners seem to understand and generally fully appreciate the risks in what they view as their core business. They are striving to understand deregulation, new competitiveness, privatization, tight budgets, changes caused by endangered species listings, etc. New demands are placed on them every day. In this circumstance it is easy for them to just follow the lead of the regulators for dams as they follow the lead of regulators in so many aspects of their business. These owners tend to view the standard of the regulators as sufficient if not overly conservative. The safety record of dams may also lead them into a sense of security. 3.4.2 What are the information needs of the small dam owner? The small dam owner needs to know the basics of managing all the risks inherent in the operation of the business. Concerning dams, the owner needs to be able to: • Determine how to integrate or rely on someone who can integrate the commercial and societal risks of owning, operating, and maintaining dams into the overall risk management of the organization. • Understand or rely on someone who understands the societal risk of dam ownership and know the impact that not managing that risk could have on the organization. • Have knowledge of, or rely on someone who has a basic understanding of, mechanisms that result in common types of dam failures. • Have knowledge of or rely on someone who has knowledge of probability as well as the basic elements that go into the risk analysis of dams -- especially the limitations and uncertainties. 
• Understand that the amount of analysis that is required to address a specific problem is dependent on several factors: – The complexity of the problem being studied (generally the harder the problem, the more involved the analysis), – The reason the problem is being studied (is the analysis being internally or externally driven?), – The consequence of not managing the problem (does the failure result in the loss of a small amount of corporate resource or perhaps injury and losses to third parties?), – The degree of certainty desired (how sure does the owner need to be?), and – The amount of scrutiny anticipated by internal and external organizations and stakeholders 17 The owner should also understand that peer review of any significant analysis is always very desirable. The more complex the analysis and significant the outcome, the greater the need for peer review. We need to somehow convince the small dam owner that risk analysis of dams is important without seeming like just another demand on the owner's time. This is really a powerful tool that can be used to help the small dam owner make good corporate decisions … decisions that can protect what the owner believes is the core business of the organization. 3.4.3 Presentation notes Small Dam Owner • Generally an individual or organization that owns one or a few dams (<10) of any size or hazard classification. • Ownership, operation, and maintenance of the dams is not generally the core business of the organization. Dams are used to store water to provide for the delivery of the core business: • Water supply (irrigation, municipal, flood control, flow augmentation) • Hydroelectric power • Recreation, etc. Primary focus on their core business (i.e., Issues other than dams): • May not understand the business and societal risks associated with the ownership, operation and maintenance of dams. Only a few structures to deal with: • May not be able to have experts on staff or available as consultants to deal with emerging relatively sophisticated concepts such as risk assessment. Changing Environment: • Deregulation • Tight budgets • Endangered species listings • Elected board or chairperson who may not have the background in risk issues Issues of Owners: • Business and societal risks inherent in dam ownership may not be fully appreciated or understood. • Business risks or other issues associated with the core business fully appreciated and understood. • Standards of the regulators may be deemed sufficient. • Excellent safety record of dams may also cause a lack of appreciation for the risks. Information Needs of Small Dam Owners: 18 • Need to know the basics of risk management for all risks at dams. • Need to have access to or an understanding of: - Business risk of their core operations and relation to business risk of being the owner of dams. - Societal risk of being an owner of dams and the impact of not managing that risk. • Basic knowledge of the elements that go into a risk analysis for dams. • Limitations and uncertainties of the risk assessment process. Knowledge that the amount of analysis required must be related to: • Complexity of the problem • Reason the problem is being addressed • Consequences of not managing the problem • Degree of certainty desired • Scrutiny of internal and external organizations • Desirability of having the work reviewed by peers The owner should have: • Basic understanding of common definitions used in the risk analysis and evaluation of dams. 
• Basic understanding of the mechanisms that result in common types of dam failures. • Basic understanding of probability (in order to be able to interpret the results). • Understanding of the aversion of the general population to risk from dams. Conclusion: Risk assessment and risk evaluation can be used to help a small dam owner: • Learn about the business and societal risks of dam ownership. • Prioritize the various risks at a dam or for a group of dams. • Determine the relative risk of owning dams to other corporate risk. • Determine the overall risk that is acceptable. 3.5 Federal Regulator Information Needs - Dan Mahoney, FERC, Washington, D.C. Regulator Perspective: • There are benefits from risk assessment for dam safety evaluations Where Risk Assessment Could be Used Effectively: • Process gives a comprehensive, thorough evaluation of structure • Prioritization of risks for owners of many dams • Fixing dam safety deficiencies, which represent the highest risk first • More definitive understanding of “hazard” rating Dispel Notion: 19 • Risk assessment means not fixing dams Hurdles for Regulators: • Procedures and practices that are universal and accepted • Common understanding and definitions • Probabilities of extreme events are accurate and based on solid science • Impact on conclusions of “Low” probabilities of extreme events Major Hurdles for Regulators: • Concept of “allowable levels of Loss of Life” • Current methods of calculating Loss of Life from population at risk Challenge for Workshop: • Concept of “allowable levels of Loss of Life” • Current methods of calculating Loss of Life from populations at risk 3.6 State Regulator Information Needs 3.6.1 A state dam safety regulator’s perspective- Stephen Verigin, California Division of Safety of Dams, Sacramento, California 1. Need a procedure to quickly and easily classify dam safety risk. (Hazard classification rating.) 2. All dams that pose any potential loss of life and/or significant loss of property are high hazard. 3. Where there is (high) exposure to loss of life and/or property, use the very highest design requirements. 4. Use risk to identify problems but not as a basis for safety. 5. Establish a maximum size beneath which there is no risk to life or property. 6. Establish a minimum size above which the most conservative design standards should be used. 7. When using a hazard classification rating system to set work and resource priorities, do not assume that a low priority dam is safe. Accept it as a low priority with respect to risk exposure. 8. Most states must show that there is an actual threat to life and property and then must ensure that dams are designed and constructed with a reasonable factor of safety against failure. 9. Do not use risk analysis to avoid making necessary (and costly?) repairs. Owners have options of operating safe dams or removing them from service. A third alternative should not be placing life or property in peril because the cost of repair is too high. 20 10. Do not depend on emergency action plans or early warning systems to save lives. Time of failure, duration of failure, and complexity of evacuations prevent this from being a safety feature. An EAP is a response feature that will hopefully limit losses. 11. Risk analysis is not used in the design of new dams. Why is it appropriate for use on existing dams? Methodology 1. The database of dam failures, when used to predict where problems will occur in the future, is not a strong tool. 
It is most likely a measure of past engineering standard deficiencies, undeveloped technology, or poor design and construction practices. It is not a measure of random phenomena (i.e., piping is more likely in nature than rare storm events).
2. The numbers used to calculate the probabilities used in risk analysis are subjective, leading to results that have a very weak link to actual probabilistic forecasts. Good engineering judgment and a proactive inspection program are much more reliable.
3.6.2 Another state dam safety regulator's perspective – Doug Johnson, State of Washington, Olympia, Washington
In general, I would agree that Mr. Verigin's comments apply to most of the state dam safety programs. However, there are a few states that utilize risk-based standards, such as Washington and Montana. Furthermore, I think that all states could benefit from knowing what level of risk their standards provide, even if they use deterministic standards. A key issue is using percent-PMP as a design event for smaller dams where loss of a few lives is possible. Once you move away from PMP you have no idea what level of protection is provided, unless you can determine the probability of the percent-PMP event. Thus, since some of the states use percent-PMP as a design standard, they are already accepting risk, only they have no idea of what level of risk they are facing! While I think it would be far more useful to approach this from the risk side and determine "acceptable risk" for these smaller dams, I understand that some states are not comfortable with this concept. However, all states could benefit from understanding the risks posed by their dams in decision-making. Based on these points, I submit the following comments (in italics) on Mr. Verigin's points.
1. Need a procedure to quickly and easily classify dam safety risk. (Hazard classification rating and dam break analysis)
2. All dams that pose any potential loss of life and/or significant loss of property are high hazard. Although this is now the federal definition, not all states follow this. Washington still has a significant hazard rating with 1 or 2 homes at risk. I know several states that have this set in their regulations.
3. Where there is (high) exposure to loss of life and/or property, use the very highest design requirements. Agreed, but the highest design requirements shouldn't kick in where only a few lives are at risk. This is why most states use percent-PMP for smaller dams with a few lives at risk.
4. Use risk to identify problems but not as a basis for safety. Many states may feel this way, but not Montana and Washington. And actually, once the states allow percent-PMP as a design event where lives are at risk, they are accepting risk as a basis for safety. However, we don't know in most cases what level of risk a percentage of PMP gives. This is a very important area where research is needed.
5. Establish a maximum size beneath which there is no risk to life or property. This would be nice, but it really all depends on the project. I have some six-foot high dams that are riskier than 20-foot high dams.
6. Establish a minimum size above which the most conservative design standards should be used. The hazard setting should also be considered.
7. When using a hazard classification rating system to set work and resource priorities, do not assume that a low priority dam is safe. Accept it as a low priority with respect to risk exposure. Agreed.
8.
Most states must show that there are an actual threat to life and property and then must ensure that dams are designed and constructed with a reasonable factor of safety against failure. Agreed, but the problem is defining “reasonable". There are probably 50 different opinions on this one. I think it would be very useful to the states to know what probability is associated with their specified design levels. That would really help in decision-making. 9. Do not use risk analysis to avoid making necessary (and costly?) repairs. Owners have options of operating safe dams or removing them from service. A third alternative should not be placing life or property in peril because the cost of repair is too high. I understand this is a feeling shared by many critics of risk analysis. It's viewed a way of getting out of doing anything at a dam. Again, for very large dams with thousands of lives at risk, I agree wholeheartedly. But most of the dams regulated by the states fall into the gray area, small dams with a few lives at risk. The standards set for these smaller dams can be determined by the level of risk posed, not by an arbitrary percentage of a design event. By allowing anything less than full PMP/MCE, the states are tacitly accepting something other than near-zero risk. 10. Do not depend on emergency action plans or early warning systems to save lives. Time of failure, duration of failure, and complexity of evacuations prevent this from being a safety feature. An EAP is a response feature that will hopefully limit losses. Agreed 11. Risk analysis is not used in the design of new dams. Why is it appropriate for use on existing dams? Actually, in Washington and partially in Montana, our design standards are based on risk. However, this is still a good question. Methodology 1. The database of dam failures, when used to predict where problems will occur in the future, is not a strong tool. It is most likely a measure of past engineering standard deficiencies, undeveloped technology, or poor design and construction practices. It is not a measure of random phenomenon (i.e. piping is more likely in nature than rare storm events). Agreed. 2. The numbers used to calculate the probabilities used in risk analysis are subjective, leading to results that have a very weak link to actual probabilistic forecasts. Good engineering judgment and a proactive inspection program are much more reliable. This depends on which probabilities we are considering. For the triggering events such as floods and earthquakes, we can get fairly good statistical estimates of the probability, out to maybe 1 in 5,000 or even 1 in 10,000. For the other failure modes, I agree that they are subjective. However, engineering judgment is very subjective, isn’t it? 22 3.7 Consulting Engineer Information Needs - John W. France, URS Corporation, Denver, Colorado Consulting Engineer’s Roles: • Technical Adviser • Technical Problem Solver • Technical Advocate • Designer • Construction Manager Whose Risk is it Anyway? • Risks, and rewards, are the Owner’s. • Engineer needs to keep his risks balanced with his rewards. Standard of Care: • Services same as provided by similar professionals at the same time and same location. • Importance of established standards of practice for risk analysis. 
Research and Practice Needs:
• Guidelines for risk assessment for dams: Standard of care
• Greater acceptance of risk by the public and its representatives: buy-in
• Establishing accepted levels of risk
• Improved Tools
• Loss of life estimates
• Case history compilations
• Expanded databases of failures and incidents
• Methods for assessment of seepage risks
• Verification/Confidence Building
• Parallel risk assessments of same cases
4.0 Assessment of State of the Practice
4.1 Introduction
Four risk assessment application areas were discussed at the workshop, ranging from qualitative to quantitative approaches, and progressing from more generalized approaches to approaches requiring more detailed analyses. The four application areas were as follows:
• Failure Modes Identification Approaches (Qualitative Approaches)
• Index Prioritization Approaches
• Portfolio Risk Assessment Approaches
• Detailed Quantitative Risk Assessment Approaches
Lists of strengths and limitations of the four risk assessment application areas are presented in this section. Detailed participant input for each strength and limitation listed below is presented in Appendix I. Detailed input from individual participants was grouped into the listed categories using the procedure described in Section 2.2.2. These categories are listed in this section in decreasing order of the number of participant comments in each category. Bar charts of the number of comments for strengths and limitations are presented for each application area. As with other parts of this report, no commentary on the results of participant input is intended.
4.2 Failure Modes Identification Approaches (Qualitative Approaches)
Failure Modes Identification (FMI) applied to a dam is a procedure by which potential failure modes are identified. A failure mode is a sequence of system response events, triggered by an initiating event, which could culminate in dam failure. Procedures for FMI vary, but in a typical approach a small team of dam engineers, who have knowledge of historical dam failure mechanisms, would develop a list of failure modes. The form of the FMI outcome may vary from simply the list of failure modes to a tabulation that lists associated effects, consequences, compensating factors, and risk reduction measures. In some cases an event tree or other graphical representation of failure modes may also be included. FMI normally does not include quantification of risks. It is therefore, by itself, not a risk analysis, although it is one of the first steps in performing a dam safety risk analysis. Examples of FMI, which were presented at the workshop by VonThun and Anderson, are included in the workshop proceedings.
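Where the FMI outcome is recorded as a tabulation, it amounts to a structured register of failure modes and their attributes. The sketch below is illustrative only; the field names and the single example entry are hypothetical and are not drawn from any dam or case study presented at the workshop. Consistent with FMI being a qualitative step, the register carries no probability estimates.

```python
# Minimal, hypothetical failure-modes register of the kind described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class FailureMode:
    description: str                      # short name for the failure mode
    initiating_event: str                 # e.g., a flood or earthquake load
    response_sequence: List[str]          # system response events leading to failure
    consequences: str                     # what is lost if the sequence completes
    compensating_factors: List[str] = field(default_factory=list)
    risk_reduction_measures: List[str] = field(default_factory=list)

register = [
    FailureMode(
        description="Overtopping leading to embankment breach",
        initiating_event="Reservoir inflow exceeding spillway capacity",
        response_sequence=[
            "Reservoir rises above dam crest",
            "Erosion of downstream slope",
            "Headcutting and breach formation",
        ],
        consequences="Uncontrolled release; downstream population and property at risk",
        compensating_factors=["Available freeboard above the maximum recorded pool"],
        risk_reduction_measures=[
            "Enlarge spillway capacity",
            "Raise dam crest",
            "Strengthen early-warning and evacuation procedures",
        ],
    ),
]

# Print a compact, qualitative summary of each identified failure mode.
for fm in register:
    print(f"{fm.description}: {fm.initiating_event} -> " + " -> ".join(fm.response_sequence))
```

A register in this form can later be extended with likelihood and consequence estimates if the owner proceeds to an index, portfolio, or detailed quantitative risk assessment.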
4.2.1 Strengths
Figure 4.1 shows the number of participant comments that were grouped under each of the following categories, in descending order of the number of comments received in each category:
• Failure modes paradigm
• Relatively low effort
• Broad interdisciplinary team approach
• Enhances understanding
• Wide acceptability
• Strengthens traditional approach/Quality Assurance
• Identifying additional information needs
• Aids in prioritization of issues
• Aids in communicating risks
• Tool for achieving integration of dam safety program
• Aids in identification of risk reduction measures
• Systematic approach
4.2.2 Limitations
Figure 4.2 shows the number of participant comments that were grouped under each of the following categories, in descending order of the number of comments received in each category:
• Qualitative - risk, ranking, compare with other dams, confidence/uncertainty
• Repeatability, consistency, influence of team members
• Lack of available guidance
• Cost
• Limited case histories to use as basis for FM identification
• Not a public-oriented process
• Requires information on dam
4.3 Index Prioritization Approaches
An index prioritization approach is a means of quickly ranking dams for addressing dam safety issues. The ranking is based on an index, calculated from a combination of weights that are assigned to capture various attributes of identified dam safety deficiencies. The attributes and ranking procedures are usually prescribed in order to form a common basis for ranking between dams. These approaches are best used as an initial screening of a portfolio of dams, or as a comparison with other forms of risk analysis. An example of an index prioritization approach that was presented at the workshop is the USBR's "Risk Based Profiling System" (USBR 2000). (A minimal illustrative sketch of this kind of index calculation follows the figures below.)
4.3.1 Strengths
Figure 4.3 shows the number of participant comments that were grouped under each of the following categories, including a comparison with portfolio risk assessment, in descending order of the number of comments received in each category for index prioritization approaches:
• Prioritization
• Efficient process
• Defensibility
• Justification
• Communication
• Systematic process
• Identification of dam safety issues
• Integrates dam safety program into overall business
Figure 4.1. Strengths of failure modes identification approaches. (Bar chart of the number of participant comments in each strength category; chart not reproduced.)
Figure 4.2. Limitations of failure modes identification approaches. (Bar chart of the number of participant comments in each limitation category; chart not reproduced.)
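As a concrete illustration of the weighted-index idea described in Section 4.3 above (and not a description of the USBR Risk Based Profiling System or any other published scheme), the sketch below computes a screening index from attribute weights and scores. All attribute names, weights, and scores are invented for illustration.

```python
# Hypothetical screening-index sketch; attributes, weights, and scores are
# invented and do not reproduce any published prioritization system.
ATTRIBUTE_WEIGHTS = {
    "spillway_adequacy_deficiency": 3.0,
    "seepage_condition": 2.5,
    "seismic_setting": 2.0,
    "population_at_risk": 4.0,
}

def screening_index(scores: dict) -> float:
    """Weighted sum of 0-10 attribute scores; a higher index means higher priority."""
    return sum(ATTRIBUTE_WEIGHTS[name] * scores.get(name, 0.0) for name in ATTRIBUTE_WEIGHTS)

# Invented example scores for two dams in a portfolio.
dams = {
    "Dam A": {"spillway_adequacy_deficiency": 7, "seepage_condition": 4,
              "seismic_setting": 2, "population_at_risk": 8},
    "Dam B": {"spillway_adequacy_deficiency": 3, "seepage_condition": 6,
              "seismic_setting": 5, "population_at_risk": 3},
}

# Rank the dams for further investigation, highest index first.
for name, scores in sorted(dams.items(), key=lambda kv: screening_index(kv[1]), reverse=True):
    print(f"{name}: screening index = {screening_index(scores):.1f}")
```

Because such an index produces only a relative ranking rather than a risk estimate, it supports prioritization of investigations but does not, by itself, demonstrate that any individual dam is safe, which is consistent with the limitations listed in Section 4.3.2.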
4.3.2 Limitations
Figure 4.4 shows the number of participant comments that were grouped under each of the following categories, including a comparison with portfolio risk assessment, in descending order of the number of comments received in each category for index prioritization approaches:
• Danger of misusing results
• Not in-depth risk analysis
• Lack of published guidance
• Relative rather than absolute
• Defensibility
• Risk metric
• No sign off
4.4 Portfolio Risk Assessment Approaches
Portfolio risk assessment (PRA) involves the reconnaissance-level application of the identification, estimation, and evaluation steps of dam safety risk assessment to a group of existing dams and risk reduction measures. The outcomes include an engineering standards assessment and risk profile for the existing dams, and a basis for developing and cost-effectively prioritizing risk reduction measures and supporting investigations. Other outcomes can be used to strengthen the owner's monitoring and surveillance program, and to provide inputs to various business processes, such as capital budgeting, legal evaluations, loss financing, and contingency planning. An example of PRA presented at the workshop was based on Bowles et al. (1999).
4.4.1 Strengths
Figure 4.3 shows the number of participant comments that were grouped under each of the following categories, including a comparison with index prioritization, in descending order of the number of comments received in each category for portfolio risk assessment approaches:
• Prioritization
• Cost effectiveness risk reduction program
• Justification
• Communication
• Defensibility
• Risk metric
• Efficient process
• Identification of dam safety issues
• Integrates dam safety program into overall business
• Systematic process
4.4.2 Limitations
Figure 4.4 shows the number of participant comments that were grouped under each of the following categories, including a comparison with index prioritization, in descending order of the number of comments received in each category for portfolio risk assessment approaches:
• Danger of misusing results
• Not in-depth risk analysis
• Cost
• Lack of published guidance
• Defensibility
• No sign off
• Relative rather than absolute
Figure 4.3. Comparison of strengths of index prioritization and portfolio risk assessment approaches. (Bar chart of the number of participant comments in each strength category for the two approaches; chart not reproduced.)
Figure 4.4. Comparison of limitations of index prioritization and portfolio risk assessment approaches. (Bar chart of the number of participant comments in each limitation category for the two approaches; chart not reproduced.)
4.5 Detailed Quantitative Risk Assessment Approaches
A detailed quantitative risk assessment comprises the steps of risk identification, estimation, and evaluation. The purpose of performing a detailed quantitative risk assessment is typically to provide insights into the adequacy of an existing dam, or to provide justification for risk reduction measures. Different owners vary in the level of detail that they require, but none rely on risk assessment alone for making such decisions.
Two examples of detailed quantitative risk assessments were given in the workshop, based on Dise and Vick (2000) and McDonald and Wan (1998).
4.5.1 Strengths
Figure 4.5 shows the number of participant comments that were grouped under each of the following categories, in descending order of the number of comments received in each category:
• Valuable as a decision tool
• Quantification using risk metric
• Understanding of failure modes
• Uncertainties considered
• In-depth supporting analyses
• Team process
• Defensibility
• Risk criteria evaluation
• Transparency in engineering judgments
4.5.2 Limitations
Figure 4.6 shows the number of participant comments that were grouped under each of the following categories, in descending order of the number of comments received in each category:
• Lack of standardized procedure and experienced practitioners
• Acceptable/tolerable risk criteria not agreed
• Uncertainty in estimating probabilities and life loss
• Communicating uncertainties to decision makers and others
• Cost
• New and complex terminology
Figure 4.5. Strengths of detailed quantitative risk assessment approaches. (Bar chart of the number of participant comments in each strength category; chart not reproduced.)
Figure 4.6. Limitations of detailed quantitative risk assessment approaches. (Bar chart of the number of participant comments in each limitation category; chart not reproduced.)
5.0 Technology Transfer and Training Needs
The prioritized technology transfer and training (T3) needs categories resulting from the procedure described in Section 2.3 are listed in Table 5.1. The risk assessment application area for each need is indicated, together with some suggested modes of technology transfer and training suited to each need. A bar chart of the importance of the T3 needs, based on the number of votes for each need, is presented in Figure 5.1. Needs with fewer than three votes were omitted in this section, but are included in Appendix J. Numerical codes in the second column provide a means of tracking the categorization process that the group followed under the lead of the facilitator.
Figure 5.1. Importance of T3 needs based on the number of votes for each need. (Bar chart of the number of votes for each T3 need category; chart not reproduced. Categories shown: dam safety community should interact with DOE and NRC on QRA; produce a life safety criteria discussion paper, exhibit publicly and invite submissions; compilation of case histories; documentation of state-of-the-practice and training workshops; regular program for operator training; more experience by more people; demonstration projects; risk indexing and prioritization approaches for state regulators and owners with limited resources; tools for owners with limited resources; sharing experience on PRA with others, how well the process worked, and what should be changed; build FMI into standards-based reviews (this will economize resources); training in understanding probability and skills such as expert elicitation; guidelines for what constitutes a Portfolio Risk Assessment and how it may be done; wider use of Failure Modes Identification thinking and current expertise in this area.)
Table 5.1. Prioritized Technology Transfer and Training Needs
6.0 Research and Development Needs
6.1 Introduction
The prioritized research and development needs categories resulting from the procedure described in Section 2.3 are listed in Table 6.1. The table includes importance and difficulty votes and an assignment to one of the following quad categories:
• Low Hanging Fruit - Easy and important
• Strategic Items - Hard and important
• Do later - Easy but less important
• Consider - Hard and less important
The decision quad, formed by the four quadrants of the "difficulty-importance" votes, is presented in Figure 6.3. Bar charts of the importance and difficulty of research categories are presented in Figures 6.1 and 6.2, respectively. Although not part of the original MetaPlan approach, research categories were ranked, separately within each quad category, by using a combination of the importance and difficulty votes, obtained as follows:
Overall rank = i * (10 - d)
in which i = number of votes received based on the importance of the research category, and d = average difficulty score on a scale of 0 to 10, with 0 being easy and 10 being very hard. (For example, the Tolerable Risk/Criteria category in Table 6.1, with i = 49 and d = 7.14, receives an overall rank of 49 * (10 - 7.14), or approximately 140.)
Ranking by this approach took place after the workshop and so, although it is based solely on the input of workshop participants, it was not available at the time of the workshop. The ICODS Research Subcommittee may find this ranking helpful, but should not feel bound by this within-quad category ranking when they select projects for funding. The priority assigned through this process to each of the research categories is shown in the first column of Table 6.1. A footnote in the first column for several research categories indicates that, after all participant input was sorted by the facilitator with the consensus of the participants, no input was left under these research categories. This may have occurred, for example, because a category was grouped with another category. The letters in the second column are a code that is used to refer to research categories. Other numerical codes are left in the description column in order to provide a means of tracking the categorization process that the group followed under the lead of the facilitator. In Sections 6.2 – 6.5, respectively, the input provided by participants is listed for each of the research categories under each of the four decision quads. The input is reproduced as provided by participants with no attempt to interpret it or present it in a uniform format.
Table 6.1. Prioritized Research and Development Categories
Priority | Code | Description | Importance (i) | Difficulty (d) | i*(10-d) | Category | Interpretation of Category
1 | F | 7,18,19 - Prioritization and Portfolio Tools | 32 | 4.13 | 188 | Low Hanging Fruit | Easy and Important
2 | K | 13 - Database of Failure Case Histories | 21 | 3.5 | 137 | Low Hanging Fruit | Easy and Important
3 | B | 2,6 - Tolerable Risk/Criteria | 49 | 7.14 | 140 | Strategic Plan | Hard and Important
4 | M | 15 - Flood Loading | 26 | 5.9 | 107 | Strategic Plan | Hard and Important
5 | G | 8 - Earthquake Response | 26 | 6.5 | 91 | Strategic Plan | Hard and Important
6 | I | 10,21 - Improve Loss of Life Estimates | 24 | 6.25 | 90 | Strategic Plan | Hard and Important
7 | J | 12 - Risk Communication | 22 | 6 | 88 | Strategic Plan | Hard and Important
8 | C | 3 - Subjective Probability | 20 | 5.7 | 86 | Strategic Plan | Hard and Important
9 | E | 5 - Uncertainty | 14 | 4 | 84 | Do Later | Easy but Less Important
10 | N | 16 - Risk Process | 13 | 4.8 | 68 | Do Later | Easy but Less Important
11 | D | 4 - Skills to Identify Failure Modes | 7 | 2.9 | 50 | Do Later | Easy but Less Important
12 | A | 1 - Standards | 5 | 3.4 | 33 | Do Later | Easy but Less Important
a | P | 20 - Debate Mechanisms | 3 | 4.1 | 18 | Do Later | Easy but Less Important
14 | H | 9 - Static Response | 1 | 3.33 | 7 | Do Later | Easy but Less Important
15 | S | Portfolio - Learn to Improve | 0 | 3.79 | 0 | Do Later | Easy but Less Important
a | T | 26 - Debate Concepts | 0 | 3.4 | 0 | Do Later | Easy but Less Important
17 | L | 14 - Earthquake Loading | 13 | 6.40 | 47 | Consider | Hard and Less Important
a | O | 17 - Analyze NPDP | 9 | 5.5 | 41 | Consider | Hard and Less Important
a | R | 24 - Include Failure Modes Identification in schools | 7 | 6.65 | 23 | Consider | Hard and Less Important
a | Q | 22 - Communicate Best Practice | 1 | 5 | 5 | Consider | Hard and Less Important
a) No input was provided by workshop participants on these needs and so they were dropped from the list of priorities.
b) Needs with descriptions in bold were developed into a brief research proposal at the workshop.
Figure 6.1. Bar Chart for Importance of Research Category. (Bar chart of the number of importance votes for each research category; chart not reproduced.)
Figure 6.2. Bar Chart for Difficulty of Research Category. (Bar chart of the average difficulty score for each research category; chart not reproduced.)
(Decision quad plot of research categories A through T, positioned by difficulty and importance and divided into the four quadrants: Low Hanging Fruit, Strategic Plan, Do Later, and Consider; plot not reproduced.) Figure 6.3.
Decision Quad for Research Categories 37 Legend Standards Tolerable Risk/Criteria Subjective Probability Skills to Identify Failure Modes Uncertainity Prioitization and Portfolio Tools Earthquake Response Static Response Improve Loss of Life Estimates Risk Communication E Data Base of Failure Case Histories Earthqauke Loading Flood Loading Risk Process Analyze NPDP 6.2 Low Hanging Fruit - Easy and Important 6.2.1 Priority 1 – (7, 18, 19) Prioritization and portfolio tools (F) • Develop guidelines for prioritization and portfolio approach • Develop simple, easy to use approach that will gain general acceptance • Most state dam safety programs have no program for profiling and prioritization. Consider developing index system that state dam safety programs could use for profiling dams that they regulate • Check USBR index system against portfolio method and try to assess how effective it is and whether it is good for state officials • Are rating points systems worth doing without Failure Modes Identification procedure? There is a high chance of missing the critical issue. • Can a prioritization index system be consistent with a risk (metric) analysis approach? • Can portfolio risk assessment be used for prioritization of known deficiencies (e.g. as opposed to USBR prioritization)? 6.2.2 Priority 2 – (13) Database of failure case histories (K) • Case history compilations needs to be parameter specific 6.3 Strategic Plan - Hard and Important 6.3.1 Priority 3 – (2, 6) Tolerable risk/criteria (B) • Is legislative intent to get to zero risk to life? • State legislation says, “remove the risk”; implies that there could be zero risk. Not possible. • Regulators need to educate government that “safe” means a low probability of failure, not "no chance" of failure. • Public aversion or intolerance to imposed risks • The public is extremely risk adverse about dams. How can you get acceptance of risk levels given that? • Who decides “RP” in ALARP? • Who decides what is tolerable risk for dams? • How do we get public input for risk criteria/public protection guidelines? • Who will (should) establish life safety criteria? Is it practical for them to do so? • Obtain public & political input to debate on acceptance limits • Tolerable risk criteria as an interim step on the constant path of risk reduction • Accepted level of risk is an organization to organization, case by case, aspect • The FEMA requirements are impossible if followed rigidly • Legislators, not the Regulators should determine risk level accepted. • Dams are only one piece of society’s risk pool. • Strive for consistent risk. • Is it reasonable to rely on warnings and evacuation as a risk reduction measure? • EAP vs. fixing dam • EAPs not a substitute for structural fix • Engineers + Lawyers = inferior dam safety decisions 38 • Who could go to jail if the dam fails? • Acceptance of loss of life? • If risk is the owner’s what does this mean for non-owner beneficiaries to share risk? • <1 lives/yr does not communicate with the public. Why aren’t we looking at calculating the probability of one or more lives lost by a particular event, then ask what the acceptable probability for public would be? • What to do if repair may pose more risk than existing conditions (no-fix)? 6.3.2 Priority 4 – (15) Flood loading (M) • Regional analyses of extreme precipitation probabilities for entire U.S. 
– allows states to estimate % PMP probabilities • Extreme event probability determination improvement • Reduce uncertainty in hydrologic process evaluation • Continued support for development of methods for processing hydrologic information for characterizing extreme floods • Development of procedures for better understanding and incorporating uncertainty in characterization of floods • Comprehensive program for collection of climate flood and paleoflood data on regional basis to support regional analyses • Studies to investigate spatial distribution for large watersheds using probabilistic methods • Confidence in extreme event estimates • Variability in PMF computations of uncertainty of parameters 6.3.3 Priority 5 – (8) Earthquake response (G) • Develop more realistic earthquake displacement and liquefaction models. • Develop better methods for structural response of: - Concrete gravity dams in earthquake - Embankment stability - Piping, static and post earthquake • RA is very good where there is no standards-based analytical tool e.g. Navaho Drain Tunnel • Factor of safety vs. probability of failure. Need conservative strengths for FS = 1.5 to represent low probability of failure. • Inter-related failure modes • Does number of steps included in event tree fundamentally affect resulting probability? • Length of dam and number (?) effects on probability estimates • Develop capability to derive failure probability analytically • Need to improve understanding and ways to predict system response probabilities • Failure mechanism understanding and modeling • Develop failure models that use probabilistic input both for loads and resistance • Adapt failure models for nodes of event trees 6.3.4 Priority 6 – (10, 21) Improve loss of life estimates (I) • Improve life loss estimation • LOL estimate should consider EAP • Assessment of evacuation capability for large population centers 39 • Develop procedures to understand and assess the effectiveness of EAP/EPP • Role of EAP in loss of life estimates • Long term effectiveness of warning and evacuation systems • Relationship between life loss and proximity to the dam? • Improve confidence of loss of life estimates 6.3.5 Priority 7 – (12) Risk communication (J) • What do the numbers resulting from quantitative risk analysis really mean? • Hazard (seismic), Hazard (downstream): 1) drop both uses, 2) use Seismic loading, 3) use consequence • Can we build public confidence in life loss estimates? • Owners: be able to defend what you are doing as being reasonable and prudent • Need for common language between technical specialists and international (English (geotech) . English (financial) . English (probability) . English (international) . English (seismic) . English (H&H) . English (owner) . English (lawyers). • Public buy-in for risk-based decisions 6.3.6 Priority 8 – (3) Subjective probability (C) A. Immediate • Develop an improved understanding of probability interpretations and corresponding expectations of those using, interpreting, or considering quantitative methods. • Develop better ways for adapting criteria to probability (rather than vice-versa) and operating within its capabilities. B. Intermediate-term • Education and training of probability assessors in cognitive processes, heuristics and biases. • Development and application of de-biasing techniques adapted in positive ways to how people think and how they conceptualize subjective uncertainty judgments. • Education and training in basic probability theory (axioms, etc.) C. 
Longer-term • Improve judgment of probability assessor - What is judgment? - How does substantive expertise differ from normative expertise? - Role of inductive vs. deductive reasoning strategies - How is judgment enhanced? • Adapt and merge ongoing R&D from institutions, e.g. Stanford University regarding human thought processes • What is the value to the public of subjective probability estimates? • Dam response probability subjective estimate divergence theory: If team thinks failure mode is a problem based on discussion, then the subjective value is higher. If team thinks failure mode is not a problem then subjective estimate is lower. • Effects of distributions on event probability estimates. • Uncertainty analysis approaches beginning from probability estimation, failure mode identification through presentation of outcomes to decision makers • Compare on equal basis judgment and unknowns for loads, responses and life loss. • Assess repeatability – considering uncertainty ranges (not just point estimates) • How do we reflect uncertainty in perfect history database? 40 • How do amount and quality of data affect confidence in RA results? 6.4 Do Later - Easy but Less Important 6.4.1 Priority 9 – (5) Uncertainty (E) • LUMPED WITH PRIORITY 9 6.4.2 Priority 10 – (16) Risk process (N) • Long dams; multiple dam reservoirs need probabilistic concepts to be ‘correct’ • Repeatability: (even for qualitative methods) 6.4.3 Priority 11 – (4) Skills to identify failure modes (D) • Change paradigm for quantitative risk analysis 6.4.4 Priority 12 – (1) Standards (A) • All Civil Engineering is empirical, therefore, it is risk based! FS = 1.5 means low risk, not zero risk. • How do the new computer tools encroach on FS in standards based designs and how does this change 100 yr database? • ~ 1 in 100 dams fail • How do we change standards without addressing risk? • Dams with no possibility of life loss • Large dams that must meet PMF and MCE • What is a “reasonable FS”? Is the MCE adequately conservative? • Parallel risk assessments of the same dam • Incentive/need to undertake risk assessment if dams meet standards regulations • Is a standards approach a zero risk approach? • Subjective probabilities bad for quantitative RA but OK for standards? • Standards . restrictive thinking • Failure modes identification should always be performed • Missing failure modes • Also a problem with defensibility of standards • How do engineering/subjective judgments affect traditional approach outcomes vs. risk-based approach outcomes? • Risk seems to add to short comings of standards approach as opposed to avoid (parameter uncertainty analysis) • New dams vs. existing dams 6.4.5 Priority 13 – (9) Static response (H) • Improve estimates of failure probabilities for static stability piping failure, etc. • Research needed to develop better models for: - Failure - Piping 41 - Loss of life • How confident are we in characterization of piping failures – embankment, foundation, etc.? • Seepage rate is not a good guide to problems. Changes, not correlating with reservoir is better. • Piping failures take less than 24 hrs, mostly < 6 hrs, to develop. They historically occur at reservoir level = 1m below historic high level. • Develop risk analysis procedures to account for time-dependent aspects of piping. • RA is very good where there is no standards based analytical tool e.g. Navaho Dam Tunnel • Factor of safety vs. probability of failure. Need conservative strengths for FS = 1.5 to represent low probability of failure. 
• Inter-related failure modes • Does number of steps included in event tree fundamentally affect resulting probability? • Length of dam and number (?) effects on probability estimates • Develop capability to derive failure probability analytically • Need to improve understanding/ways to predict system response probabilities • Failure mechanism understanding and modeling • Develop failure models that use probabilistic input both for loads and resistance • Adapt failure models for nodes of event trees 6.4.6 Priority 14 - Portfolio - Learn to improve (S) • Learn how to improve PRA by evaluating changes resulting from updating • More input from users on information needs 6.5 Consider - Hard and Less Important 6.5.1 Priority 15 - Earthquake loading (L) • Reduce uncertainty and minimize compounding of conservatism in earthquake risk assessment • Earthquake loads need: - Additional data collection – slip rates - Site response data - Recurrence models - Robust estimates of time histories for use in RA - Better integration with engineering analyses - Portray uncertainty in an understandable fashion • Characterize AEP of earthquake loading using magnitude as well pga • Reduce errors in catalogue of recorded earthquake accelerations (data cleaning) • Uncertainties in recurrence characteristics for known faults 6.6 Research Proposals Following the format of the worksheet presented in Figure 2.2, small groups of participants prepared some suggestions for several of the higher priority research categories for consideration by the ICODS Research Subcommittee. Time was quite limited for this activity. A recommendation would be that more time be assigned to this activity in future Research Specialty Workshops. 42 The completed worksheets are presented below for eight research categories. These are indicated in a bold typeface in Table 6.1. Content varies depending on the group that prepared them. 43 Title: PRIORITY 1 -Develop Guidelines for profiling and prioritization system Description: a. Why is this a priority research item? - For most state dam safety agencies, no prioritization system in place - Provide for improvement in overall national dam safety by providing tool for states to prioritize unsafe dams - Allows for a national assessment of safety of dams and to show year to year improvement for the NDSP (National Dam Safety Program) b. What is the expected outcome? - Greater efficiency in fixing dams that pose greatest risk - Improved national dam safety - Project Tasks and Needs: 1. Hire contractor 2. Compile info on existing prioritization systems 3. Consult with state agencies and other dam safety agencies 4. Develop guidelines 5. Peer review 6. Publish Project Lead and Contract: a. Who is working in this area? USBR, Australia, Washington State, Utah State University, Corps of Engineers b. Who might be able to lead the project? FEMA . ASDSO steering committee c. Who are good candidates to complete the work? Marty McCann, Stanford University David Bowles, Utah State University USBR Corps 44 Title: PRIORITY 3 -Tolerable Risk Criteria Description: Develop an approach for setting of tolerable risk criteria for the various classes of dam failure consequence a. Why is this a priority research item? Because it is essential to development of the full potential of risk assessment for dams. b. What is the expected outcome? An approach that will facilitate the setting of tolerable risk criteria that will have a good level of acceptance. Project Tasks and Needs: 1. 
Research approaches to the setting of tolerable risk levels in other industries and other countries 2. Research approaches, in other industries and countries, to gaining acceptance for criteria 3. Research legislative and regulatory intent and approaches to amendment of legislation Project Lead and Contract 1. FEMA 2. Bowles - USU 45 Title: PRIORITY 4 -Flood Loading Description: Development of estimates of probabilities for extreme flood events/ Needed for both site- specific studies and portfolio approaches. Project Tasks and Needs: • Investigate spatial distribution of precipitation/ floods for a variety of basin sizes. • Incorporate bounds • Develop meaningful uncertainty estimates incorporating model and parametric uncertainties. • Regional analysis of extreme precipitation events to relate existing state safety criteria to AEP. • Develop program of collection of climate, flood, and paleoflood data on a regional basis to support regional analyses. Project Lead and Contract: Mel Schaefer – MGS Dave Goldman – USCOE Dan Levish – USBR Jerry Stedinger – Cornell 46 Title: PRIORITY 5 -Develop new structural method of calculating probability of failure from probabilistic dynamic loading and dynamic strength values Description: a. Why is this a priority research item? Probability of failure cannot be reliably estimated b. What is the expected outcome? More reliable methods for estimates Embankment dams – Liquefaction and non-liquefaction induced deformations and seepage erosion and piping Concrete and masonry gravity dams – The probability and extent of displacement and damage including where the dam is cracked, displaced but may not lead to break. Project Tasks and Needs: Embankment dams – Case studies for details of deformation and cracking - Tying together the state of art in liquefaction, post liquefaction strength and deformations -Linking to piping Concrete and masonry gravity dams - Simplified displacement method based on a Newmark type analysis Project Lead and Contract: Embankment liquefaction – Utah State (Loren Anderson) Bureau of Reclamation Corps of Engineers Concrete and masonry - Chopra at UC Berkeley Bureau and Corps 47 Title: PRIORITY 6 -Loss of life estimates Description: a. Why is this a priority research item? • Public safety is paramount • A main criteria for decision making • No accepted practice today b. What is the expected outcome? Improve effectiveness of EAP Project Tasks and Needs: Start with Graham ’99 method. Assemble a qualified group to critically review. Evaluate and specify improvements (if required) to the method. Publish and publicize this method Project Lead and Contract: a. Who is working in this area? Wayne Graham, USBR Utah State, David Bowles & Duane McClelland BC Hydro, Al Imrie (contact person) b. Who might be able to lead the project? One of the above – group to determine c. Who are good candidates to complete the work? Wayne Graham, USBR Utah State, David Bowles & Duane McClelland BC Hydro, Al Imrie (contact person) Note that USU-USBR-BC Hydro are coordinating R&D activities in this area. USU is currently funding by Corps/USBR/ANCOLD, but additional funding is needed to complete case histories characterizations and life loss model development. 48 Title: PRIORITY 6 -Early Warning Systems (Advance Indication of Incipient Dam Failure) – EAP Description: a. Why is this a priority research item? Public safety is paramount – early warning can save lives b. What is the expected outcome? 
Earlier notification of emergency response officials who are responsible for evacuation of public. Project Tasks and Needs: Define critical parameters to monitor, technologies to improve monitoring (the assumption is that a process for conducting FMEA will already be developed) Project Lead and Contract CEA Dam Safety Interest Group (for embankment dams) - Project underway to identify anomalies in embankment dams using geophysical techniques Cross reference to static response priority # 9. 49 Title: PRIORITY 8 -Subjective Probability, Engineering Judgment and Inductive Processes Description: a. Why is this a priority research item? Risk Assessment Relies on Quantifying Subjective Judgment b. What is the expected outcome? Enhanced quality of RA results Project Tasks and Needs: Develop understanding of probability interpretations in engineering context Develop understanding of cognitive processes in engineering context Develop understanding of engineering judgment Develop understanding of inductive reasoning in engineering context Develop understanding of heuristics and biases in engineering context Develop understanding of de-biasing techniques in engineering context Project Lead and Contract a. Who is working in this area? S. Vick, C. Papay (Bechtel), various cognitive psychologists b. Who might be able to lead the project? S. Vick c. Who are good candidates to complete the work? S. Vick 50 Title: PRIORITY 13 -Develop new structural methods of calculating probability of failure from probabilistic loads and resistance values Description: a. Why is this a priority research item? Probability of failure cannot be calculated reliably b. What is the expected outcome? Methods for calculating probability of failure Embankment dams – piping, slope stability, and combined Concrete and masonry gravity dams – sliding, piping, and overtopping scour for both Embankment and Concrete. Project Tasks and Needs: Embankment dams Piping – Exclusive laboratory erosion testing Case study decomposition Estimation of erosion (all modes) Slope stability - Develop practical methods from the available methods (incorporating spatial variability and foundation geological factors) Concrete and masonry gravity dams Uncertainty in the geometry, … …, shear and tensile strengths, and uplift and 3D effects Piping - (covered in embankment) Project Lead and Contact: Embankment dams – Piping UNSW (R. Fell) Corps of Engineers (Art Waltz) Embankment dams – Slope stability Utah State (Loren Anderson) Maryland (G. Baecher) Concrete and masonry gravity dams UNSW (R. Fell, K. Douglas) Shear strength of rock and a little on concrete strength - Corps of Engineers - Chopra at U.C. Berkeley - David Goodman, HEC/Corps Others 51 7.0 Integrated Approach to Meeting Research Needs From an examination of the 14 T3 and 15 R&D prioritized needs listed in Tables 5.1 and 6.1, respectively, it can be seen that there are common topics amongst the different needs. Table 7.1 is an attempt to group the T3 and R&D needs based on topics and risk assessment application areas. Based on the grouping in Table 7.1, 12 integrated projects have been identified. Each project combines both R&D and T3 needs. These projects are listed in Table 7.2, which shows the individual T3 and R&D needs that are grouped together to form the integrated projects. They are listed in order of the highest priority T3 or R&D need grouped under each integrated project, as determined by workshop participant voting. 
Additional assumptions were made in developing the list of integrated projects is as follows: 1) The working group questioned if T-6 (Failure Modes Identification Tools for owners with limited resources) is achievable. They felt that owners should hire a qualified engineer rather than rely on tools alone. 2) The working group felt that T-10 (Regular program for operator training in Failure Modes Identification) is responsibility of individual owners. 3) The working group felt that T-3 (Training in understanding probability and skills such as expert elicitation) should be blended with other training for profession and the B.S. Civil Engineering curriculum. The integrated project approach has the advantage of addressing related aspects of a topic first with research and then with T3 activities that are linked to the research outcomes to disseminate them amongst the dam engineering community. 52 Table 7.1. Relationship between Integrated Projects (I - XII), Risk Assessment Application Areas, and Separate T3 and R&D Projects Topics Risk Assessment Application Areas Failure modes identification Index approaches Portfolio risk assessment Detailed quantitative risk assessment Research Training Research Training Research Training Research Training Guidelines I(R-11) I(T-1) I(T-4) T-6b) II(R-1) III(T-2) IX(R-8) IX(R-9) X(R-10) X(T-11) X(T-14) Case histories R-2a) I(T-1) I(T-4) II(T-7) III(R-14) III(T-5) X(T-12) Loading probabilities estimation V(R-4) XII(R-15) Response probabilities estimation VI(R-5) XI(R-13) Life loss estimation VII(R-6) Tolerable risk guidelines IV(R-3) IV( R-12) IV(T-13) IV( T-14) Risk communication VIII(R-7) Training I(T-1) I(T-4) II(T-9) III(T-9) X( T-11) Demonstration projects II(T-9) III(T-2) III(T-9) X(T-8) a) Assumed that NPDP is funded b) Questionable if achievable - need to hire a qualified engineer c) T-10 is responsibility of owner d) Blend T-3 training with other training for profession PLUS for BS Curriculum 53 Table 7.2. Integrated T3 and R&D Projects 54 8.0 References AS/NZS. 1995. Risk Management. Australian/New Zealand Standard, AS/NZS 4360, Stathfield, New South Wales, Australia, and Wellington, New Zealand. Bowles, D.S., L.R. Anderson, J.B. Evelyn, T.F. Glover and D.M. Van Dorpe. 1999. Alamo Dam Demonstration Risk Assessment. Presentation at the 1999 ASDSO 16th Annual Conference, St. Louis, Missouri. October 1999. Bowles, D.S., A.M. Parsons, L.R. Anderson and T.F. Glover. 1999. Portfolio Risk Assessment of SA Water’s Large Dams. ANCOLD (Australian Committee on Large Dams) Bulletin 112:27-39. August. Dise, K.M., and S.G. Vick. 2000. Dam Safety Risk Analysis for Navajo Dam. Proceedings of the 20th International Commission on Large Dams (ICOLD) Congress, Beijing, China. September. ICOLD. 2000. Bulletin On Risk Assessment As An Aid To Dam Safety Management: Principles, Terminology and Discussion of Current and Potential Roles. Draft Version 10. August. McDonald, L.A. and C.F. Wan. 1998. Risk Assessment for Hume Dam – Lessons from Estimating the Chance of failure. Proceedings of the 1998 Australian Committee on Large Dams (ANCOLD) Annual Meeting, Sydney, N.S.W., Australia. September. National Research Council. 1983. A Framework for Characterizing Extreme Floods for Dam Safety Risk Assessment USBR. 2000. Risk Based Profiling System. U.S. Bureau of Reclamation, technical services Center, Denver, Colorado. February. 55 Appendices 56 Appendix A. Workshop Agenda 57 Appendix B. 
List of Participants Name Affiliation Address Ake, Jon Akridge, Mike Anderson, Loren Bahleda, Mike Bechai, Mona Bowles, David Chauhan, Sanjay Cyganiewicz, John Davis, Al Doane, Jim Dupak, Dan Fell, Robin France, John Glover, Terry Hampton, Terry Harris, David Johnson, Doug Lindon, Matt Mahoney, Dan Marshall, Kevin McDonald, Len Salmon, Gary Schaefer, Mel Smart, John Smith, Grant Tarbox, Glenn Tjoumas, Gus Verigin, Steve Vick, Steve VonThun, Larry Zeizel, Gene USBR Southern Services, Alabama Power Utah State University/RAC Engineers & Economists EPRI Ontario Power Generation Utah State University/RAC Engineers & Economists Utah State University/RAC Engineers & Economists USBR Alton P. Davis Jr. Consultant Portland Water Bureau Ontario Power Generation University of New South Wales URS Greiner Woodward Clyde Utah State University/RAC Engineers & Economists Mead & Hunt USBR Washington State/ASDSO State of Utah FERC Portland General Electric L.A. McDonald coordinator, dam safety interest group MGS Engineering USBR Ontario Power Generation Harza Engineering FERC Design Engineering Consultant Consultant FEMA PO Box 25007 D-6600, Denver, CO 80225 PO Box 2641, 16N-0380, Birmingham, AL 35291 Civil & Environmental Engineering, Utah State University, Logan UT 843223412 Hillview Ave., Palo Alto, CA 94304 700 University Ave., Toronto, Ontario M5G 1X6, Canada Utah Water Research Laboratory, Utah State University, Logan UT 84322-8200 Utah Water Research Laboratory, Utah State University, Logan UT 84322-8200 PO Box 25007 D-8311, Denver, CO 80225 12 Old Mill Road, PO Box 223, W Ossipee NH 03890 1120 SW 5th Ave, Room 600, Portland, OR 97204-1926 700 University Ave., Toronto, Ontario M5G 1X6, Canada University of New South Wales, Syndye NSW Australia Stanford Place 3, Ste 1000, 4582 S Ulster St Pkwy, Denver, CO 80237 Economics Dept., Logan, UT 84322-3530 6501 Watts Rd, Ste 101, Madison, WI 53719 PO Box 25007 D-8180, Denver, CO 80225 Dept. of Ecology, PO Box 47600, Olympia, WA 98504-7600 Div. Of Water Rights, PO Box 146300, SLC UT 84114-6300 888 1st Street NE, Rm 61-05, Washington, DC 20426 121 SW Salmon St., Portland, OR 97204 6 Kiama St, Greystanes NSW 2145 AUSTRALIA 1251 Clyde Ave., West Vancouver, BC, CANADA V7T 1E6 7326 Boston Harbor Rd NE, Olympia, WA 98506 700 University Ave., Toronto, Ontario M5G 1X6, Canada 2353 130th Ave, NE, Ste. 200, Bellevue, WA 98005 888 1st Street NE, Rm 6A-11, Washington, DC 20426 CA Division of Safety of Dams, 2200 X St. Ste 200, Sacramento, CA 95818 42 Holmes Gulch Way, Bailey, CO 80421 820 S Estes St., Lakewood, CO 80226 MTTS 500 "C" St. SW, Rm. 418, Washington, DC 20472 58 Appendix C. List of Handouts Item # Speaker Description Workshop Agenda List of Attendees Section 1.0 INTRODUCTION 1a D. Bowles PowerPoint (PP) presentation - Workshop Objectives 1b D. Bowles PP Presentation - Framework for and types of Risk Assessment 2 Paper - "The Practice of Dam Safety Risk Assessment and Management: Its Roots, Its Brances, and Its Fruit" 3 Paper - "A Role for Risk Assessment in Dam Safety Management" 4 Paper - "Understanding and Managing the Risks of Agin Dams: Principles and Case Studies" 5 Report - "Dam Safety Risk Analysis Methodology" by USBR 6 Paper - "Engineering Application of Dam Safety Risk Analysis" by S. Vick 7 J. Smart Overhead - "Government Owner Information Needs" 8 D. Bowles PP Presentation - "Large Private Owners" 9 J. Doane Handout - "The Perspective of the Small Dam Owner" 10 D. Mahoney PP Presentation - "What Regulators Need" 11a S. 
Verigin Handout - "ASDSO/FEMA Specialty Workshop Risk Assessment for Dams" 11b Handout - Risk and Liability 12 Handout - Comments by Doug Johnson 13 J. France Overhead - "Information Needs for Dam Safety Evaluation and Management - Engineer's Perspective" Section 2.0 QUALITATIVE APPROACHES Sub-Section 2.1 State of the Practice 14 L. VonThun Overheads - "A Qualitative Approach - FMEA+Failure Mode and Effects Analyses+" 15 Handout - "Broad Based Approach to Dam Safety Risk Assessment" Sub-Section 2.2 Examples 16 L. VonThun Handout - "Experiences and Results from FMEA's Case A - Composite Embankment and Gravity Dam" 17 L. Anderson PP Presentation - "Framework Components" 18 D. Dupak Handout - "An Owner's Experience with FMEA" 19 D. Bowles PP Presentation - "Use of Information from Qualitative Approaches" Sub-Section 2.3 Consensus Building Section 3.0 QUANTITATIVE APPROACHES 59 Item # Speaker Description Sub-Section 3.1 State of the Practice 20 M. Schaefer Presentation -"Estimating Probabilities of Extreme Floods" 21 Paper - "A Framework for Characterization of Extreme Floods for Dam Safety Risk Assessment" by R. Swain, et al. 22 Paper -"A Probability-Neutral Approach to the Estimation of Design Snowmelt Floods" by R. Nathan and D. Bowles 23 J. Ake Presentation -"Development of Probabilistic Earthquake Loading Functions for Use in Dam Safety Evaluations" 24 R. Fell Handout - "Quantitative Risk Assessment of Dams Estimation of Probabilities of Failure" 25 S. Vick Handout - "Structural Response and Role of Subjective Probability" 26 Report - "Considerations for Estimating Structural Response Probabilities in Dam Safety Risk Analysis" 27 D. Bowles (J. Lewin) Paper - "Hydraulic Water Control Structures for Dams - How Reliable?" ICODS Technical Seminar by J. Lewin 28 T. Glover Overheads - "Damage Assessment" 29 D. Bowles PP Presentation - "Life Loss Estimation" 30 Paper - "Life-Loss Estimation: What Can We Learn from Case Histories?" 31 Paper -"A Procedure for Estimating Loss of Life Caused by Dam Failure" by W. Graham USBR 32 D. Bowles PP Presentation - "Tolerable Risk Criteria/Public Protection Guidelines" 33 Overhead - "Dam Safety Risk Based Dam Safety Criteria and Guidelines" 34 Paper - "Guidelines for Achieving Public Protection in Dam Safety Decision Making" USBR Sub-Section 3.2 Examples J. Cyganiwicz 35 J. Cyganiewicz & S. Vick Overhead - "Navajo Dam Risk Analysis 1998" 36 Paper - "Dam Safety Risk Analysis for Navajo Dam" by K. Dise and S. Vick 37 L. McDonald Handout - "Case Study - Australia, Intitial Phase of Risk Assessment" 38 D. Johnson Handout - "Application of Risk Concepts in a Standards-Based Framework for Dam Safety in the State of Washington" 39 Paper -"Alamo Dam Demonstration Risk Assessment" by D. Bowles, et al. 40 Group of 5 Papers -"Dam Safety Evaluation for a Series of Utah Power and Light Hydropower Dams, Including Risk Assessment" 41 L. McDonald Handout - "A Regulator's Perspective and Experience with Risk Assessment for Dams" 42 Handout - "Areas for Improvement, Based on Experience with Risk Assessment for Dams in Victoria, Australia" by D. Watson 43 Memorandum - "Subject: Advice - Liability -Risk Assessment" by N. Himsley 44 Paper - "ANCOLD Guidelines on Risk Assessment Position Paper on Revised Criteria for Acceptable Risk to Life" by ANCOLD Working Group on Risk Assessment Sub-Section 3.3 Consensus Building Section 4.0 PRIORITIZATION & PORTFOLIO APPROACHES Sub-Section 4.1 State of the Practice and Examples 45 J. 
Cyganieqicz Bound report insert -"Risk Based Profiling System" USBR 46 D. Johnson Handout - "Commentary on Algorithm for Prioritization Ranking of Dams with Safety Deficiencies" 47 J. Doane Overhead -"Portland Oregon's Experience with Risk Assessment" 48 J. Doane Handout - "Portland Oregon's Experience with Risk Assessment" 49 D. Bowles PP Presentation - "Portfolio Approaches: Principles and Case Study" 50 Paper - "Portfolio Risk Assessment: A Tool for Dam Safety Risk Management" 51 Paper - "Portfolio Risk Assessment: A Basis for Prioritizing and Coordinating Dam Safety Activities" Sub-Section 4.2 Consensus Building Section 5.0 CONSOLIDATION OF OUTCOME Sub-Section 5.1 ASDSO/FEMA Report 52 Revised Proposed Outline -USCOLD White Paper Sub-Section 5.2 USCOLD White Paper Section 6.0 BIBLIOGRAPHY 53 Draft Bibliography: Risk Assessment for Dams 60 Appendix D. Participants Expectations and Issues D.1 Expectations Blending FMEA into standards based dam safety program Recognize and acknowledge different needs for different strengths. A prioritized list of risk assessment, research needs, and who will conduct the studies. Help owners (large majority) and engineers get value from risk assessment. Hear state of practice view of the non-believers and help inform. Sniff the other dogs. Identify benefits of using risk-methodology in state programs. Identified data sources. Took risk from gut to head. Attached risk component to federal funds to states. Understanding regulator’s perspective. Move towards understanding of state and practices. Did not write guidelines. State of practice, strengths/weaknesses, where can apply how, research needs, how to strengthen, how to facilitate others using it. Ideas to improve my dam safety program. Identified sources of fear Identify areas where risk research would benefit states. Help other uncomfortable with risk concepts betters understand them. Viewpoints of regulators and owners. Understood how to “sell” the concept back home. Identified areas of collaboration. Does practiced mean right! Brought to light issues affecting RA. Catch 22, you don’t know, I won’t vie you the money to find out. Identified research needs to better explain options to the public. Improve knowledge on FMEA. To learn, to gain acceptance of RA. Developed necessary perspectives. Began to discuss role of subjective probabilities in quantitative RA. Compare what we are doing to what others are doing, looking for different ideas. How to develop a standardized RA method so the general profession can adopt and use it. We found out how RA will develop. Update on state-of-the practice. Consensus on priority research needs. D.2 Issues Major benefit from getting a team approach? Still requires a standard process. It is reasonable to rely on warning and evacuation as a risk reduction measure. Risk seems to add to shortcomings of standards approach as opposed to avoid. Parameter uncertainty analysis. Change paradigm for quantitative risk analysis. Who could go to jail if the dam fails? About 1 in 100 dams fails. 61 Regulators need to educate government that “safe” means a low probability of failure, not "no chance" of failure. The FEMA requirements are impossible if followed rigidly. Phase I . Phase II FMEA . RAM . RAS Fix Remove Dam? Technical advocate as consulting engineer is a valid concern. Example of a lot of calculation at Keenley Side Dam. Parallel risk assessments of the same dam. EAP vs. fixing dam. How can RA benefit owners of one or a few dams? Now dams vs. existing dams. 
Missing failure modes. Also a problem with defensibility of standards. Incentive/need to undertake risk assessment if dams meet standards/regulations. Repeatability (even for qualitative methods). How do we change standards without addressing risk? Legislature, not the regulator should determine risk level accepted. Dams are only one piece of society‘s risk pool. Strive for consistent risk. All civil engineering is empirical. It is risk-based! FS = 1.5 means low risk, not zero risk. Standards: What is a reasonable FS? Is the MCE adequately conservative? Dams with no possibility of life loss. Large dams that must meet PMF/MCE. The public is extremely risk adverse about dams. How can you get acceptance of risk levels given that? Is standard approach a zero risk approach? Acceptance of loss of life. Engineers and lawyers = inferior dam safety decisions. Is legislative intent to get to zero risk to life? Profiling, Portfolios, ?? (classifications) all require quantification. Prioritization means some things are not done. Standards are not restrictive thinking. Failure modes should always be considered. Accepted levels of risk are an organization-by-organization case-by-case aspect. The risk is the owner’s, what is the means for non-owner beneficiaries to share the risk. Owner be able to defend what you are doing as being reasonable and prudent. Case history compilations need to be parameter specific. Ultimately public must buy into risk. Right now if an individual is financially involved, risk is considered. If the owner is the financial source, the public wants zero risk. State legislation says ‘remove the risk’, implies that there could be zero risk. Not possible. Subjective probabilities bad for quantitative RA but OK for standards? D.3 How others can use it? (Technology Transfer and Training Needs?) Build FMEA into standards based reviews—economy of resources. Someone (FEMA/ASDSO/ICODS)? Should develop a “methodology” that tries to standardize the process. Do dams in groups with same experts. 62 Need ways to get limited expertise applied more broadly. Regular program for operator training. Focus on integration with existing efforts. Review case histories. Failure mode thinking. Documented case histories. Training seminars. Hands-on workshops. Systematic approach—list elements and ask how can find. RAC could share some of their failure mode spreadsheets with the rest of us. Documented reports of use. Focus on integration with existing efforts. Tools for owners with limited resources. Get smaller group of experienced FMEA experts to write down the logic/process of how to do FMEA. Avoid monopoly. Develop more people as qualified facilitators. 63 Appendix E. Participant Input on Information Needs for Dam Safety Evaluation and Management E.1 Summary of Information Needs 1. Establish Evaluation Process a. Protection of life and property b. Develop no risk class c. Develop standards d. Establish guidelines e. Public safety f. Acceptance by public g. Accepted levels of risk 2. Risk Identification a. Use risk to identify problems b. Procedure for quickly and easily classify c. Team approach generates a good evaluation d. FMEA 3. Hazard Classification and Consequences a. Hazard classification b. Define hazard ratings 4. Confidence Level a. Know uncertainties b. Degree of uncertainty c. Credibility verification/confidence building d. Standard/regulations sufficient e. Public trust and reputation 5. General Risk Management Considerations a. Risk management options b. 
Risks that should be reduced in the short-term c. Risks that should be reduced d. Cost effectiveness e. Tight budget f. Risks associated with all dams around g. Risk is removed h. Prioritization 6. Business/Legal/Political Considerations a. Effect of delays 64 b. Legal and political constraints c. Endangered species d. Business viability e. Not lives if business f. Regulatory considerations g. Business risk h. Legal liability I. Societal risk j. Retention of insurance coverage 7. Risk Analysis a. Common understanding of definitions b. Procedures and practices c. Concept and calculation of loss of life d. Probabilities of extreme events are accurate e. Basics of risk management f. Establish process g. Basic knowledge of risk analysis h. Improved tools 65 E.2 Notes on Information Needs Information needs for dam safety evaluation and management What: (Name of a need) 1. Establish Evaluation Process Risk Acceptance Criteria Who: (Needs this) Decision makers (owners), regulators, and public (to know there is a process). Why/When: (Do they need it) To set the framework for the rest of the process Beginning—a set of expectations. Where will it be used: (In-house, public meetings) In making the decisions on the dam. How will it be used: Risk will be compared with the expectations. 66 Information needs for dam safety evaluation and management What: (Name of a need) 2. Procedures for accomplishing risk identification Who: (Needs this) 75,000 25,000 The majority of dam owners and engineers who do their evaluation (if any) and regulators. Why/When: (Do they need it) For inspection/evaluation/monitoring for public safety Money being spent in right places Yesterday/ASAP Where will it be used: (In-house, public meetings) By regulators By owners/engineers How will it be used: To identify dam safety actions -monitoring -investigating -inspections -analyzing/evaluation -modifications/improvements -prioritization -getting funding or assistance 67 Information needs for dam safety evaluation and management What: (Name of a need) 3. Hazard classification and consequences A list of considerations: Traditional Issues: Height Volume People Property Modern Issues: Social effects Environment Political Legal Who: (Needs this) Owners Engineers Regulators Government (decision makers, politicians) Public Why/When: (Do they need it) They need it today. (When) Need continuous updating. They need it to understand the hazard that the dam is posing. (Why) Where will it be used: (In-house, public meetings) It will be used wherever it is necessary to inform recipient of dam hazard, both individually and relatively (portfolio). How will it be used: 4) Set priorities 5) Maintain awareness 68 Information needs for dam safety evaluation and management What: (Name of a need) 4. Confidence Level Who: (Needs this) Regulators, legislators, public, owners, engineers (stakeholders). Why/When: (Do they need it) Decision time. Where will it be used: (In-house, public meetings) Need to understand the variability from an absolute answer in the decision process (credibility). How will it be used: To make informal and accepted decisions (uncertainty analysis). 69 Information needs for dam safety evaluation and management What: (Name of a need) 5. Decision-making for Risk Management/Risk Reduction Who: (Needs this) Owners, regulators, decision-makers, technical advisers to decisions. Why/When: (Do they need it) Sequence, timing, and extent of risk reduction actions and justification of proposed plan. 
Where will it be used: (In-house, public meetings) In-house, public meetings. How will it be used: Use risk-based information to make decisions. 70 Information needs for dam safety evaluation and management What: (Name of a need) 6. Business criteria/legal framework & Political realities-risk perception Who: (Needs this) Owner Public-lawmakers Planners-developers Engineer knowing the business parameters Insurance industry Private persons - liability issues - environmental Why/When: (Do they need it) Why - regulatory issues; protection of private and public assets When - design-planning phase; operation phase; decommissioning phase Where will it be used: (In-house, public meetings) Same as Why/When Public policy bodies Business policy bodies How will it be used: Risk management decisions at each phase of the life cycle 71 Information needs for dam safety evaluation and management What: (Name of a need) 7. Understanding the meaning of probabilities in general Who: (Needs this) All interpreting probabilities Why/When: (Do they need it) Before starting a RA Where will it be used: (In-house, public meetings) Yes (in-house, public meetings). How will it be used: To understand the meaning of a probability estimate 72 Information needs for dam safety evaluation and management What: (Name of a need) 7. Reliable and acceptable methods for determining probability and extent of failure Who: (Needs this) 1. Engineers 2. Regulators 3. Others Why/When: (Do they need it) Why - To get reliable, consistent, and defensible answers (legally defensible) Where will it be used: (In-house, public meetings) In the process of carrying out R/A and in presenting it to others How will it be used: Evaluating safety of dams 73 Appendix F. Participant Input on Failure Modes Identification (Qualitative Approaches) F.1 Strengths Identification of failure mechanisms otherwise been over looked. Identify alternative failure modes. Increase understanding of dam Bring in Electrical, Mechanical, Environmental views. Develop transitions between specialists and engineering consultants Help initial prioritization of issues. Identify uncertainties. Start public involvement. Identify new data needs. Identifies failure modes. Team approach provides variety of viewpoints. Failure mode identification. Can use to evaluate and synthesize various aspects of dam safety program. Can use as QA tool to evaluate remedial design. Strengthens the diligence. Can piggyback on periodic design review. Prioritize Risk. Apples to apples. Less data requirements. Quicker to complete. Considers factors that are difficult to quantify. Can get by off by staff. Can get buy in of regulator/Dam safety decision makers efficient. Provides a supplement to standards based. Simple. Helps with surveillance. Identifies all failure modes. Gives crude identification of critical failure modes. Raises awareness of issues with management. Failure mode identification is 1/2 value of RA vs. deterministic thinking. Identifies simple, cost-effective risk reduction measures. Quick. Broad. Some useful information provided. Helps identify all failure modes. Identifies unusual failure modes—the oddball failure mode. More people buy into the process. Helps you think more broadly. Simple. Identification of risk otherwise not noted. Better than no RA at all. Easily done. Wide acceptability. Valuable information. Organized focus on failure modes. FMEA more likely to identify potential failure modes. 74 Identified dam’s weak link(s). A lot of information with little effort. 
Comprehensive. Systematic. Brings balance to Dam Safety programs. Brings insight and understanding. Broad-based more likely to have acceptance in standards-based community. Improved understanding of strengths and vulnerabilities of dam. Identifies safety issues beyond standards based. Teach approach. The concepts better understood. Involves more individuals. Relatively simple. Identifies failure modes quickly. Process encourages discovery of all failure modes. Makes use of available materials (studies). Helps to prioritize fixes. It is a start. Provides something to react to. F.2 Limitations Repeatability. Reliability. Biases. Not much relative ranking provided. Not much existing direction on ”how to” available. May be difficult to dams with little background information. Resource limitation of organization (staffing). Still lacks quantification in making a choice of what’s most important. Does not quantify risk. Ultimately requires decisions on basis of old standards. Lack of quantification. Affected by experience of the team. Magnitude of risks from various sources hard to compare. Indicator only. Not quantifier. More difficult to portray confidence level. Personalities within the team. Defensibility. Repeatability. Based on opinion. No “standard” of good practice. Not fool proof. Difficult to compare importance (risk) from each failure mode. Difficult to compare dams. Not acceptable criteria. Procedure may not be consistent from team to team. Does not provide a measure of risk. Does not reveal relative risks as required by dam safety decisions. Lack of universally approved methodology. Limited use to small dam owners (i.e., cannot afford the process). 75 Are the regulations met? Uses the word “failure.” Parameter uncertainty not included. Dollar cost of process may limit application. No quantification. Lack of accepted standards. Not a public oriented process. Subjective—may not be repeatable. Limited by efforts allocated and composition of team. Lack of prioritization. Reliant on “judgment” to exclude. Team affected by “Group Think.” Limited data extrapolations (i.e., failure modes, static, gates, filter, drains, structures ..). F.3 What are research and development needs? Identify skills required to identify failure modes. Build database of case histories. Include curriculum in schools in failure modes. Analyze data from NPDP on failures and repair. Communicate best practices to others. F.4 How can Qualitative Approaches be Improved? List all elements of dam system (includes foundations, slopes, abutments, etc.) How can each element fail to function as intended? What is effect? Exclude likelihood of outset—list all conceivable modes. Address likelihood as a second step. Include 2-3 experienced failure mode thinkers. Reduce bias by assuming failure, than looking for possible reasons. Need to involve the operators. Develop generic list of failure modes. Collect/summarize failure/accident data for main failure modes and disseminate the data (as much as which did not fail despite starting to) Think like ECK. Big picture vs. small view. Persons must see the whole picture to predict most likely failure mode. Digital view vs. analog view. Focus on benefits not just difficulties. Process needs to be molded into dam inspections. Provide process that is scalable to range of available resources. Include details of effects of methodologies and technical knowledge with their effects on the process. Learn by doing. Develop skills through case history studies. 
Look at dams with failure scenarios developing conditions in mind. Review of only failure of accident (i.e., NASA), can lead to insight in how they happen so they can be prevented. Imagine failure in hypothetical hindsight. Examine dams with failure modes in mind. Focus on asking the failure mode questions. 76 Appendix G. Participant Input on Portfolio and Index Approaches (Prioritization and Portfolio Approaches) G.1 Strengths Able to identify relative needs for repairs. Based on existing data. Dam safety sooner. Do most in shortest time with least resources. Identify priority for risk reduction measures. Provide some level of justification for proceeding with/deferring fixes. Helpful to owners with a new dam safety program. Non-judgmental between dams. Help with obtaining funding. Builds consensus on priorities. Common currency across owners, dams elements, failure-modes. Rational basis for priorities. Paints picture liabilities. Allows comparison. Input to decision-making. Provides better picture of the dam system. Site to site comparisons possible. Provides insight into sensible strategies. Provides true measure of risk. Coordination of engineering issues with business needs, objectives and priorities. Integration of all aspects of dam safety program. Flexible—can be adjusted to desired level of detail. Logical, defensible prioritization of risk. Provides basis for better use of limited funds. Provides means to gain management support. Creates mechanism to improve loss of life criteria economically. Generally defendable for action—no action. More bang for the buck. Gives owner “high level” understanding of risks. Allows rapid and consistent evaluation of portfolio, also cost effective. Quick. Forces judgment. All components of risk can be quantified. Allows priorities without dealing in absolutes. Has room for unknown or unresolved issues. It’s systematic. It’s explainable. Can probably repeat results. Identifies entire scope of dam safety needs. Identifies urgent (quick fix) needs. Allows the maximization of risk reduction for each Dollar. Organized approach to develop relative ranked order of projects with deficiencies. If dam low on priority list fails, provides some defense to regulator. Allows regulator to apply limited resources to project posing most risk. Allows identification of deficiencies (through FMEA), and risk calculation. Prioritizes these in loss of life and financial terms. 77 Economic if done in groups of dams. Gives overall risk profile. Allows prioritization of investigations and monitoring. Compares: 1) dams and performance; 2) criteria; and 3) consequences. Leads to cost-effective further investigations. Can be done based on existing data. Provides prioritization and justification for fixes and investigations. Provides basis for risk reduction program/meets due diligence. Preserves probability metric. Initial identification of dam safety issues. Puts dam safety issues into a form that owner’s decision matters can relate to, especially if they are non technical people. Identifies highest priority projects. G.2 Limitations Are the numbers believable? Evaluation is more broad-brush. Isn’t absolute. Doesn’t say how fast. Based on existing data. Using the results beyond intentions. Variation among different systems. Too great a variability in risk numbers. Defensibility sometimes questionable. Is it practical other than for owners of large numbers of dams? Limited number of experienced and qualified facilitators. May provide a false sense of security. 
“Broad brush” may not reveal all-important vulnerabilities. Less useful for small owners with few dams. Can provide excuse not to proceed with detailed assessments. Priorities may change. Identifies deficiency. Does not force fix. Negligence? Can mislead. Can be misused. Beyond defensibility. Do we have to spell prioritization with an “S”?’ No published standards for performing. Incorrect existing data could lead to incorrect conclusions. Difficult to communicate limitations. May be too crude. High probability failure modes may not receive proper consideration. Limited by easily available data and analysis—probability not constant across inventory. May be superficial. How to deal with dams with a lot of information vs. those with little or no information. Costs. Index approaches not true utilization of risk assessment and FMEA. Only gives owner “high level” understanding of risks. Prioritization means some things are not done. Owners and engineers start to believe the risks absolutely and want to sign off without detailed RA and detailed engineering. 78 Not clear in regard to uncertainty. It is seldom quantified. Some techniques for estimation of, e.g., consequences, are limited accuracy. May mis-prioritize. No accepted approach for consistent application. Index methods may not preserve probability metric and therefore may distort priorities. Does not maximize rate of cost-effectiveness. No sign off. Risk criteria evaluations may be assumed to be final. Haste may miss important failure mode. May be too costly for small owners. Must keep uncertainties in inputs on the screen for decision makers. G.3 How Can It Be Improved? Leave it alone and don’t mess it up. More defendable relationships between ranking variables. Develop process standards for some level of consistency. Develop procedure or guideline by having a general documentation of PRA methods. Need tier system so we can meet owner resource availability. Prepare consensus statement on uses and limitations. Use high-level review panel for key inputs to portfolio RA. Develop and make available portfolio software. G.4 How Can Others Use it (Technology Transfer and Training Needs)? Seems that transfer must be done one to one coaching. Develop guidelines for what constitutes a portfolio assessment and how it may be done. Sponsor seminars aimed at educating non-technical staff among owners. More experience by more people. By sharing experience on PRA with others on how well the process worked and what should be changed. Demonstration projects. Train more facilitators. Publish complete portfolio risk assessment case study(ies) as a general study(ies) include strength and weaknesses. ASDSO could compile risk indexing and prioritization approach and provide summary to states. G.5 What are R&D Needs? Debate underlying concept . consensus concept. Debate mechanics . consensus on mechanics. Most state dam safety programs have no program for profiling and prioritization. Consider developing index system that state dam safety programs could use for profiling dams that they regulate. Improve confidence of loss of life estimates. Develop guidelines for prioritization and portfolio approach. “Learn how to improve” PRA by evaluating changes resulting from updating. Develop simple easy to use approach that will gain general acceptance. More input from users on information needs. Check USBR index system against portfolio method and try to assess how effective it is good for state officials. 79 1. 
Are rating points system worth doing without FMEA procedure, there is a chance of missing the critical issue? 2. Can portfolio assessment be used for prioritization of known deficiencies (e.g., as opposed to USBR prioritization)? 3. Can a prioritization index system be consistent with a risk analysis approach? 80 Appendix H. Participant Input on Quantitative Approaches H.1 Strengths Regulator imposed requirements are more fair for various dam owners. Common basis for comparing risks between various hazards and between dams. Supports need for remedial measures identified by traditional approach. Identifies and quantifies deficiencies that were previously unrecognized. Much greater insight into the mechanics of failure. Provides a very useful tool for dam safety upgrade decision-making. Methods for estimation of probabilities of failure are mostly based on traditional eng. inferring methods of analysis. Makes process of engineering judgment more transparent. Gives owner, regulator a better idea of what risk a dam poses. Removes some ambiguity. Answers question of how bad/how good. Allows comparison with acceptance criteria, and more accurate assessments of what drives the risk. Allows explicit representation of uncertainties. A more balanced assessment of risks from “normal” conditions and extreme events. Assessment of relative risks of different failure modes. Systematic consideration of dam safety – all aspects. More “bang for the buck” in selecting preferred rehabilitation alternatives. Group thinking and group input. Provides insights into most critical factors affecting early failure mode and therefore most effective ways to reduce probability of failure. More in depth analyses typically performed. More defensible. More illuminating. Better treatment of uncertainty. Allows best “dissection” of failure mode. Compare between failure modes is good. Can (should) include explicit consideration of uncertainty. Careful consideration of steps leading to failure. Helps owner understand his/her exposure. If well done, focuses on owner’s information needs, not just engineering issues. Creates a measurable approach for comparison. Detailed discussions of factors affecting events leading to failure. Good tool for managing risk across a large portfolio of dams. Provides insights into relative risk (probability and consequences) of failure. Group judgments can outperform individuals (some times). Reveals relative importance of particular features, conditions, and actions. Quantifies relative importance of failure mechanisms. Identifies where further info/investigations/analyses most useful and beneficial. Shows decision makers why things are important. Allows state of knowledge/ignorance to be expressed. Puts complex engineering issues into a form (common risk currency) that often convinces lay decision makers to a more than traditional engineering only approach. Allows for failure mode decomposition. 81 H.2 Limitations Hydrologic — Not much available on parameter variation (uncertainty) determination analyses. Structural — (Concrete and earth). Not much available on parameter variation (uncertainty) determination analyses. Uncertainties used appear to be subjective and not objective. To date limited input from outside dam safety comm. (i.e., little general public input). Results can be heavily affected by knowledge and experience of team. Methods for estimating probability of failure by piping, earthquake on concrete dams, and stability of embankments dam need development. 
We need to develop methods for conveying uncertainty in answer and in “acceptance” criteria (to avoid the point and line approach). Who dictates acceptable risk? Engineer Public Politicians Courts. Does not resolve the “acceptable” risk quandary. Methodologies require much more development. Costly at present. Lack of acceptance for life safety criteria. Difficulties in communicating risks to owners, others. Many pitfalls in performing the risk calculations, making probability estimates, and post processing. Very difficult to make probability assessments for events with very limited historical case histories. May be prohibitively expensive and time consuming if not done under ‘expert’ supervision. Criteria may put too much emphasis on EAP for loss of life reduction. Procedure is not standardized. Results between evaluators are not generally consistent. Believing numbers/results without understanding the uncertainties. Uncertainties in resulting numbers. Possible misuse of resulting numbers. Lack of people experienced and qualified to estimate probabilities. Possible bias of existing dam risk assessment practitioners. Process can be dominated by a few individuals. Probability of failure estimates not fully defensible. Experienced engineer needed—they are dwindling. Can be high cost. Needs to be toned down to recognizable terms for acceptance to general dam safety community. Probabilities of extreme events/loading not readily available for much of U.S. Too complex and time consuming for most state regulated dams. Insufficient data to estimate probabilities with confidence. Cost. Difficult for dams that present no symptoms. Yet to account for all human reasoning and judgment processes. Lack of risk tolerance limits established for broad applications. Can imply more knowledge than there is, if improperly presented or quoted. Requires experienced, broadly trained professionals (rare), with previous exposure to all facts of dam engineering. Danger of believing the numbers. Subjective results are made to appear objective. Focus on engineering wants rather than owner needs. 82 Lack of benchmarks. How to compare RA site A to B to C. Terminology. No widely accepted loss of life criteria are available. Methods for estimating loss of life totally inadequate—much worse than those for estimating probability of failure. Tolerable risk criteria difficult or impossible to establish. H.3 Technology Transfer and Training Needs Limited probability training for engineers Demonstration projects Need to document detailed QRA method state-of-practice and run training workshops Need bulletin of R/A for dams that assembles all case histories et. al. Produce a life safety discussion paper, exhibit publicly and invite submissions Dam safety community should interact with DOE, NRC on QRA Training in basic skills such as understanding probability & expert elicitation Can you generalize information or "Education" from stochastic H.4 Research and Development Needs CARDS SUBMITTED IN THIS CATEGORY WERE COMBINED INTO OVERALL R&D NEEDS (SEE APPENDIX J) BEFORE THEY COULD BE RECORDED SEPARATELY H.5 How Can it Be Improved? Maintain separate pairs of probability consequences where the probability speaks directly to the consequence. Just do it. Examples developed noting uncertainty inclusion. OTHER CARDS SUBMITTED IN THIS CATEGORY WERE COMBINED INTO OVERALL R&D NEEDS (SEE APPENDIX J) BEFORE THEY COULD BE RECORDED SEPARATELY 83 Appendix I. 
Sorted Participant Input on Strengths and Limitations of the State of the Practice I.1 Failure Modes Identification I.1.1 Strengths 1 Failure modes paradigm Identification of failure mechanisms otherwise been over looked. Identify alternative failure modes. Identifies failure modes. Failure mode identification. Identifies all failure modes. Gives crude identification of critical failure modes. Helps identify all failure modes. Identifies unusual failure modes-the oddball failure mode. Identification of risk otherwise not noted. Organized focus on failure modes. FMEA more likely to identify potential failure modes. Identified dam's weak link(s). Improved understanding of strengths and vulnerabilities of dam. Identifies safety issues beyond standards based. Process encourages discovery of all failure modes 2 Relatively low effort Can piggyback on periodic design review. Less data requirements. Quicker to complete. Considers factors that are difficult to quantify. Simple. Failure mode identification is 1/2 value of RA vs. deterministic thinking. Quick. Simple. Better than no RA at all. Easily done. A lot of information with little effort. Relatively simple. Identifies failure modes quickly. It is a start. 3 Broad interdisciplinary team approach Bring in Electrical, Mechanical, Environmental views. Develop transitions between specialists engineering consultant and ?? Team approach provides variety of viewpoints. Broad. Helps you think more broadly. Comprehensive. Involves more individuals. Makes use of available materials (studies). 84 4 Enhances understanding Increase understanding of dam Some useful information provided. Valuable information. Brings insight and understanding. The concepts better understood. Provides something to react to. 5 Wide acceptability Can get buy in of staff. Can get buy in of regulator/Dam safety decision makers efficient. More people buy into the process. Wide acceptability. Broad-based more likely to have acceptance in standards-based community. 6 Strengthens traditional approach/Quality Assurance Can use as QA tool to evaluate remedial design. Strengthens the diligence. Provides a supplement to standards based. Brings balance to Dam Safety programs. 7 Identifying additional information needs Identify uncertainties. Identify new data needs. Helps with surveillance. 8 Aids in prioritization of issues Help initial prioritization of issues. Prioritize Risk. Helps to prioritize fixes. 9 Aids in communicating risks Start public involvement. Raises awareness of issues with management. 10 Tool for achieving integration of dam safety program Can use to evaluate and synthesize various aspects of dam safety program. 11 Aids in identification of risk reduction measures Identifies simple, cost-effective risk reduction measures. 85 12 Systematic approach Systematic I.1.2 Limitations 1 Qualitative - risk, ranking, compare with other dams, confidence/uncertainty Not much relative ranking provided. Still lacks quantification in making a choice of what's most important. Does not quantify risk. Ultimately requires decisions on basis of old standards. Lack of quantification. Magnitude of risks from various sources hard to compare. Indicator only. Not quantifier. More difficult to portray confidence level. Difficult to compare importance (risk) from each failure mode. Difficult to compare dams. Does not provide a measure of risk. Does not reveal relative risks as required by dam safety decisions. Are the regulations met? Parameter uncertainty not included. No quantification. 
Limited by efforts allocated and composition of team. Lack of prioritization. 2 Repeatability, consistency, influence of team members Repeatability. Reliability. Biases. Affected by experience of the team. Personalities within the team. Defensibility. Repeatability. Based on opinion. Not fool proof. Procedure may not be consistent from team to team. Subjective-may not be repeatable. Limited by efforts allocated and composition of team. Reliant on "judgment" to exclude. Team affected by "Group Think." 3 Lack of available guidance Not much existing direction on "how to" available. No "standard" of good practice. Not acceptable criteria. Lack of universally approved methodology. Lack of accepted standards. 86 4 Cost Resource limitation of organization (staffing). Limited use to small dam owners (i.e., cannot afford the process). Dollar cost of process may limit application. 5 Limited case histories to use as basis for FM identification Limited data extrapolations (i.e., failure modes, static, gates, filter, drains, structures ..). 6 Not a public-oriented process Not a public oriented process. 7 Requires information on dam May be difficult to dams with little background information. I.2 Index Prioritization I.2.1 Strengths 1 Prioritization Able to identify relative needs for repairs. Non-judgmental between dams. Allows comparison. Site to site comparisons possible. Allows priorities without dealing in absolutes. Identifies urgent (quick fix) needs. Organized approach to develop relative ranked order of projects with deficiencies. Allows prioritization of investigations and monitoring. Identifies highest priority projects. 2 Efficient process Allows regulator to apply limited resources to project posing most risk. Flexible-can be adjusted to desired level of detail. Economic if done in groups of dams. Based on existing data. Allows rapid and consistent evaluation of portfolio, also cost effective. Quick. Can be done based on existing data. 3 Defensibility Provide some level of justification for proceeding with/deferring fixes. Generally defendable for action - deferred/screening - no action. If dam low on priority list fails, provides some defense to regulator. Provides basis for risk reduction program/meets due diligence. 87 4 Justification Help with obtaining funding. Builds consensus on priorities. Rational basis for priorities. Provides means to gain management support. 5 Communication Input to decision-making. Gives owner "high level" understanding of risks. It's explainable. Puts dam safety issues into a form that owner's decision matters can relate to, especially if they are non-technical people. 6 Systematic process Forces judgment. It's systematic. Can probably repeat results. 7 Identification of dam safety issues Allows identification of deficiencies (through FMEA), and risk calculation. Initial identification of dam safety issues. 8 Integrates dam safety program and into overall business Helpful to owners with a new dam safety program. I.2.2 Limitations 1 Danger of misusing results Doesn't say how fast. Using the results beyond intentions. May provide a false sense of security. Can provide excuse not to proceed with detailed assessments. Priorities may change. Identifies deficiency. Does not force fix. Negligence? Can mislead. Can be misused. Difficult to communicate limitations. How to deal with dams with a lot of information vs. those with little or no information. Prioritization means some things are not done. Must keep uncertainties in inputs on the screen for decision makers. 
High probability failure modes may not receive proper consideration. 2 Not in-depth risk analysis 88 Are the numbers believable? Based on existing data. Incorrect existing data could lead to incorrect conclusions. May be too crude. Not clear in regard to uncertainty. May mis-prioritize. Haste may miss important failure mode. Evaluation is more broad-brush. Broad brush may not reveal all-important vulnerabilities. Limited by easily available data and analysis-probability not constant across inventory. May be superficial. 3 Lack of published guidance Variation among different systems. No published standards for performing. No accepted approach for consistent application. 4 Relative rather than absolute Isn't absolute. It is seldom quantified. Does not maximize rate of cost-effectiveness. 5 Defensibility Defensibility sometimes questionable. Beyond defensibility. 6 Risk metric Index approaches not true utilization of risk assessment and FMEA. Index methods may not preserve probability metric and therefore may distort priorities. 7 No sign off No sign off. I.3 Portfolio Risk Assessment I.3.1 Strengths 1 Prioritization Able to identify relative needs for repairs. Non-judgmental between dams. Allows comparison. Site to site comparisons possible. Allows priorities without dealing in absolutes. Identifies urgent (quick fix) needs. 89 Organized approach to develop relative ranked order of projects with deficiencies. Allows prioritization of investigations and monitoring. Identifies highest priority projects. Identify priority for risk reduction measures. Prioritizes these in loss of life and financial terms. Provides prioritization and justification for fixes and investigations. 2 Cost effectiveness risk reduction program Dam safety sooner. Do most in shortest time with least resources. Provides basis for better use of limited funds. Creates mechanism to improve/reduce loss of life criteria consequences economically. More bang for the buck. Allows the maximization of risk reduction for each Dollar. Leads to cost-effective further investigations. 3 Justification Help with obtaining funding. Builds consensus on priorities. Rational basis for priorities. Provides means to gain management support. Provides insight into sensible strategies. Provides prioritization and justification for fixes and investigations. Provides basis for risk reduction program/meets due diligence. 4 Communication Input to decision-making. Gives owner "high level" understanding of risks. It's explainable. Puts dam safety issues into a form that owner's decision matters can relate to, especially if they are non-technical people. Paints picture liabilities. Provides better picture of the dam system. Gives overall risk profile. 5 Defensibility Provide some level of justification for proceeding with/deferring fixes. Generally defendable for action - deferred/screening - no action. If dam low on priority list fails, provides some defense to regulator. Provides basis for risk reduction program/meets due diligence. Logical, defensible prioritization of risk. 6 Risk metric Common currency across owners, dams elements (e.g. penstocks vs canals etc.), failure-modes. Provides true measure of risk. All components of risk can be quantified. 90 Compares: 1) dams and performance; 2) criteria; and 3) consequences. Preserves probability metric. 7 Efficient process Allows regulator to apply limited resources to project posing most risk. Flexible - can be adjusted to desired level of detail. Economic if done in groups of dams. 
8 Identification of dam safety issues Allows identification of deficiencies (through FMEA), and risk calculation. Initial identification of dam safety issues. 9 Integrates dam safety program and into overall business Coordination of engineering issues with business needs, objectives and priorities. Integration of all aspects of dam safety program. Identifies entire scope of dam safety needs. 10 Systematic process Forces judgment. It's systematic. Has room for unknown or unresolved issues. I.3.2 Limitations 1 Danger of misusing results Doesn't say how fast. Using the results beyond intentions. May provide a false sense of security. Can provide excuse not to proceed with detailed assessments. Priorities may change. Identifies deficiency. Does not force fix. Negligence? Can mislead. Can be misused. Difficult to communicate limitations. How to deal with dams with a lot of information vs. those with little or no information. Prioritization means some things are not done. Must keep uncertainties in inputs on the screen for decision makers. Only gives owner "high level" understanding of risks. Owners and engineers start to believe the risks absolutely and want to sign off without detailed RA and detailed engineering. Risk criteria evaluations may be assumed to be final. 2 Not in-depth risk analysis Are the numbers believable? 91 Based on existing data. Incorrect existing data could lead to incorrect conclusions. May be too crude. Not clear in regard to uncertainty. May mis-prioritize. Haste may miss important failure mode. Too great a variability in risk numbers. Some techniques for estimation of, e.g., consequences, are limited accuracy. 3 Cost Is it practical other than for owners of large numbers of dams? Less useful for small owners with few dams. Costs. May be too costly for small owners. 4 Lack of published guidance Variation among different systems. No published standards for performing. No accepted approach for consistent application. Limited number of experienced and qualified facilitators. 5 Defensibility Defensibility sometimes questionable. 6 No sign off No sign off. 7 Relative rather than absolute Isn't absolute. I.4 Detailed Quantitative Risk Assessment I.4.1 Strengths 1 Valuable as a decision tool Regulator imposed requirements are more fair for various dam owners. Supports need for remedial measures identified by traditional approach. Provides a very useful tool for dam safety upgrade decision-making. More "bang for the buck" in selecting preferred rehabilitation alternatives. Provides insights into most critical factors affecting early failure mode and therefore most effective ways to reduce probability of failure. Helps owner understand his/her exposure. If well done, focuses on owner's information needs, not just engineering issues. Good tool for managing risk across a large portfolio of dams. 92 Identifies where further info/investigations/analyses most useful and beneficial. Shows decision makers why things are important. Puts complex engineering issues into a form (common risk currency) that often convinces lay decision makers to a more than traditional engineering only approach. 2 Quantification using risk metric Common basis for comparing risks between various hazards and between dams. Identifies and quantifies deficiencies that were previously unrecognized. A more balanced assessment of risks from "normal" conditions and extreme events. Assessment of relative risks of different failure modes. Systematic consideration of dam safety - all aspects. 
Compare between failure modes is good. Creates a measurable approach for comparison. Provides insights into relative risk (probability and consequences) of failure. Reveals relative importance of particular features, conditions, and actions. Quantifies relative importance of failure mechanisms. Allows for failure mode decomposition. 3 Understanding of failure modes Much greater insight into the mechanics of failure. Gives owner, regulator a better idea of what risk a dam poses. Removes some ambiguity. Answers question of how bad/how good. Allows comparison with acceptance criteria, and more accurate assessments of what drives the risk. Provides insights into most critical factors affecting early failure mode and therefore most effective ways to reduce probability of failure. More illuminating. Allows best "dissection" of failure mode. Careful consideration of steps leading to failure. Detailed discussions of factors affecting events leading to failure. 4 Uncertainties considered Allows explicit representation of uncertainties. Better treatment of uncertainty. Can (should) include explicit consideration of uncertainty. Allows state of knowledge/ignorance to be expressed. 5 In-depth supporting analyses Methods for estimation of probabilities of failure are mostly based on traditional eng. inferring methods of analysis. More in depth analyses typically performed. 6 Team process Group thinking and group input. Group judgments can outperform individuals (some times). 93 7 Defensibility More defensible. 8 Risk criteria evaluation Allows comparison with acceptance criteria, and more accurate assessments of what drives the risk. 9 Transparency in engineering judgments Makes process of engineering judgment more transparent. I.4.2 Limitations 1 Lack of standardized procedure and experienced practitioners To date limited input from outside dam safety comm. (i.e., little general public input). Results can be heavily affected by knowledge and experience of team. Methods for estimating probability of failure by piping, earthquake on concrete dams, and stability of embankments dam need development. Methodologies require much more development. Many pitfalls in performing the risk calculations, making probability estimates, and post processing. Procedure is not standardized. Results between evaluators are not generally consistent. Lack of people experienced and qualified to estimate probabilities. Possible bias of existing dam risk assessment practitioners. Process can be dominated by a few individuals. Experienced engineers needed - they are dwindling. Requires experienced, broadly trained professionals (rare), with previous exposure to all facts of dam engineering. 2 Acceptable/tolerable risk criteria not agreed To date limited input from outside dam safety comm. (i.e., little general public input). We need to develop methods for conveying uncertainty in answer and in "acceptance" criteria (to avoid the point and line approach). Who dictates acceptable risk? Engineer, Public, Politicians, Courts Does not resolve the "acceptable" risk quandary. Lack of acceptance for life safety criteria. Criteria may put too much emphasis on EAP for loss of life reduction. Lack of risk tolerance limits established for broad applications. Focus on engineering wants rater than owner needs. Lack of benchmarks. How to compare RA site A to B to C. No widely accepted loss of life criteria are available. 
3 Uncertainty in estimating probabilities and life loss Hydrologic - Not much available on parameter variation (uncertainty) determination analyses. 94 Structural - (Concrete and earth). Not much available on parameter variation (uncertainty) determination analyses. Uncertainties used appear to be subjective and not objective. Very difficult to make probability assessments for events with very limited historical case histories. Probability of failure estimates not fully defensible. Probabilities of extreme events/loading not readily available for much of U.S. Insufficient data to estimate probabilities with confidence. Difficult for dams that present no symptoms. Yet to account for all human reasoning and judgment processes. Methods for estimating loss of life totally inadequate-much worse than those for estimating probability of failure. 4 Communicating uncertainties to decision makers and others We need to develop methods for conveying uncertainty in answer and in "acceptance" criteria (to avoid the point and line approach). Difficulties in communicating risks to owners, others. Believing numbers/results without understanding the uncertainties. Uncertainties in resulting numbers. Possible misuse of resulting numbers. Can imply more knowledge than there is, if improperly presented or quoted. Danger of believing the numbers. Subjective results are made to appear objective. 5 Cost Costly at present. May be prohibitively expensive and time consuming if not done under 'expert' supervision. Can be high cost. Too complex and time consuming for most state regulated dams. Cost. 6 New and complex terminology Needs to be toned down to recognizable terms for acceptance to general dam safety community. Too complex and time consuming for most state regulated dams. Terminology. 95 Appendix J. Participant Voting on Technology Transfer and Training Needs Failure Modes Identification Approaches Issues Votes Failure mode thinking- (Documented case histories; Training seminars; Hands-on 21 workshops; Systematic approach [list elements & ask how can fail]) Build FMEA into standards based reviews - economy of resources 18 Tools for owners with limited resources 11 Regular program for operator training 5 RAC could share some of their failure mode spreadsheets with the rest of us 4 Someone (FEMA/ASDSO/ICODS?) should develop a 'methodology' that tries to 2 standardize the process Develop more people as qualified facilitators 2 Review case histories 1 How can RA benefit owners of 1 or a few dams? 
1 Do dams in groups with same experts 0 Need ways to get limited expertise applied more broadly 0 Avoid monopoly 0 Documented reports of use 0 Get small group of experienced FMEA experts to write down the logic/process of how 0 to do FMEA Focus on integration with existing efforts 0 Index Prioritization and Portfolio Risk Assessment Approaches Issues Votes Develop guidelines for what constitutes a Portfolio Assessment and how it may be done 25 Publish complete Portfolio Risk Assessment case study (s) as a generic study (s) include strength & weaknesses 9 ASDSO could compile risk indexing and prioritization approaches & provide summary to states 7 By sharing experience on PRA with others on how well the process worked & what should be changed 5 More experience by more people 4 Need tier system so we can meet owner resource availability 3 Demonstration projects 3 Seems that transfer must be done [through] one to one coaching 0 Sponsor seminars aimed at educating non-technical staff among owners 0 Train more facilitators 0 96 Detailed Quantitative Approaches Issues Votes Limited probability training for engineers 22 Demonstration projects 8 Need to document detailed QRA method state-of-practice and run training workshops 5 Need bulletin of R/A for dams that assembles all case histories et. al. 5 Produce a life safety discussion paper, exhibit publicly and invite submissions 5 Dam safety community should interact with DOE, NRC on QRA 3 Training in basic skills such as understanding probability & expert elicitation 1 Can you generalize information or "Education" from stochastic 0 97 Appendix K. Participant Input on Research and Development Needs Categories 1 -Standards (A) • All Civil Engineering is empirical, therefore, it is risk based! FS = 1.5 means low risk, not zero risk. • How do the new computer tool encroach on FS in standards based designs and how does this change 100 yr database? • ~ 1 in 100 dams fail • How do we change standards without addressing risk? • Dams with no possibility of life loss • Large dams that must meet PMF & MCE • What is a “reasonable FS”? Is the MCE adequately conservative? • Parallel risk assessments of the same dam • Incentive/need to undertake risk assessment if dams meet standards regulations • Is a standards approach a zero risk approach? • Subjective probabilities bad for quantitative RA but OK for standards? • Standards .restrictive thinking • Failure mode should always be considered • Missing failure modes • Also a problem with defensibility of standards • How do engineering/subjective judgments affect traditional approach outcomes vs. risk-based approach outcomes? • Risk seems to add to short comings of standards approach as opposed to avoid (parameter uncertainty analysis) • New dams vs. existing dams 2, 6 -Tolerable Risk/Criteria (B) • Is legislative intent to get to zero risk to life? • Public aversion or intolerance to imposed risks • Who decided “RP” in ALARP? • Who decides what is tolerable risk for dams? • Tolerable risk criteria as an interim step on the constant path of risk reduction • State legislation says, “remove the risk”; implies that there could be zero risk. Not possible. • Accepted level of risk is a organization to organization, case by case aspect • Regulators need to educate government that “safe” means a low probability of failure, not "no chance" of failure. • The FEMA requirements are impossible if followed rigidly • Legislators, not the Regulators should determine risk level accepted. 
• Dams are only one piece of society’s risk pool. • Strive for consistent risk. • Is it reasonable to rely on warnings and evacuation as a risk reduction measure? • Engineers + Lawyers = inferior dam safety decisions • Who could go to jail if the dam fails? • Acceptance of loss of life? • EAP vs. fixing dam • The public is extremely risk adverse about dams. How can you get acceptance of risk levels given that? 98 • Get out of jail free card Criteria: • If risk is the owner’s what does this mean for non-owner beneficiaries to share risk? • <1 lives/yr does not communicate with the public. Why aren’t we looking at calculating the probability of one or more lives lost by a particular event, then ask what the acceptable probability for public would be • How do we get public input for risk criteria/public protection guidelines? • EAPs not a substitute for structural fix • What to do if repair may pose more risk than existing conditions (no-fix)? • Who will (should) establish life safety criteria? Is it practical for them to do so? • Obtain public & political input to debate on acceptance limits 3 -Subjective Probability (C) • R & D needs: A. Immediate Develop an improved understanding of probability interpretations and corresponding expectations of those using, interpreting, or considering quantitative methods. Develop better ways for adapting criteria to probability (rather than vice-versa) and operating within its capabilities. B. Intermediate-term Education and training of probability assessors in cognitive processes, heuristics and biases. Development and application of de-biasing techniques adapted in positive ways to how people think and how they conceptualize subjective uncertainty judgments. Education and training in basic probability theory (axioms, etc.) C. Longer-term Improve judgment of probability assessor What is judgment? How does substantive expertise differ from normative expertise? Role of inductive vs. deductive reasoning strategies How is judgment enhanced? • Adapt and merge ongoing R&D from institutions, e.g. Stanford University regarding human thought processes (R&D card) • What is the value to the public of subjective probability estimates? (issue card) • Dam response probability subjective estimate divergence theory: If team thinks failure mode is a problem based on discussion, then the subjective value is higher. If team thinks failure mode is not a problem then subjective estimate is lower. (issue card) 4 -Skills to Identify Failure Modes (D) • Change paradigm for quantitative risk analysis 5 -Uncertainty (E) • Effects of distributions on event probability estimates. 7, 18, 19 - Prioritization and Portfolio Tools (F) 99 • Develop guidelines for prioritization & portfolio approach • Develop simple, easy to use approach that will gain general acceptance • Most state dam safety programs have no program for profiling & prioritization. Consider developing index system that state dam safety programs could use for profiling dams that they regulate • Check USBR index system against portfolio method and try to assess how effective it is. Good for state officials • Are rating points systems worth doing without FMEA procedure? There is a high chance of missing the critical issue • Can a prioritization index system be consistent with a risk analysis approach? • Can portfolio assessment be used for prioritization of known deficiencies (e.g. 
8 - Earthquake Response (G)
• Develop more realistic seismic displacement and liquefaction models.
• Develop better methods for structural response of:
  - Concrete gravity dams in earthquake
  - Embankment stability
  - Piping, static and post-earthquake
• RA is very good where there is no standards-based analytical tool, e.g. Navaho Drain Tunnel
• Factor of safety vs. probability of failure. Need conservative strengths for FS = 1.5 to represent a low probability of failure.
• Inter-related failure modes
• Does the number of steps included in an event tree fundamentally affect the resulting probability?
• Length and number effects on probability estimates
• Develop capability to derive failure probability analytically
• Need to improve understanding/ways to predict system response probabilities
• Failure mechanism understanding and modeling
• Develop failure models that use probabilistic input both for loads and resistance
• Adapt failure models for nodes of event trees

9 - Static Response (H)
• Improve estimates of failure probabilities for static stability, piping failure, etc.
• Research needs:
  - Failure models
  - Piping models
  - Loss of life models
• How confident are we in the characterization of piping failures – embankment, foundation, etc.?
• Seepage rate is not a good guide to problems; changes not correlating with the reservoir are a better guide.
• Piping failures take less than 24 hrs, mostly < 6 hrs, to develop. They historically occur at a reservoir level about 1 m below the historic high level.
• Develop risk analysis procedures to account for time-dependent aspects of piping.
• RA is very good where there is no standards-based analytical tool, e.g. Navaho Drain Tunnel
• Factor of safety vs. probability of failure. Need conservative strengths for FS = 1.5 to represent a low probability of failure.
• Inter-related failure modes
• Does the number of steps included in an event tree fundamentally affect the resulting probability? (see the sketch following this list)
• Length and number effects on probability estimates
• Develop capability to derive failure probability analytically
• Need to improve understanding/ways to predict system response probabilities
• Failure mechanism understanding and modeling
• Develop failure models that use probabilistic input both for loads and resistance
• Adapt failure models for nodes of event trees
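The event-tree cards listed under both the earthquake and static response categories are easier to discuss with a concrete piece of arithmetic in view. The sketch below multiplies assumed conditional probabilities along each branch of a generic internal-erosion (piping) event tree and sums over mutually exclusive reservoir load ranges. The node decomposition (initiation, continuation, progression, breach given failed intervention) and every probability are placeholders for illustration, not estimates for any actual structure. In principle, a finer decomposition only changes which conditional probabilities must be assessed, not the product, provided the assessments are coherent; in practice assessed values can drift with tree length, which is what the cards question.

# Illustrative event-tree arithmetic for a single failure mode.
# Every probability and node name below is an assumed placeholder.

load_ranges = {
    # Annual probability of the reservoir being in each (mutually exclusive) pool range,
    # followed by assumed conditional node probabilities along the branch:
    # [P(initiation | load), P(continuation | initiation),
    #  P(progression | continuation), P(breach | progression, intervention fails)]
    "normal pool":   {"p_load": 0.90, "p_nodes": [1e-4, 0.3, 0.5, 0.5]},
    "high pool":     {"p_load": 0.09, "p_nodes": [5e-4, 0.4, 0.6, 0.6]},
    "historic high": {"p_load": 0.01, "p_nodes": [2e-3, 0.5, 0.7, 0.8]},
}

annual_p_failure = 0.0
for name, branch in load_ranges.items():
    p_branch = branch["p_load"]
    for p in branch["p_nodes"]:
        p_branch *= p              # multiply conditional probabilities along the branch
    annual_p_failure += p_branch   # sum over mutually exclusive load branches
    print(f"{name:13s}: branch probability = {p_branch:.2e}")

print(f"Annual failure probability for this mode: {annual_p_failure:.2e}")

Cards such as "adapt failure models for nodes of event trees" point toward replacing assumed node probabilities like these with outputs of physical or statistical models.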
10, 21 - Improve Loss of Life Estimation (I)
• R&D on life loss estimation
• LOL estimate should consider EAP (see the sketch at the end of this appendix)
• Assessment of evacuation capability for large population centers
• Develop procedures to assess (understand) effectiveness of EAP/EPP
• Existence of EAP in loss of life estimates
• Long-term effectiveness of warning and evacuation systems
• Relationship between life loss & proximity to the dam?

12 - Risk Communication (J)
• What do the numbers resulting from QRA really mean? (issue card)
• "Hazard" (seismic) vs. "hazard" (downstream): 1) drop both uses, 2) use "seismic loading", 3) use "consequence" (issue card)
• Can we build public confidence in life loss estimates? (issue card)
• Owners: be able to defend what you are doing as being reasonable and prudent (issue card)
• Need for a common language between technical specialists & internationally: English (geotech), English (financial), English (probability), English (international), English (seismic), English (H&H), English (owner), English (lawyers). (issue card)
• Public buy-in for risk-based decisions (issue card)

13 - Dam Break Failure Case Histories (K)
• Case history compilations need to be parameter-specific

14 - Earthquake Loading (L)
• Reduce uncertainty & minimize compounding of conservatism in seismic risk assessment
• Seismic loads need:
  - Additional data collection – slip rates
  - Site response data
  - Recurrence models
  - Robust estimates of time histories for use in RA
  - Better integration with engineering analyses
  - Portrayal of uncertainty in an understandable fashion
• 0.2 g for a 1/10,000 event – what magnitude?
• Reduce errors in the catalogue of recorded seismic accelerations (data cleaning)
• Uncertainties in recurrence characteristics for known faults

15 - Flood Loading (M)
• Regional analyses of extreme precipitation probabilities for the entire U.S. – allows states to estimate % PMP probabilities
• Extreme event probability determination improvement
• Reduce uncertainty in hydrologic process evaluation
• Continued support for development of methods for processing hydrologic information for characterizing extreme floods
• Development of procedures for better understanding and incorporating uncertainty in the characterization of floods
• Comprehensive program for collection of climate, flood & paleoflood data on a regional basis to support regional analyses
• Studies to investigate spatial distribution for large watersheds using probabilistic methods
• Confidence in extreme event estimates
• Variability in PMF computations; uncertainty of parameters

16 - Risk Process (N)
• Compare on an equal basis judgment & unknowns for load, response, and life loss
• Uncertainty analysis approaches, from probability estimation and failure mode identification through presentation of outcomes to decision makers
• Assess repeatability – considering uncertainty ranges (not just point estimates)
• How do we reflect uncertainty in perfect history database?
• (To John Ake) Do you really do all that you describe for QRA studies, particularly at screening level?
• How do the amount & quality of data affect confidence in RA results?
• Long dams and multiple-dam reservoirs need probabilistic concepts to be "correct"
• Repeatability (even for qualitative methods)

17 - Analyze NPDP (O)
• No cards

20 - Debate Mechanisms (P)
• No cards

22 - Communicate Best Practice (Q)
• No cards

24 - Include Failure Modes Identification in schools (R)
• No cards

Portfolio - Learn to improve (S)
• Learn how to improve PRA by evaluating changes resulting from updating
• More input from users on information needs

26 - Debate Concepts (T)
• No cards
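Several cards under category I (Improve Loss of Life Estimation) ask how the existence and effectiveness of an EAP can be reflected in a life-loss estimate. A minimal way to show the structure of such a calculation is a population-at-risk times fatality-rate sketch in which the assumed fatality rate depends on warning time. The warning-time thresholds, fatality rates, and population below are placeholders chosen only to illustrate the calculation; they are not published or recommended values.

# Illustrative only: placeholder fatality rates, not published values.

def fatality_rate(warning_minutes: float) -> float:
    """Placeholder step function: longer warning time -> lower assumed fatality rate."""
    if warning_minutes < 15:
        return 0.15
    elif warning_minutes < 60:
        return 0.03
    return 0.002

def estimated_life_loss(population_at_risk: float, warning_minutes: float) -> float:
    # Simple structure: life loss = population at risk x fatality rate(warning time).
    return population_at_risk * fatality_rate(warning_minutes)

par = 5000  # hypothetical downstream population at risk
scenarios = [("no effective warning", 5),
             ("EAP, ~45 min warning", 45),
             ("EAP, > 1 hr warning", 90)]

for label, warning in scenarios:
    print(f"{label:22s}: estimated life loss ~ {estimated_life_loss(par, warning):.0f}")

The same structure can be extended with flood severity, proximity to the dam, and long-term warning-system reliability as additional conditioning variables, which is where several of the cards above point.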