How to Strengthen Risk Management - ScienceDaily

How 'Black Swans' and 'Perfect Storms' Become Lame Excuses for Bad Risk Management


ScienceDaily (Nov. 15, 2012) -- Instead of reflecting on the unlikelihood of rare catastrophes after the fact, Stanford risk analysis expert Elisabeth Paté-Cornell prescribes an engineering approach to anticipate them when possible, and to manage them when not.


The terms "black swan" and "perfect storm" have become part of public vocabulary for describing disasters ranging from the 2008 meltdown in the financial sector to the terrorist attacks of September 11. But according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning.


Her research, published in the November issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies from engineering to make better management decisions, even in the case of once-in-a-blue-moon events where statistics are scant, unreliable or non-existent.


Paté-Cornell argues that a true "black swan" -- an event that is impossible to imagine because we've known nothing like it in the past -- is extremely rare. The AIDS virus is one of very few examples. But usually, there are important clues and warning signs of emerging hazards (e.g., a new flu virus) that can be monitored to guide quick risk management responses.


Similarly, she argues that the risk of a "perfect storm," in which multiple forces join to create a disaster greater than the sum of its parts, can be assessed in a systematic way before the event: even though such conjunctions are rare, the events that compose them -- and all their dependencies -- have been observed in the past.


"Risk analysis is not about predicting anything before it happens, it's just giving the probability of various scenarios," she said. She argues that systematically exploring those scenarios can help companies and regulators make smarter decisions before an event in the face of uncertainty.

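As a rough illustration (not taken from the article or the underlying paper), the short Python sketch below uses invented scenarios, probabilities and losses to show what "giving the probability of various scenarios" can look like in practice: the analyst does not predict a single outcome, but ranks scenarios by how much each contributes to expected loss.

    # Hypothetical scenario set with assumed probabilities and consequences.
    # Risk analysis here means ranking scenarios by expected loss, not
    # predicting which one will happen.
    scenarios = {
        "routine component failure":         {"p": 0.050, "loss": 1e6},
        "operator error during maintenance": {"p": 0.010, "loss": 5e6},
        "external flood disabling backups":  {"p": 0.001, "loss": 5e8},
    }
    ranked = sorted(scenarios.items(),
                    key=lambda kv: kv[1]["p"] * kv[1]["loss"], reverse=True)
    for name, s in ranked:
        print(f"{name}: p={s['p']:.3f}, expected loss={s['p'] * s['loss']:,.0f}")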

Think like an engineer


An engineering risk analyst thinks in terms of systems, their functional components and their dependencies, Paté-Cornell said. For instance, in many power plants that require cooling, the generators, turbines, water pumps, safety valves and more all contribute to making the system work. The analyst must therefore first understand how the system works as a whole in order to identify the ways it could fail. The same method applies to medical, financial or ecological systems.
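
A minimal sketch of that system view, using assumed and purely illustrative component failure probabilities, is shown below: the cooling function needs a working pump, a working valve, and at least one of two power sources, so the analyst combines component probabilities according to that structure.

    # Assumed, illustrative failure probabilities for one cooling function.
    p_pump_fail = 0.02
    p_valve_fail = 0.01
    p_grid_fail = 0.05
    p_backup_fail = 0.10

    # Power is lost only if both the grid and the backup generator fail;
    # cooling works only if the pump, the valve and power all work.
    # Components are treated as independent here for simplicity.
    p_power_fail = p_grid_fail * p_backup_fail
    p_cooling_ok = (1 - p_pump_fail) * (1 - p_valve_fail) * (1 - p_power_fail)
    print(f"P(cooling fails) = {1 - p_cooling_ok:.4f}")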


Paté-Cornell stresses the importance of accounting for dependent events, whose probabilities are intertwined, when building the complete list of scenarios -- dependencies included -- that the risk analysis must cover. It is therefore essential, she said, that an engineering risk analysis include external factors that can affect the whole system.


In the case of a nuclear plant, the seismic activity or the potential for tsunamis in the area must be part of the equation, particularly if local earthquakes have historically led to tidal waves and destructive flooding. Paté-Cornell explained that the designers of the Fukushima Daiichi Nuclear Power Plant ignored important historical precedents, including two earthquakes in 869 and 1611 that generated waves similar to those witnessed in March of 2011.
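
A purely illustrative calculation (all numbers invented, not site data) shows why such dependencies matter: chaining the conditional probabilities of earthquake, tsunami and flooding gives a very different answer than treating the same events as independent rarities.

    # Invented numbers, per year of operation.
    p_quake = 1e-3                  # large earthquake near the site (assumed)
    p_tsunami_given_quake = 0.3     # tsunami given that earthquake (assumed)
    p_flood_given_tsunami = 0.5     # backup power flooded given tsunami (assumed)

    # Dependence respected: multiply conditional probabilities along the chain.
    p_blackout = p_quake * p_tsunami_given_quake * p_flood_given_tsunami

    # Dependence ignored: treat quake, tsunami and flood as unrelated rare events
    # using assumed marginal frequencies.
    p_naive = p_quake * 1e-4 * 1e-4

    print(f"with dependencies:     {p_blackout:.1e} per year")
    print(f"assuming independence: {p_naive:.1e} per year (far too optimistic)")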


What some described as a "perfect storm" of compounding mishaps, Paté-Cornell sees as a failure to assess basic failure probabilities on the basis of experience and elementary logic.


A versatile framework


Engineering risk analyses can get complex, but their components are concrete objects whose mechanisms are usually well understood. Paté-Cornell says that this systematic approach is just as relevant when human beings are part of the system.


"Some argue that in engineering you have hard data about hard systems and hard architectures, but as soon as you involve human beings, you cannot apply the same methods due to the uncertainties of human error. I do not believe this is true," she said.


In fact, Paté-Cornell and her colleagues have long incorporated "soft" elements into their systems analysis to calculate the probability of human error. They look at all the people with access to the system, and factor in any available information about past behaviors, training and skills. Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed.


"We look at how the management has trained, informed, and given incentives to people to do what they do and assign risk based on those assessments," she said.

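The sketch below is a hypothetical illustration of how such "soft" factors can enter the numbers; the base error rate and the management multipliers are assumptions made for the example, not figures from her studies.

    # Assumed base rate of a critical operator slip, modified by management quality.
    p_error_base = 0.01
    management_factor = {
        "well trained, clear incentives": 0.5,   # assumed multiplier
        "poorly trained, fatigued crews": 4.0,   # assumed multiplier
    }
    p_hardware_fault = 0.002                     # assumed

    # An accident here needs a hardware fault and a missed human response
    # (treated as independent in this toy example).
    for regime, factor in management_factor.items():
        p_accident = p_hardware_fault * (p_error_base * factor)
        print(f"{regime}: P(accident) = {p_accident:.1e}")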

Paté-Cornell has successfully applied this approach to the field of finance, estimating the probability that an insurance company would fail given its age and its size. She said the companies contacted her and funded the research because they needed forward-looking models that their financial analysts generally did not provide.


Traditional financial analysis, she said, is based on evaluating existing statistical data about past events. In her view, analysts can better anticipate market failures -- like the financial crisis that began in 2008 -- by recognizing precursors and warning signs, and factoring them into a systemic probabilistic analysis.
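
One standard way to fold a precursor into a probabilistic analysis is Bayes' rule; the example below uses invented numbers and is not drawn from her financial models.

    # Assumed prior and likelihoods for a hypothetical market-failure precursor.
    p_failure = 0.02              # prior probability of a crisis (assumed)
    p_sign_given_failure = 0.8    # warning sign usually precedes a crisis (assumed)
    p_sign_given_ok = 0.1         # but it also shows up in normal times (assumed)

    # Bayes' rule: update the failure probability once the sign is observed.
    p_sign = p_sign_given_failure * p_failure + p_sign_given_ok * (1 - p_failure)
    p_failure_given_sign = p_sign_given_failure * p_failure / p_sign
    print(f"P(failure | warning sign) = {p_failure_given_sign:.2f}")  # ~0.14, up from 0.02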


Medical specialists must also make decisions in the face of limited statistical data, and Paté-Cornell says the same approach is useful for calculating patient risk. She used systems analysis to assess data about anesthesia accidents -- a case in which human mistakes can create an accident chain that, if not recognized quickly, puts the patient's life in danger. Based on her results, she suggested retraining and recertification procedures for anesthesiologists to make the system safer.
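
A hypothetical accident-chain calculation (numbers invented, not taken from her anesthesia study) illustrates the logic: harm requires an initiating error, a missed detection and a failed recovery, so a retraining program that improves detection lowers the product.

    # Invented per-procedure probabilities for a three-link accident chain.
    p_initiating_error = 1e-3     # critical mistake occurs (assumed)
    p_missed_detection = 0.10     # mistake not caught in time (assumed)
    p_failed_recovery = 0.20      # patient cannot be stabilized afterward (assumed)

    p_harm = p_initiating_error * p_missed_detection * p_failed_recovery
    print(f"P(patient harm)  = {p_harm:.1e} per procedure")

    # Hypothetical retraining that halves the missed-detection rate.
    p_harm_retrained = p_initiating_error * (p_missed_detection / 2) * p_failed_recovery
    print(f"after retraining = {p_harm_retrained:.1e} per procedure")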


Professor Paté-Cornell believes that the financial and medical sectors are just two of many fields that might benefit from systems analysis in uncertain, dynamic situations. "Lots of people don't like probability because they don't understand it," she said, "and they think if they don't have hard statistics, they cannot do a risk analysis. In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system."


She hopes that her probabilistic approach can replace the notions of black swans and perfect storms, making the public safer and better informed about risks. Apparently, others have this same hope.


"It must have struck a chord," she said, "because I already get lots of comments, responses and ideas on the subject from people around the world."


Story Source


The above story is reprinted from materials provided by Stanford School of Engineering. The original article was written by Kelly Servick, science-writing intern at the Stanford University School of Engineering.


Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference


1. Elisabeth Paté-Cornell. On “Black Swans” and “Perfect Storms”: Risk Analysis and Management When Statistics Are Not Enough. Risk Analysis, 2012; DOI: 10.1111/j.1539-6924.2011.01787.x


http://www.sciencedaily.com/releases/2012/11/121115133318.htm
