Safety, Accidents, and Human Error: Content Summary

Human Error
• Human error is frequently cited as the result of these problems:
  – Pilot error is blamed for over 70% of airplane accidents
  – Operator error is blamed for over 60% of nuclear power plant accidents
  – Doctor/nurse errors in the ICU occur at a rate of …
• Classifying types of error:
  – Errors of omission: the operator fails to perform a procedural step
  – Errors of commission: the operator performs extra steps that are incorrect, or performs a step incorrectly
• Example: Mars Orbiter

Taxonomy of Human Error
[Diagram: information-processing stages (Stimulus Evidence, Interpretation / Situation Assessment, Plan / Intention of Action, Action Execution, Memory) with MISTAKES (knowledge-based and rule-based), SLIPS, and LAPSES & MODE ERRORS mapped onto the stages where they arise]

Taxonomy of Human Error: Mistakes
• Mistakes: failure to come up with an appropriate solution
  – Takes place at the level of perception, memory, or cognition
• Knowledge-based mistakes: wrong solution because the individual did not accurately assess the situation
  – Caused by poor heuristics/biases, insufficient information, or information overload
• Rule-based mistakes: invoking the wrong rule for a given situation
  – Often made with confidence

Taxonomy of Human Error: Slips
• Slips: the right intention, incorrectly executed (oops!)
• Capture errors: a similar situation elicits an action that may be wrong in "this" situation. Likely to result when:
  – The intended action is similar to routine behavior
    • Example: hitting the Enter key when the software asks, "Are you sure you want to exit without saving?"
  – Either the stimulus or the response is related to an incorrect response
    • Example: hitting "3" instead of the intended key on the phone to hear the next message, because "3" is what I hit to hear the first message
  – The response is relatively automated, not monitored by consciousness
    • Example: restarting your car while the engine is already running

Taxonomy of Human Error: Lapses & Mode Errors
• Lapses: failure to carry out an action
  – An error of omission (working memory)
  – Examples: forgetting to close the gas cap, failing to put the safety on before cleaning a gun, failing to remove objects from a surgical patient
• Mode errors: making the right response, but while in the wrong mode of operation
  – Examples: leaving the keyboard in shift mode while trying to type a numeral; driving in the wrong gear; going the wrong direction because the display was north-up when you thought it was nose-up

Human Reliability Analysis
• Human Reliability Analysis: predicts the reliability of a system, in terms of probability of failure or mean time between failures (MTBF), when the system's components are arranged in series or in parallel
• Series (two components, each with reliability 0.9):
  – Reliability = 0.9 × 0.9 = 0.81
  – P(failure) = 1 − 0.81 = 0.19
• Parallel (two components, each with reliability 0.9):
  – Reliability = 1 − [(1 − 0.9) × (1 − 0.9)] = 1 − 0.01 = 0.99
  – P(failure) = 1 − 0.99 = 0.01
• (See homework.) A short worked sketch of these calculations appears after the THERP notes below.

THERP (Technique for Human Error Rate Prediction) Components
1. Human Error Probability (HEP)
   • The ratio of errors made to the number of possible errors (a small numeric sketch follows below)
2. Event Tree
   • A diagram showing the sequence of events
   • Probabilities …
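To make the series/parallel arithmetic above concrete, here is a minimal Python sketch (not part of the original notes) that computes system reliability and failure probability for independent components; the 0.9 component reliability follows the example in the slides.

```python
# Minimal sketch: reliability of independent components arranged
# in series vs. parallel, following the 0.9-reliability example above.
from math import prod

def series_reliability(reliabilities):
    """System works only if every component works."""
    return prod(reliabilities)

def parallel_reliability(reliabilities):
    """System works if at least one component works."""
    return 1 - prod(1 - r for r in reliabilities)

components = [0.9, 0.9]

r_series = series_reliability(components)
r_parallel = parallel_reliability(components)

print(f"Series:   reliability = {r_series:.2f}, P(failure) = {1 - r_series:.2f}")
# -> Series:   reliability = 0.81, P(failure) = 0.19
print(f"Parallel: reliability = {r_parallel:.2f}, P(failure) = {1 - r_parallel:.2f}")
# -> Parallel: reliability = 0.99, P(failure) = 0.01
```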
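The sketch below illustrates the HEP ratio defined in the THERP notes. The observation counts are hypothetical, and the two-step event-tree combination (multiplying per-step success probabilities under an independence assumption) is an illustration of how event trees are typically used, not something spelled out in the truncated notes above.

```python
# Illustrative sketch of a THERP-style Human Error Probability (HEP):
# HEP = (number of errors made) / (number of opportunities for error).

def human_error_probability(errors_made: int, opportunities: int) -> float:
    """Ratio of observed errors to possible errors."""
    return errors_made / opportunities

# Hypothetical observation data: errors counted over 200 chances to err per step.
hep_step1 = human_error_probability(3, 200)    # 0.015
hep_step2 = human_error_probability(5, 200)    # 0.025

# Event-tree style combination (assumed independence): the two-step
# task succeeds only if both steps are performed correctly.
p_success = (1 - hep_step1) * (1 - hep_step2)
p_failure = 1 - p_success

print(f"HEP step 1 = {hep_step1:.3f}, HEP step 2 = {hep_step2:.3f}")
print(f"P(task success) = {p_success:.4f}, P(task failure) = {p_failure:.4f}")
```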