Perceived complexity versus internal complexity: did we take into account expertise, reliability and cognitive stability?
msra(2006)
Abstract
Human reliability issues in safety-critical systems, in aviation for example, motivated and still motivate the design and use of protections that can be tool-based or organizational. Software and hardware have been developed to compensate for human unreliability, enabling both tolerance and resistance to human errors. Consequently, systems have become more complex, and the distance between people and the actual production machines has never stopped increasing. Most of the time, the perceived complexity decreased tremendously as the automated product matured, sometimes after a difficult start during which it was high to very high. This paper presents a synthesis on complexity and cognitive stability in human-machine systems, and more specifically in highly automated systems. It emphasizes several issues such as technological complexity, complexity and expertise, reliability of machines and people, and complexity and resilience. The paper focuses on interaction between people and highly automated safety-critical systems. What do people expect from their cooperation with their "friendly" automata? Do they need to know about their internal complexity to interact with them? How do they perceive their external complexity? What is the right level of abstraction required to interact safely, efficiently and comfortably?
Keywords
human error