Integrating GPT-Technologies with Decision Models for Explainability.

xAI (3)(2023)

Abstract
The ability to provide clear and transparent explanations for the outcome of a decision is critical for gaining user trust and acceptance, particularly in areas such as healthcare, finance, and law. While GPT-3 and ChatGPT are promising conversational technologies, they cannot always be relied on to produce the correct outcome of an operational decision or to answer questions about such decisions. Decision model logic, in contrast, can explain the decision-making process through various reasoning mechanisms, and chatbots powered by these decision models can provide automated reasoning and explanations for decisions. However, such chatbots are not always user-friendly, as users may struggle to determine which reasoning and explanation scenario fits their question. This paper explores the potential of GPT-3 technology to identify the appropriate reasoning and explanation scenario for decision-making chatbots and compares its performance with asking the user directly. If GPT-3 can reliably identify the appropriate scenario, it will facilitate the development of integrated GPT-3 and decision-model-powered chatbots that provide correct and human-understandable explanations for operational decisions.
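As a rough illustration of the scenario-identification step described in the abstract, the sketch below shows how a GPT-3 completion call could map a free-text user question onto one of several explanation scenarios before handing the question to a decision-model reasoner. The scenario labels, the prompt wording, and the use of the legacy text-davinci-003 completions endpoint are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, assuming a fixed set of explanation scenarios and the legacy
# OpenAI completions API (openai<1.0); the paper's actual scenarios and prompts
# may differ.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Hypothetical explanation scenarios a decision-model chatbot might support.
SCENARIOS = [
    "outcome",   # "What is the decision for my case?"
    "why",       # "Why did I get this outcome?"
    "why-not",   # "Why did I not get outcome X?"
    "what-if",   # "What would happen if input Y changed?"
]

def classify_scenario(user_question: str) -> str:
    """Ask GPT-3 to map a free-text user question onto one scenario label."""
    prompt = (
        "Classify the user's question about an operational decision into one of "
        f"these scenarios: {', '.join(SCENARIOS)}.\n\n"
        f"Question: {user_question}\nScenario:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=5,
        temperature=0,  # deterministic label; no creative completion needed
    )
    label = response["choices"][0]["text"].strip().lower()
    # Fall back to asking the user directly if the model returns an unknown label.
    return label if label in SCENARIOS else "ask-user"

print(classify_scenario("Why was my loan application rejected?"))  # expected: "why"
```

In this setup the language model only selects the scenario; the decision model itself still computes the outcome and the explanation, which is how the abstract frames the division of labour between the two components.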
Keywords
explainability, decision models, gpt-technologies