Modes of Information Flow.

Ryan G. James, Blanca Daniella Mansante Ayala, Bahti Zakirov, James P. Crutchfield

arXiv: Statistical Mechanics (2018)

Abstract
Information flow between components of a system takes many forms and is key to understanding the organization and functioning of large-scale, complex systems. We demonstrate three modalities of information flow from time series X to time series Y. Intrinsic information flow exists when the past of X is individually predictive of the present of Y, independent of Y's past; this is most commonly considered information flow. Shared information flow exists when X's past is predictive of Y's present in the same manner as Y's past; this occurs due to synchronization or common driving, for example. Finally, synergistic information flow occurs when neither X's nor Y's pasts are predictive of Y's present on their own, but taken together they are. The two most broadly-employed information-theoretic methods of quantifying information flow---time-delayed mutual information and transfer entropy---are both sensitive to a pair of these modalities: time-delayed mutual information to both intrinsic and shared flow, and transfer entropy to both intrinsic and synergistic flow. To quantify each mode individually we introduce our cryptographic flow ansatz, positing that intrinsic flow is synonymous with secret key agreement between X and Y. Based on this, we employ an easily-computed secret-key-agreement bound---intrinsic mutual information---to quantify the three flow modalities in a variety of systems including asymmetric flows and financial markets.
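To make the contrast above concrete, here is a minimal Python sketch (not from the paper; the plug-in estimators, the delayed-copy toy process, and all variable names are illustrative assumptions) that estimates time-delayed mutual information I[X_t : Y_{t+1}] and transfer entropy I[X_t : Y_{t+1} | Y_t] from binary time series. For a process in which Y simply copies X with a one-step delay (pure intrinsic flow), both quantities come out near 1 bit, consistent with the abstract's point that each is sensitive to intrinsic flow. The intrinsic mutual information used in the paper goes further, minimizing the conditional mutual information over garblings of Y's past; that optimization is beyond this sketch.

import numpy as np
from collections import Counter

def entropy(counts):
    # Shannon entropy (bits) of the empirical distribution given by a Counter.
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def tdmi(x, y):
    # Plug-in estimate of time-delayed mutual information I[X_t : Y_{t+1}].
    pairs = list(zip(x[:-1], y[1:]))
    return (entropy(Counter(a for a, _ in pairs))
            + entropy(Counter(b for _, b in pairs))
            - entropy(Counter(pairs)))

def transfer_entropy(x, y):
    # Plug-in estimate of transfer entropy I[X_t : Y_{t+1} | Y_t], using
    # I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C).
    trips = list(zip(x[:-1], y[1:], y[:-1]))
    return (entropy(Counter((a, c) for a, _, c in trips))
            + entropy(Counter((b, c) for _, b, c in trips))
            - entropy(Counter(trips))
            - entropy(Counter(c for _, _, c in trips)))

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=100_000)  # uniform binary driver
y = np.roll(x, 1)                     # Y copies X with a one-step delay
print("TDMI:", round(tdmi(x, y), 3), "bits")                           # ~1.0
print("Transfer entropy:", round(transfer_entropy(x, y), 3), "bits")   # ~1.0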