# A memory frontier for complex synapses

NIPS 2013, pp. 1034–1042

Abstract

An incredible gulf separates theoretical models of synapses, often described solely by a single scalar value denoting the size of a postsynaptic potential, from the immense complexity of molecular signaling pathways underlying real synapses. To understand the functional contribution of such molecular complexity to learning and memory, it ...

Introduction

- It is widely thought that the very ability to remember the past over long time scales depends crucially on the ability to modify synapses in the brain in an experience-dependent manner.
- In §3, the authors describe an analytic expression for the memory curve as a function of the structure of a synaptic dynamical system, described by the pair of stochastic transition matrices M^pot and M^dep (for potentiation and depression).
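As a concrete illustration, here is a minimal sketch (our own toy example, not the authors' code) of such a memory curve for the simplest case of a binary synapse, assuming candidate plasticity events arrive at rate r and are potentiating or depressing with equal probability:

```python
import numpy as np

# Toy binary-synapse model: two states (weak, strong) with deterministic
# potentiation/depression transition matrices applied at events of rate r.
M_pot = np.array([[0.0, 1.0],   # weak  -> strong with probability 1
                  [0.0, 1.0]])  # strong stays strong
M_dep = np.array([[1.0, 0.0],
                  [1.0, 0.0]])
f_pot = f_dep = 0.5             # balanced potentiation/depression
r, N = 1.0, 10_000              # event rate, number of synapses
w = np.array([0.0, 1.0])        # readout weight of each internal state

# "Forgetting" generator: average dynamics of later, unrelated plasticity.
W_F = r * (f_pot * M_pot + f_dep * M_dep - np.eye(2))

p_inf = np.array([0.5, 0.5])    # equilibrium state distribution (balanced case)
sigma = np.sqrt(p_inf @ w**2 - (p_inf @ w)**2)  # per-synapse weight std at equilibrium

def snr(t):
    """Ideal-observer SNR(t): sqrt(N) * (excess signal of a tracked
    potentiation at time 0, evolved under W_F for time t) / noise."""
    evals, V = np.linalg.eig(W_F)
    e_Wt = V @ np.diag(np.exp(evals * t)) @ np.linalg.inv(V)
    p_t = np.array([0.0, 1.0]) @ e_Wt   # synapse starts in the strong state
    return np.sqrt(N) * (p_t @ w - p_inf @ w) / sigma
```

For this two-state model the curve evaluates to SNR(t) = √N · e^(−rt): the initial SNR saturates at √N and then decays exponentially with the plasticity rate.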

Highlights

- It is widely thought that our very ability to remember the past over long time scales depends crucially on our ability to modify synapses in our brain in an experience-dependent manner
- Classical models of synaptic plasticity model synaptic efficacy as an analog scalar value, denoting the size of a postsynaptic potential injected into one neuron from another
- Theoretical work has shown that such models have a reasonable, extensive memory capacity, in which the number of long term associations that can be stored by a neuron is proportional to its number of afferent synapses [1, 2, 3]
- Along the way we develop principles based on first passage time theory to order the structure of synaptic dynamical systems and relate this structure to memory performance
- We have obtained several new mathematical results delineating the functional limits of memory achievable by synaptic complexity, and the structural characterization of synaptic dynamical systems that achieve these limits
- Operating within the ideal observer framework of [10, 11, 18], we have shown that for a population of N synapses with M internal states: (a) the initial signal-to-noise ratio (SNR) of any synaptic model cannot exceed √N, and any model that achieves this bound is equivalent to a binary synapse; (b) the area under the memory curve of any model cannot exceed that of a linear chain model with the same equilibrium distribution; (c) both the area and the memory lifetime of any model cannot exceed O(√N M), and the model that achieves this limit has a linear chain topology with only nearest-neighbor transitions; (d) we have derived an envelope memory curve in the SNR-time plane that cannot be exceeded by the memory curve of any model, and models that approach this envelope at late times are linear chain models; (e) this late-time envelope is a power law proportional to O(√N M/(rt)), indicating that synaptic complexity can strongly enhance the limits of achievable memory
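The limits above can be collected in one place (our own compact restatement, with r the rate of candidate plasticity events and constant factors suppressed):

```latex
\begin{aligned}
\mathrm{SNR}(0) \;&\le\; \sqrt{N}, \\
\int_0^{\infty} \mathrm{SNR}(t)\,\mathrm{d}t \;&\le\; \mathcal{O}\!\bigl(\sqrt{N}\,M/r\bigr), \\
\text{memory lifetime} \;&\le\; \mathcal{O}\!\bigl(\sqrt{N}\,M/r\bigr), \\
\mathrm{SNR}_{\mathrm{env}}(t) \;&\propto\; \frac{\sqrt{N}\,M}{r\,t} \quad \text{at late times.}
\end{aligned}
```

Note the scaling: the number of internal states M multiplies the achievable area and lifetime, while the population size enters only as √N.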

Results

- The authors' end goal, achieved in §4, is to derive an envelope memory curve in the SNR-time plane: a curve that forms an upper bound on the entire memory curve of any model.
- In order to achieve this goal, the authors must first derive upper bounds, over the space of all possible synaptic models, on two different scalar functions of the memory curve: its initial SNR, and the area under the memory curve.
- In the process of upper-bounding the area, the authors will develop an essential framework to organize the structure of synaptic dynamical systems based on first passage time theory.
- The authors will see that synaptic models that optimize various measures of memory have an exceedingly simple structure when, and only when, their states are arranged in this order.
- The red curves in Figure 4(a) show the closest the authors have come to the envelope with actual models, obtained both by repeated numerical optimization of SNR(t0) over M^pot/dep with random initializations and by hand-designed models.
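The first-passage-time machinery used to order states can be made concrete. The helper below (our own sketch using the fundamental-matrix method of Kemeny & Snell [20], not code from the paper) computes mean first passage times for a discrete-time Markov chain; the structural ordering in the paper is built from quantities of this kind, namely differences of passage times to the extreme states.

```python
import numpy as np

def stationary(T):
    """Stationary distribution of an ergodic chain: the left eigenvector
    of T associated with eigenvalue 1, normalized to sum to 1."""
    evals, evecs = np.linalg.eig(T.T)
    v = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return v / v.sum()

def mean_first_passage_times(T):
    """Mean first passage time matrix via the fundamental matrix
    Z = (I - T + 1 pi^T)^(-1) of Kemeny & Snell [20].  Entry [i, j] is the
    expected number of steps to first reach state j from state i
    (the diagonal is zero), using m_ij = (Z_jj - Z_ij) / pi_j."""
    n = T.shape[0]
    pi = stationary(T)
    Z = np.linalg.inv(np.eye(n) - T + np.outer(np.ones(n), pi))
    return (np.diag(Z)[None, :] - Z) / pi[None, :]
```

For example, for the two-state chain T = [[0.9, 0.1], [0.2, 0.8]] this gives passage times 10 (state 0 to 1) and 5 (state 1 to 0), matching the geometric escape times 1/0.1 and 1/0.2.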

Conclusion

- The authors have initiated the development of a general theory of learning and memory with complex synapses, allowing for an exploration of the entire space of complex synaptic models, rather than a few specific models.
- Operating within the ideal observer framework of [10, 11, 18], the authors have shown that for a population of N synapses with M internal states: (a) the initial SNR of any synaptic model cannot exceed √N, and any model that achieves this bound is equivalent to a binary synapse; (b) the area under the memory curve of any model cannot exceed that of a linear chain model with the same equilibrium distribution; (c) both the area and the memory lifetime of any model cannot exceed O(√N M), and the model that achieves this limit has a linear chain topology with only nearest-neighbor transitions; (d) the authors have derived an envelope memory curve in the SNR-time plane that cannot be exceeded by the memory curve of any model, and models that approach this envelope at late times are linear chain models; (e) this late-time envelope is a power law proportional to O(√N M/(rt)), indicating that synaptic complexity can strongly enhance the limits of achievable memory.

References

- J. J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” Proc. Natl. Acad. Sci. U.S.A. 79 (1982) no. 8, 2554–2558.
- D. J. Amit, H. Gutfreund, and H. Sompolinsky, “Spin-glass models of neural networks,” Phys. Rev. A 32 (Aug, 1985) 1007–1018.
- E. Gardner, “The space of interactions in neural network models,” Journal of Physics A: Mathematical and General 21 (1988) no. 1, 257.
- T. V. P. Bliss and G. L. Collingridge, “A synaptic model of memory: long-term potentiation in the hippocampus,” Nature 361 (Jan, 1993) 31–39.
- C. C. H. Petersen, R. C. Malenka, R. A. Nicoll, and J. J. Hopfield, “All-or-none potentiation at CA3-CA1 synapses,” Proc. Natl. Acad. Sci. U.S.A. 95 (1998) no. 8, 4732–4737.
- D. H. O’Connor, G. M. Wittenberg, and S. S.-H. Wang, “Graded bidirectional synaptic plasticity is composed of switch-like unitary events,” Proc. Natl. Acad. Sci. U.S.A. 102 (2005) no. 27, 9679–9684.
- R. Enoki, Y.-L. Hu, D. Hamilton, and A. Fine, “Expression of Long-Term Plasticity at Individual Synapses in Hippocampus Is Graded, Bidirectional, and Mainly Presynaptic: Optical Quantal Analysis,” Neuron 62 (2009) no. 2, 242–253.
- D. J. Amit and S. Fusi, “Constraints on learning in dynamic synapses,” Network: Computation in Neural Systems 3 (1992) no. 4, 443–464.
- D. J. Amit and S. Fusi, “Learning in neural networks with material synapses,” Neural Computation 6 (1994) no. 5, 957–982.
- S. Fusi, P. J. Drew, and L. F. Abbott, “Cascade models of synaptically stored memories,” Neuron 45 (Feb, 2005) 599–611.
- S. Fusi and L. F. Abbott, “Limits on the memory storage capacity of bounded synapses,” Nat. Neurosci. 10 (Apr, 2007) 485–493.
- C. Leibold and R. Kempter, “Sparseness Constrains the Prolongation of Memory Lifetime via Synaptic Metaplasticity,” Cerebral Cortex 18 (2008) no. 1, 67–77.
- D. S. Bredt and R. A. Nicoll, “AMPA Receptor Trafficking at Excitatory Synapses,” Neuron 40 (2003) no. 2, 361–379.
- M. P. Coba, A. J. Pocklington, M. O. Collins, M. V. Kopanitsa, R. T. Uren, S. Swamy, M. D. Croning, J. S. Choudhary, and S. G. Grant, “Neurotransmitters drive combinatorial multistate postsynaptic density networks,” Sci Signal 2 (2009) no. 68, ra19.
- W. C. Abraham and M. F. Bear, “Metaplasticity: the plasticity of synaptic plasticity,” Trends in Neurosciences 19 (1996) no. 4, 126–130.
- J. M. Montgomery and D. V. Madison, “State-Dependent Heterogeneity in Synaptic Depression between Pyramidal Cell Pairs,” Neuron 33 (2002) no. 5, 765–777.
- R. D. Emes and S. G. Grant, “Evolution of Synapse Complexity and Diversity,” Annual Review of Neuroscience 35 (2012) no. 1, 111–131.
- A. B. Barrett and M. C. van Rossum, “Optimal learning rules for discrete synapses,” PLoS Comput. Biol. 4 (Nov, 2008) e1000230.
- J. Kemeny and J. Snell, Finite Markov Chains. Springer, 1960.
- C. Burke and M. Rosenblatt, “A Markovian function of a Markov chain,” The Annals of Mathematical Statistics 29 (1958) no. 4, 1112–1122.
- F. Ball and G. F. Yeo, “Lumpability and Marginalisability for Continuous-Time Markov Chains,” Journal of Applied Probability 30 (1993) no. 3, 518–528.
