Inferring population dynamics in macaque cortex

Ganga Meghanath, Bryan Jimenez, Joseph G. Makin

Journal of Neural Engineering (2023)

Abstract
Objective. The proliferation of multi-unit cortical recordings over the last two decades, especially in macaques and during motor-control tasks, has generated interest in neural 'population dynamics': the time evolution of neural activity across a group of neurons working together. A good model of these dynamics should be able to infer the activity of unobserved neurons within the same population and of the observed neurons at future times. Accordingly, Pandarinath and colleagues have introduced a benchmark to evaluate models on these two (and related) criteria: four data sets, each consisting of firing rates from a population of neurons, recorded from macaque cortex during movement-related tasks. Approach. Since this is a discriminative-learning task, we hypothesize that general-purpose architectures based on recurrent neural networks (RNNs) trained with masking can outperform more 'bespoke' models. To capture long-distance dependencies without sacrificing the autoregressive bias of recurrent networks, we also propose a novel, hybrid architecture ('TERN') that augments the RNN with self-attention, as in transformer networks. Main results. Our RNNs outperform all published models on all four data sets in the benchmark. The hybrid architecture improves performance further still. Pure transformer models fail to achieve this level of performance, either in our work or that of other groups. Significance. We argue that the autoregressive bias imposed by RNNs is critical for achieving the highest levels of performance, and establish the state of the art on the neural latents benchmark. We conclude, however, by proposing that the benchmark be augmented with an alternative evaluation of latent dynamics that favors generative over discriminative models like the ones we propose in this report.
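As a rough illustration of the hybrid architecture described in the abstract, the sketch below pairs a recurrent encoder with a self-attention layer and trains it with a masked reconstruction objective. The paper itself provides no code here, so every detail (the GRU encoder, layer sizes, the masking scheme, and the Poisson loss) is an illustrative assumption, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a hybrid RNN + self-attention
# model for binned spiking data, trained by masking input bins.
# All layer sizes, names, and the masking scheme are illustrative assumptions.
import torch
import torch.nn as nn

class HybridRNNAttention(nn.Module):
    def __init__(self, n_neurons, hidden_size=128, n_heads=4):
        super().__init__()
        self.rnn = nn.GRU(n_neurons, hidden_size, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_size, n_heads, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_neurons)  # per-neuron log firing rates

    def forward(self, spikes):
        # spikes: (batch, time, neurons) binned spike counts
        h, _ = self.rnn(spikes)      # autoregressive recurrent features
        a, _ = self.attn(h, h, h)    # self-attention over the full sequence
        return self.readout(h + a)   # predicted log rates per neuron and time bin

def masked_poisson_loss(model, spikes, mask_prob=0.25):
    # Randomly mask some entries, then score the predicted rates against the
    # true counts only at the masked positions (Poisson NLL up to a constant).
    mask = (torch.rand_like(spikes) < mask_prob).float()
    log_rates = model(spikes * (1 - mask))
    nll = torch.exp(log_rates) - spikes * log_rates
    return (nll * mask).sum() / mask.sum().clamp(min=1)
```

Under these assumptions, the recurrent layer supplies the autoregressive bias the abstract argues is critical, while the attention layer adds long-range dependencies; masking lets the same model be scored on held-out neurons and future time bins.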
Keywords
neural population dynamics, motor cortex, neural latents benchmark, multielectrode arrays, spiking data, RNNs, self-attention