Neural Field Convolutions by Repeated Differentiation

ACM Transactions on Graphics (2023)

Abstract
Neural fields are evolving towards a general-purpose continuous representation for visual computing. Yet, despite their numerous appealing properties, they are hardly amenable to signal processing. As a remedy, we present a method to perform general continuous convolutions with general continuous signals such as neural fields. Observing that piecewise polynomial kernels reduce to a sparse set of Dirac deltas after repeated differentiation, we leverage convolution identities and train a repeated integral field to efficiently execute large-scale convolutions. We demonstrate our approach on a variety of data modalities and spatially-varying kernels.
Keywords
Convolution, Geometry Processing, Image Processing, Neural Fields, Signal Processing, Sparsity
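To make the identity behind the abstract concrete, here is a minimal 1D sketch in NumPy (not the paper's neural-field implementation): for a box kernel, which is a degree-0 piecewise polynomial, one differentiation collapses the kernel to two Dirac deltas, so the convolution can be evaluated as a two-tap combination of the signal's first antiderivative (a stand-in for the trained repeated integral field). The function names (`box_filter_via_deltas`, `box_filter_reference`) and the toy signal are illustrative assumptions, not from the paper.

```python
import numpy as np

# Illustration of the identity (f * k) = (I^n f) * (D^n k), where D^n k of a
# piecewise-polynomial kernel collapses to a sparse set of Dirac deltas.
# Here n = 1 and k is a box kernel of width w, so
# D^1 k = (1/w) * [delta(x + w/2) - delta(x - w/2)].

def signal(x):
    """Toy continuous 1D signal standing in for a neural field (assumption)."""
    return np.sin(3.0 * x) + 0.5 * np.cos(7.0 * x)

# Dense grid used for both the antiderivative and the reference convolution.
xs = np.linspace(-2.0, 2.0, 4001)
dx = xs[1] - xs[0]
f = signal(xs)

# First antiderivative I^1 f (a numerical stand-in for the repeated integral field).
F = np.concatenate([[0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * dx)])

w = 0.4  # box-kernel width

def box_filter_via_deltas(x):
    """Evaluate (f * box)(x) as two samples of the integral field."""
    return (np.interp(x + w / 2, xs, F) - np.interp(x - w / 2, xs, F)) / w

def box_filter_reference(x):
    """Brute-force numerical convolution with the box kernel, for comparison."""
    ts = np.linspace(-w / 2, w / 2, 801)
    vals = signal(x - ts)
    return np.sum(0.5 * (vals[1:] + vals[:-1]) * (ts[1] - ts[0])) / w

query = np.linspace(-1.0, 1.0, 9)
fast = box_filter_via_deltas(query)
ref = np.array([box_filter_reference(q) for q in query])
print(np.max(np.abs(fast - ref)))  # agrees up to quadrature error
```

Higher-order piecewise-polynomial kernels (e.g., B-splines) follow the same pattern with more differentiations and correspondingly higher-order integral fields; the cost of the filtered evaluation stays proportional to the number of deltas, independent of the kernel's spatial extent.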