Compressed Sensing using Generative Models.

ICML (2017)

Abstract
The goal of compressed sensing is to estimate a vector from an underdetermined system of noisy linear measurements, by making use of prior knowledge on the structure of vectors in the relevant domain. For almost all results in this literature, the structure is represented by sparsity in a well-chosen basis. We show how to achieve guarantees similar to standard compressed sensing but without employing sparsity at all. Instead, we suppose that vectors lie near the range of a generative model $G: \mathbb{R}^k \to \mathbb{R}^n$. Our main theorem is that, if $G$ is $L$-Lipschitz, then roughly $O(k \log L)$ random Gaussian measurements suffice for an $\ell_2/\ell_2$ recovery guarantee. We demonstrate our results using generative models from published variational autoencoder and generative adversarial network architectures. Our method can use $5$-$10\times$ fewer measurements than Lasso for the same accuracy.
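The recovery approach the abstract describes, estimating the latent code by minimizing the measurement misfit $\|A G(z) - y\|_2^2$, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the one-layer `tanh` decoder is a hypothetical stand-in for a trained VAE/GAN generator, and all dimensions, step size, and noise level are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions: latent k much smaller than signal n; measurements m between them
k, n, m = 5, 100, 25

# Stand-in "generative model" G(z) = tanh(W z): a fixed random one-layer
# decoder (hypothetical; the paper uses trained VAE/GAN decoders)
W = rng.normal(size=(n, k)) / np.sqrt(n)

def G(z):
    return np.tanh(W @ z)

# Ground-truth signal lies in the range of G
z_true = rng.normal(size=k)
x_true = G(z_true)

# Random Gaussian measurements y = A x + noise (underdetermined: m < n)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.normal(size=m)

# Recover by gradient descent on f(z) = ||A G(z) - y||^2 over the latent z
z = np.zeros(k)
lr = 0.02
for _ in range(5000):
    x = np.tanh(W @ z)                      # forward pass through G
    r = A @ x - y                           # measurement residual
    z -= lr * 2 * (W.T @ ((1 - x**2) * (A.T @ r)))  # chain rule through tanh

x_hat = G(z)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative l2 reconstruction error: {rel_err:.3f}")
```

With only m = 25 measurements of an n = 100 dimensional signal, sparsity is never used; the prior is entirely the k = 5 dimensional range of G, matching the abstract's point.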