Example-based brushes for coherent stylized renderings

NPAR 2017

Cited by 7 | Views: 54
Abstract
Painterly stylization is a cornerstone of non-photorealistic rendering. Inspired by the versatility of paint as a physical medium, existing methods target intuitive interfaces that mimic physical brushes, giving artists the ability to place paint strokes directly in a digital scene. Other work focuses on physically simulating the interaction between paint and paper, or on realistically rendering wet and dry paint. In our work, we leverage the versatility of example-based methods, which can generate paint strokes of arbitrary shape and style from a collection of images acquired from physical media. Such approaches have gained popularity because they avoid cumbersome physical simulation and achieve high fidelity without the need for a specific model or rule set. However, existing methods are limited to generating static 2D paintings and cannot be applied to 3D painting and animation, where paint strokes change shape and length as the camera viewport moves. Our method targets this shortcoming by generating temporally coherent example-based paint strokes that accommodate such length and shape changes. We demonstrate the robustness of our method with a 2D painting application that provides immediate feedback to the user, and we show how our brush model can be applied to the screen-space rendering of 3D paintings on a variety of examples.
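
The coherence problem the abstract describes amounts to keeping stamped exemplar textures stable while a stroke's path changes length and shape from frame to frame. The sketch below is not the paper's algorithm; it is a minimal NumPy illustration of one simple ingredient of example-based stroke rendering: placing exemplar stamps at fixed arc-length spacing along a stroke path, so that previously placed stamps do not slide when the path is extended. The function names (resample_path, stamp_stroke) and the procedurally generated round exemplar are hypothetical stand-ins for a scanned brush footprint.

import numpy as np

def resample_path(points, spacing):
    """Resample a 2D polyline at fixed arc-length spacing.

    Tying stamp positions to arc length (rather than to a fixed number of
    samples) keeps existing stamps in place when the stroke grows or shrinks,
    which is one simple way to keep the rendered stroke temporally coherent.
    """
    points = np.asarray(points, dtype=float)
    seg = np.diff(points, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    total = cum[-1]
    if total == 0:
        return points[:1]
    targets = np.arange(0.0, total, spacing)
    resampled = np.empty((len(targets), 2))
    for i, t in enumerate(targets):
        k = np.searchsorted(cum, t, side="right") - 1
        k = min(k, len(seg_len) - 1)
        alpha = (t - cum[k]) / max(seg_len[k], 1e-12)
        resampled[i] = points[k] + alpha * seg[k]
    return resampled

def stamp_stroke(canvas, exemplar, path, spacing=4.0):
    """Alpha-composite a small RGBA brush exemplar at each resampled path point."""
    h, w = exemplar.shape[:2]
    for cx, cy in resample_path(path, spacing):
        x0, y0 = int(round(cx - w / 2)), int(round(cy - h / 2))
        x1, y1 = x0 + w, y0 + h
        if x0 < 0 or y0 < 0 or x1 > canvas.shape[1] or y1 > canvas.shape[0]:
            continue  # for brevity, skip stamps that fall outside the canvas
        patch = canvas[y0:y1, x0:x1]
        alpha = exemplar[..., 3:4]
        patch[...] = alpha * exemplar[..., :3] + (1 - alpha) * patch

if __name__ == "__main__":
    canvas = np.ones((256, 256, 3))
    # Hypothetical exemplar: a soft round RGBA stamp standing in for a scanned brush footprint.
    yy, xx = np.mgrid[-8:9, -8:9]
    soft = np.clip(1 - np.sqrt(xx**2 + yy**2) / 8, 0, 1)
    exemplar = np.dstack([np.full_like(soft, 0.2)] * 3 + [soft * 0.8])
    path = [(40, 200), (120, 120), (200, 180)]
    stamp_stroke(canvas, exemplar, path)

Extending the path with additional points and calling stamp_stroke again reuses the same arc-length positions for the existing portion of the stroke, which is the coherence property the sketch is meant to illustrate; the paper's method additionally handles style transfer from exemplars and screen-space rendering of 3D paintings, which this sketch does not attempt.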