Prototyping interactions with Online Multimodal Repositories and Interactive Machine Learning.

MOCO (2016)

Abstract
Interaction designers often use machine learning tools to generate intuitive mappings between complex inputs and outputs. These tools are usually trained live, which is not always feasible or practical. We combine RepoVizz, an online repository and visualizer for multimodal data, with a suite of Interactive Machine Learning tools to demonstrate a technical solution for prototyping multimodal interactions that decouples the data-acquisition step from the model-training step. In this way, different input data set-ups can easily be replicated, shared, and tested for their capability to control complex outputs, without the need to repeat the technical set-up.
Keywords
multimodal data, interactive machine learning, online repositories
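
The workflow the abstract describes, recording multimodal data once to an online repository and then training a mapping offline, can be illustrated with a minimal sketch. The export URL, CSV column layout, and use of scikit-learn below are illustrative assumptions and not the paper's actual pipeline, which relies on RepoVizz and its own Interactive Machine Learning suite.

# Hedged sketch: fit an input-to-output mapping from repository-hosted recordings,
# rather than from a live sensor capture session.
import io
import requests
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical export endpoint for a previously recorded multimodal datapack.
EXPORT_URL = "https://example.org/repovizz/datapack/export.csv"

csv_bytes = requests.get(EXPORT_URL, timeout=30).content
data = np.genfromtxt(io.BytesIO(csv_bytes), delimiter=",", skip_header=1)

# Assumed column layout: the first 6 columns are input features (e.g. accelerometer
# and EMG streams), the last 2 are the output parameters demonstrated by the designer.
X, y = data[:, :6], data[:, 6:]

# Train the mapping offline; another designer can refit a different model on the
# same shared recording without repeating the sensor set-up.
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)

# At interaction time, live sensor frames are streamed through the trained mapping.
live_frame = X[-1:]                 # placeholder for a real-time sensor reading
print(model.predict(live_frame))    # continuous control values for the output

Because the recording is decoupled from training, the same datapack can be reused to compare different input feature subsets or model types before committing to a physical set-up.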