PRISE: Learning Temporal Action Abstractions as a Sequence Compression Problem
CoRR (2024)
Abstract
Temporal action abstractions, along with belief state representations, are a
powerful knowledge sharing mechanism for sequential decision making. In this
work, we propose a novel view that treats inducing temporal action abstractions
as a sequence compression problem. To do so, we bring a subtle but critical
component of LLM training pipelines – input tokenization via byte pair
encoding (BPE) – to the seemingly distant task of learning skills of variable
time span in continuous control domains. We introduce an approach called
Primitive Sequence Encoding (PRISE) that combines continuous action
quantization with BPE to learn powerful action abstractions. We empirically
show that high-level skills discovered by PRISE from a multitask set of robotic
manipulation demonstrations significantly boost the performance of both
multitask imitation learning and few-shot imitation learning on unseen
tasks. Our code will be released at https://github.com/FrankZheng2022/PRISE.
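The abstract's core idea, quantizing continuous actions into discrete tokens and then running BPE over the resulting token sequences to discover frequently recurring, variable-length "skills", can be sketched in a few lines. The sketch below is illustrative only and is not the paper's implementation: PRISE learns its action codebook, whereas here a hypothetical uniform binning (`quantize`) stands in for the quantizer, and `bpe_merges` is a minimal BPE loop written for this example.

```python
from collections import Counter

def quantize(actions, n_bins=4, low=-1.0, high=1.0):
    """Map each continuous scalar action to a discrete token id by uniform
    binning over [low, high]. A stand-in for PRISE's learned quantizer."""
    width = (high - low) / n_bins
    return [min(int((a - low) / width), n_bins - 1) for a in actions]

def bpe_merges(sequences, num_merges):
    """Run byte pair encoding over token sequences: repeatedly merge the most
    frequent adjacent token pair into a new token. Each merged token spans a
    longer stretch of primitive actions, i.e. a variable-length skill."""
    seqs = [list(s) for s in sequences]
    merges = []
    next_id = max(t for s in seqs for t in s) + 1
    for _ in range(num_merges):
        # Count adjacent pairs across all sequences.
        pairs = Counter()
        for s in seqs:
            for a, b in zip(s, s[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merges.append(((a, b), next_id))
        # Replace every occurrence of the chosen pair with the new token.
        for i, s in enumerate(seqs):
            out, j = [], 0
            while j < len(s):
                if j + 1 < len(s) and s[j] == a and s[j + 1] == b:
                    out.append(next_id)
                    j += 2
                else:
                    out.append(s[j])
                    j += 1
            seqs[i] = out
        next_id += 1
    return merges, seqs
```

For example, two demonstrations tokenized as `[0, 1, 0, 1, 2]` and `[0, 1, 2]` share the pair `(0, 1)`, so the first merge creates a new skill token for it; a second merge then captures the recurring suffix, compressing both sequences.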