Lifelong Learning for Text Steganalysis Based on Chronological Task Sequence.

Juan Wen, Yaqian Deng, Jiaxuan Wu, Xingpeng Liu, Yiming Xue

IEEE Signal Process. Lett. (2022)

Abstract
Prevailing text steganalysis models handle only one domain of steganographic text: when they learn steganographic text from other domains, they forget previously learned features, and performance on the earlier domain degrades. We construct a chronological task sequence and draw on ideas from lifelong learning to propose a novel steganalysis method. We use BERT to extract the common features shared between the current task and previous tasks. We then extract explicit and latent features under two learning scenarios, next-sentence prediction and task-ID classification, constrained by regularization so that the feature space does not change too much when switching tasks. We also design a replay mechanism and three loss functions to trade off the previous and current tasks. Furthermore, we evaluate our model under hybrid steganography, which detects a mixture of texts generated by different steganography algorithms with different embedding capacities. Extensive experiments show that our model mitigates catastrophic forgetting and outperforms state-of-the-art models on continual text steganalysis tasks.
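The abstract mentions a replay mechanism that balances previous and current tasks. The paper does not specify its implementation here; a common way to realize replay in continual learning is a fixed-size memory filled by reservoir sampling, from which past-task examples are mixed into each training batch. The sketch below illustrates that generic idea only; the class name and its behavior are assumptions, not the authors' method.

```python
import random


class ReplayBuffer:
    """Fixed-size memory of past-task examples, maintained by reservoir
    sampling so that every example seen so far has an equal chance of
    being retained. (Generic continual-learning sketch, not the paper's
    exact mechanism.)"""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0  # total number of examples offered to the buffer
        self.rng = random.Random(seed)

    def add(self, example):
        """Offer one (text, label) example; keep it with probability
        capacity / seen once the buffer is full."""
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        """Draw up to k stored past-task examples to mix into the
        current task's minibatch."""
        k = min(k, len(self.buffer))
        return self.rng.sample(self.buffer, k)
```

During training on task t, each minibatch of current-task data would be augmented with `buffer.sample(k)`, so the loss is computed over both current and replayed past-task examples, which is one standard way to trade off old and new tasks.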
Keywords
text steganalysis, learning