Language representations in L2 learners: Toward neural models

Proceedings of the Annual Meeting of the Cognitive Science Society (2021)

Abstract
Author(s): Tang, Zixin; Putnam, Michael; Reitter, David T | Abstract: We investigated how the first-language (L1) background of bilinguals influences the representation and use of their second language (L2), using computational models. Drawing on the essay section of The International Corpus Network of Asian Learners of English (ICNALE), we compared variables indicating syntactic complexity in learners' L2 production as predictors of their L1. We then trained neural language models based on BERT to predict the L1 of these English learners. Results showed a systematic influence of L1 syntactic properties on English learners' L2 production, further supporting the integration of syntactic knowledge across languages in bilingual speakers. Results also showed that neural models can learn to represent and detect such L1 effects, while multilingually trained models show no advantage in doing so.
Keywords
L2 learners, language representations
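To illustrate the kind of setup the abstract describes, below is a minimal sketch of fine-tuning a BERT classifier to predict a learner's L1 from an L2 English essay. This is not the authors' exact pipeline: the Hugging Face `transformers` interface, the label set, the toy essays, and all hyperparameters are assumptions for illustration; actual ICNALE essays and L1 labels would need to be loaded and preprocessed separately.

```python
# Sketch only: BERT fine-tuned for L1 prediction (native-language identification)
# from L2 English text. Data, labels, and hyperparameters are hypothetical.
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertTokenizerFast, BertForSequenceClassification

# Hypothetical L1 label set and placeholder essays standing in for ICNALE texts.
L1_LABELS = ["Chinese", "Japanese", "Korean", "Thai"]
essays = [
    "I am agree with this statement because part-time job give experience.",
    "Smoking in restaurant should be banned because it is bad for other customer.",
]
labels = [0, 1]  # indices into L1_LABELS

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(L1_LABELS))

# Tokenize essays into fixed-length tensors and wrap them in a DataLoader.
enc = tokenizer(essays, truncation=True, padding=True, max_length=256,
                return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"],
                        torch.tensor(labels))
loader = DataLoader(dataset, batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a few fine-tuning epochs
    for input_ids, attention_mask, y in loader:
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Predict the L1 label for a new essay.
model.eval()
with torch.no_grad():
    test = tokenizer("This essay is write by a second language learner.",
                     return_tensors="pt")
    pred = model(**test).logits.argmax(dim=-1).item()
print("Predicted L1:", L1_LABELS[pred])
```

A multilingual variant of the comparison reported in the abstract could swap in a multilingually pretrained checkpoint (e.g. `bert-base-multilingual-cased`) while keeping the rest of the training loop unchanged.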