
Quan Hung Tran
Bio
In my previous research, I focused on language sequence modelling using Recurrent Neural Networks, incorporating hierarchical representations, gated attention, uncertainty propagation, stacked residual learning, and context-dependent and structure-dependent models to improve the precision, efficiency, and interpretability of current RNN architectures. At the moment, I am particularly interested in efficient and accurate models for text processing in low-to-medium resource scenarios, with applications to dialog systems and sequence modelling.
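One way to read the "gated attention" mentioned above is as an attention-pooled summary of the RNN's hidden states blended with its final state through a learned gate. The sketch below illustrates that idea in PyTorch; it is a hedged assumption about the general technique, not the model from any of the papers listed here, and all names (GatedAttentionRNN, attn_score, gate, the dimension defaults) are hypothetical.

```python
import torch
import torch.nn as nn

class GatedAttentionRNN(nn.Module):
    """Minimal illustrative sketch: attention over GRU states, gated
    against the final hidden state. Not the author's actual model."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.attn_score = nn.Linear(hidden_dim, 1)         # per-step attention logit
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)  # mixes context and final state

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer ids
        states, last = self.rnn(self.embed(tokens))            # states: (batch, seq, hidden)
        weights = torch.softmax(self.attn_score(states), dim=1)  # (batch, seq, 1)
        context = (weights * states).sum(dim=1)                # attention-pooled summary
        h = last.squeeze(0)                                    # final hidden state
        g = torch.sigmoid(self.gate(torch.cat([context, h], dim=-1)))
        return g * context + (1 - g) * h                       # gated blend

# Usage: pooled = GatedAttentionRNN(vocab_size=10000)(torch.randint(0, 10000, (4, 20)))
```

The gate lets the model fall back on the plain recurrent summary when attention is uninformative, which is one common motivation for gating attention in RNN sequence models.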
Papers (30)
Australian Economic Papers, (2020)
EMNLP, pp.4543-4548, (2020)
COLING, pp.649-656, (2020)
Xuanli He, Quan Hung Tran, Gholamreza Haffari, Walter Chang, Trung Bui, Zhe Lin, Franck Dernoncourt, Nhan Dam
EMNLP, pp.972-990, (2020)
ECCV, pp.89-106, (2020)
CVPR, pp.3437-3447, (2020)
International Journal of Organizational Analysis, (2020)
COLING, pp.3285-3301, (2020)
(2019)
ICASSP, pp.8034-8038, (2019)
ALTA, pp.94-99, (2019)
Public Organization Review, pp.1-14, (2019)
Journal of Services Marketing, (2019)
EMNLP/IJCNLP (1), pp.5952-5958, (2019)
NAACL-HLT, pp.1274-1283, (2018)
arXiv: Computation and Language, (2018)
Public Organization Review, no. 2, pp.1-16, (2018)
IJCNLP, (2017)
ACL, pp.524-529, (2017)
LaCATODA@IJCAI, pp.20-27, (2017)