Bio
In my previous research, I focused on language sequence modelling with Recurrent Neural Networks, incorporating hierarchical representations, gated attention, uncertainty propagation, stacked residual learning, and context- and structure-dependent models to improve the precision, efficiency, and interpretability of current RNN architectures. At the moment, I am particularly interested in efficient and accurate models for text processing in low-to-medium-resource scenarios, with applications to dialog systems and sequence modelling.