Joint Deep Modeling of Users and Items Using Reviews for Recommendation. WSDM (2017): 425–433
A large amount of information exists in reviews written by users. This source of information has been ignored by most current recommender systems, although it can potentially alleviate the sparsity problem and improve the quality of recommendations. In this paper, we present a deep model to learn item properties and user behaviors jointly from review text.
- The variety and number of products and services provided by companies have increased dramatically during the last decade.
- Companies produce a large number of products to meet the needs of customers.
- This gives customers a wider range of options to choose from.
- Many of the most successful CF techniques are based on matrix factorization.
- They find common factors that can be the underlying reasons for the ratings given by users.
- Matrix factorization techniques uncover these hidden factors and determine their importance for each user and how well each item satisfies each factor.
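The latent-factor idea behind these techniques can be sketched as a dot product between a user's factor vector and an item's factor vector. A minimal NumPy illustration, where the factor values are hypothetical stand-ins for what a model would learn from the rating matrix:

```python
import numpy as np

# Hypothetical learned latent factors (k = 3 hidden factors).
# Each row of P is a user's affinity for the hidden factors;
# each row of Q is how strongly an item exhibits those factors.
P = np.array([[0.9, 0.1, 0.3],    # user 0
              [0.2, 0.8, 0.5]])   # user 1
Q = np.array([[1.0, 0.0, 0.2],    # item 0
              [0.1, 0.9, 0.4]])   # item 1

def predict(u, i):
    """Predicted rating: inner product of user and item factor vectors."""
    return P[u] @ Q[i]

R_hat = P @ Q.T  # full predicted rating matrix, one entry per (user, item)
print(R_hat)
```

Training such a model amounts to choosing `P` and `Q` so that `R_hat` matches the observed ratings; the factors themselves are not given but inferred.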
- We propose a Neural Network (NN) based model, named Deep Cooperative Neural Networks (DeepCoNN), to model users and items jointly using review text for rating prediction problems
- In Figure 2, we show the performance of DeepCoNN on the validation set of Yelp with varying |xu| and |yi| from 5 to 100 and n1 from 10 to 400 to investigate its sensitivity
- It is shown that reviews written by users can reveal information about customers' buying and rating behavior, and that reviews written for items may contain information about their features and properties
- We presented Deep Cooperative Neural Networks (DeepCoNN), which exploit the information in reviews for recommender systems
- The proposed model attains an 8.3% improvement across all three datasets
- DeepCoNN models user behaviors and item properties using reviews
- It learns hidden latent factors for users and items by exploiting review text such that the learned factors can estimate the ratings given by users.
- This is done with a CNN-based model consisting of two parallel neural networks coupled by a shared layer at the top.
- DeepCoNN consists of two deep neural networks coupled together by a shared common layer to model users and items from the reviews.
- The shared layer maps the user and item representations into a common feature space.
- In comparison with state-of-the-art baselines, DeepCoNN achieved 8.5% and 7.6% improvements on datasets of Yelp and Beer, respectively.
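The two-tower design described above can be sketched as a forward pass: each tower runs a convolution over the word embeddings of one side's review text, max-pools over time, and projects through a fully connected layer; the two outputs are then combined in a shared layer. This is a minimal NumPy sketch with made-up sizes and random weights; the simple linear scoring at the end is a simplification standing in for the factorization-machine layer the paper uses on top:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not the paper's tuned values): embedding dim d,
# n1 convolutional filters with word-window size t, fc output units.
d, n1, t, fc = 8, 4, 3, 5

def make_tower():
    """One tower: convolution over word embeddings -> max-pool -> FC layer."""
    return {
        "conv_W": rng.standard_normal((n1, t * d)) * 0.1,
        "conv_b": np.zeros(n1),
        "fc_W": rng.standard_normal((fc, n1)) * 0.1,
        "fc_b": np.zeros(fc),
    }

def tower_forward(params, emb):
    """emb: (T, d) word-embedding matrix of one user's (or item's) reviews."""
    T = emb.shape[0]
    # Slide a window of t consecutive words, flatten, project to n1 feature maps.
    windows = np.stack([emb[j:j + t].ravel() for j in range(T - t + 1)])
    feats = np.maximum(0, windows @ params["conv_W"].T + params["conv_b"])  # ReLU
    pooled = feats.max(axis=0)  # max-pooling over time
    return np.maximum(0, params["fc_W"] @ pooled + params["fc_b"])

user_net, item_net = make_tower(), make_tower()

# Toy inputs standing in for the user's and the item's aggregated review text.
user_emb = rng.standard_normal((20, d))
item_emb = rng.standard_normal((30, d))

x_u = tower_forward(user_net, user_emb)   # user latent representation
y_i = tower_forward(item_net, item_emb)   # item latent representation

# Shared layer: concatenate both representations and score them jointly,
# putting users and items in a common feature space.
z = np.concatenate([x_u, y_i])
w = rng.standard_normal(z.size) * 0.1
rating_hat = float(w @ z)
print(rating_hat)
```

In the actual model both towers and the shared layer are trained jointly by minimizing the rating-prediction error, so the review-derived factors are shaped directly by the ratings.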
- Table 1: Notations
- Table 2: Statistics of the datasets
- Table 3: MSE comparison with baselines. Best results are indicated in bold
- Table 4: Comparison of variants of the proposed model. Best results are indicated in bold
- There are two categories of studies related to our work: techniques that model users and/or items by exploiting the information in online review text, and deep learning techniques employed for recommender systems. In this section, we give a short review of these two research areas and distinguish our work from the existing approaches.
The first studies that used online review text in rating prediction tasks were mostly focused on predicting ratings for an existing review [2, 38], while in our paper, we predict the ratings from the history of review text written by a user to recommend desirable products to that user.
One of the pioneering works exploring the use of reviews to improve rating prediction is presented in . It found that reviews usually relate to different aspects, e.g., price, service, or positive and negative feelings, that can be exploited for rating prediction. In , the authors proposed Hidden Factors as Topics (HFT), which employs topic modeling techniques to discover latent aspects from either item or user reviews. This method achieves a significant improvement over models that use only ratings or reviews. A similar approach is followed in , with the main difference that it models users' and items' reviews simultaneously. In , a probabilistic model based on collaborative filtering and topic modeling is proposed. It uncovers aspects and sentiments of users and items, but it does not incorporate ratings when modeling reviews. Ratings Meet Reviews (RMR)  also tries to harness the information of both ratings and reviews. One difference between HFT and RMR is that RMR applies topic modeling techniques to item review text and aligns the topics with the rating dimensions to improve prediction accuracy.
- This work is supported in part by NSF through grants IIS-1526499 and CNS-1626432
- We gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan X GPU used for this research
- A. Almahairi, K. Kastner, K. Cho, and A. Courville. Learning distributed representations from reviews for collaborative filtering. In Proceedings of the 9th ACM Conference on Recommender Systems, pages 147–154. ACM, 2015.
- S. Baccianella, A. Esuli, and F. Sebastiani. Multi-facet rating of product reviews. In Advances in Information Retrieval, pages 461–472.
- Y. Bao, H. Fang, and J. Zhang. Topicmf: Simultaneously exploiting ratings and reviews for recommendation. In AAAI, pages 2–8. AAAI Press, 2014.
- Y. Bengio, H. Schwenk, J.-S. Senecal, F. Morin, and J.-L. Gauvain. Neural probabilistic language models. In Innovations in Machine Learning, pages 137–186.
- D. M. Blei, A. Y. Ng, and M. I. Jordan. Latent dirichlet allocation. Journal of Machine Learning Research, 3:993–1022, 2003.
- L. Chen, G. Chen, and F. Wang. Recommender systems based on user reviews: the state of the art. User Modeling and User-Adapted Interaction, 25(2):99–154, 2015.
- R. Collobert, J. Weston, L. Bottou, M. Karlen, K. Kavukcuoglu, and P. Kuksa. Natural language processing (almost) from scratch. The Journal of Machine Learning Research, 12:2493–2537, 2011.
- Q. Diao, M. Qiu, C. Wu, A. J. Smola, J. Jiang, and C. Wang. Jointly modeling aspects, ratings and sentiments for movie recommendation (JMARS). In KDD, pages 193–202. ACM, 2014.
- A. M. Elkahky, Y. Song, and X. He. A multi-view deep learning approach for cross domain user modeling in recommendation systems. In Proceedings of the 24th International Conference on World Wide Web, pages 278–288. International World Wide Web Conferences Steering Committee, 2015.
- N. Jakob, S. H. Weber, M. C. Muller, and I. Gurevych. Beyond the stars: exploiting free-text user reviews to improve the accuracy of movie recommendations. In Proceedings of the 1st international CIKM workshop on Topic-sentiment analysis for mass opinion, pages 57–64. ACM, 2009.
- R. Johnson and T. Zhang. Effective use of word order for text categorization with convolutional neural networks. In HLT-NAACL, pages 103–112. The Association for Computational Linguistics, 2015.
- Y. Kim. Convolutional neural networks for sentence classification. arXiv preprint arXiv:1408.5882, 2014.
- Y. Koren, R. Bell, and C. Volinsky. Matrix factorization techniques for recommender systems. Computer, 42(8):30–37, 2009.
- A. Krizhevsky, I. Sutskever, and G. E. Hinton. Imagenet classification with deep convolutional neural networks. In Advances in neural information processing systems, pages 1097–1105, 2012.
- S. Li, J. Kawale, and Y. Fu. Deep collaborative filtering via marginalized denoising auto-encoder. In Proceedings of the 24th ACM International on Conference on Information and Knowledge Management, pages 811–820. ACM, 2015.
- G. Ling, M. R. Lyu, and I. King. Ratings meet reviews, a combined approach to recommend. In Proceedings of the 8th ACM Conference on Recommender systems, pages 105–112. ACM, 2014.
- J. McAuley and J. Leskovec. Hidden factors and hidden topics: understanding rating dimensions with review text. In Proceedings of the 7th ACM conference on Recommender systems, pages 165–172. ACM, 2013.
- J. McAuley, J. Leskovec, and D. Jurafsky. Learning attitudes and attributes from multi-aspect reviews. In 2012 IEEE 12th International Conference on Data Mining (ICDM), pages 1020–1025. IEEE, 2012.
- J. McAuley, R. Pandey, and J. Leskovec. Inferring networks of substitutable and complementary products. In Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 785–794. ACM, 2015.
- T. Mikolov, M. Karafiát, L. Burget, J. Černocký, and S. Khudanpur. Recurrent neural network based language model. In INTERSPEECH, pages 1045–1048, 2010.
- T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado, and J. Dean. Distributed representations of words and phrases and their compositionality. In Advances in neural information processing systems, pages 3111–3119, 2013.
- V. Nair and G. E. Hinton. Rectified linear units improve restricted boltzmann machines. In Proceedings of the 27th International Conference on Machine Learning (ICML-10), pages 807–814, 2010.
- R. Pan, Y. Zhou, B. Cao, N. N. Liu, R. Lukose, M. Scholz, and Q. Yang. One-class collaborative filtering. In 2008 Eighth IEEE International Conference on Data Mining (ICDM'08), pages 502–511. IEEE, 2008.
- S. Rendle. Factorization machines with libfm. ACM Transactions on Intelligent Systems and Technology (TIST), 3(3):57, 2012.
- R. Salakhutdinov and A. Mnih. Probabilistic matrix factorization. In NIPS, pages 1257–1264. Curran Associates, Inc., 2007.
- R. Salakhutdinov, A. Mnih, and G. Hinton. Restricted boltzmann machines for collaborative filtering. In Proceedings of the 24th international conference on Machine learning, pages 791–798. ACM, 2007.
- A. I. Schein, A. Popescul, L. H. Ungar, and D. M. Pennock. Methods and metrics for cold-start recommendations. In Proceedings of the 25th annual international ACM SIGIR conference on Research and development in information retrieval, pages 253–260. ACM, 2002.
- N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov. Dropout: A simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1):1929–1958, 2014.
- Theano Development Team. Theano: A Python framework for fast computation of mathematical expressions. arXiv e-prints, abs/1605.02688, May 2016.
- T. Tieleman and G. Hinton. Lecture 6.5-rmsprop: Divide the gradient by a running average of its recent magnitude. COURSERA: Neural Networks for Machine Learning, 4:2, 2012.
- A. Van den Oord, S. Dieleman, and B. Schrauwen. Deep content-based music recommendation. In Advances in Neural Information Processing Systems, pages 2643–2651, 2013.
- H. M. Wallach. Topic modeling: beyond bag-of-words. In Proceedings of the 23rd international conference on Machine learning, pages 977–984. ACM, 2006.
- C. Wang and D. M. Blei. Collaborative topic modeling for recommending scientific articles. In Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 448–456. ACM, 2011.
- H. Wang, Y. Lu, and C. Zhai. Latent aspect rating analysis on review text data: a rating regression approach. In Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 783–792. ACM, 2010.
- H. Wang, N. Wang, and D.-Y. Yeung. Collaborative deep learning for recommender systems. In Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 1235–1244. ACM, 2015.
- X. Wang and Y. Wang. Improving content-based and hybrid music recommendation using deep learning. In Proceedings of the ACM International Conference on Multimedia, pages 627–636. ACM, 2014.
- Y. Wu, C. DuBois, A. X. Zheng, and M. Ester. Collaborative denoising auto-encoders for top-n recommender systems.
- Y. Wu and M. Ester. FLAME: A probabilistic model combining aspect based opinion mining and collaborative filtering. In WSDM, pages 199–208. ACM, 2015.