[1] M. Soleymani, D. Garcia, B. Jou, B. Schuller, S.-F. Chang, and M. Pantic, ‘A survey of multimodal sentiment analysis’, Image and Vision Computing, vol. 65, pp. 3–14, Sep. 2017, doi: 10.1016/j.imavis.2017.08.003.
[2] V. S. Pagolu, K. N. R. Challa, G. Panda, and B. Majhi, ‘Sentiment Analysis of Twitter Data for Predicting Stock Market Movements’, arXiv:1610.09225 [cs], Oct. 2016, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1610.09225.
[3] S. Mohammad, S. Kiritchenko, and X. Zhu, ‘NRC-Canada: Building the State-of-the-Art in Sentiment Analysis of Tweets’, in Second Joint Conference on Lexical and Computational Semantics (*SEM), Volume 2: Proceedings of the Seventh International Workshop on Semantic Evaluation (SemEval 2013), Atlanta, Georgia, USA, Jun. 2013, pp. 321–327.
[4] A. Ratnaparkhi, ‘A Maximum Entropy Model for Part-Of-Speech Tagging’, 1996, Accessed: Aug. 09, 2020. [Online]. Available: https://www.aclweb.org/anthology/W96-0213.
[5] T. Mullen and N. Collier, ‘Sentiment Analysis using Support Vector Machines with Diverse Information Sources’, in Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing, Barcelona, Spain, Jul. 2004, pp. 412–418, Accessed: Aug. 09, 2020. [Online]. Available: https://www.aclweb.org/anthology/W04-3253.
[6] Y. Kim, ‘Convolutional Neural Networks for Sentence Classification’, arXiv:1408.5882 [cs], Sep. 2014, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1408.5882.
[7] K. Greff, R. K. Srivastava, J. Koutnik, B. R. Steunebrink, and J. Schmidhuber, ‘LSTM: A Search Space Odyssey’, IEEE Trans. Neural Netw. Learning Syst., vol. 28, no. 10, pp. 2222–2232, Oct. 2017, doi: 10.1109/TNNLS.2016.2582924.
[8] Szu-Hung Wang, ‘Sentiment-Guided Attention Mechanism for Sentiment Analysis’. National Taiwan University Graduate Institute of Networking and Multimedia, Jan. 01, 2019.
[9] A. Vaswani et al., ‘Attention Is All You Need’, arXiv:1706.03762 [cs], Dec. 2017, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1706.03762.
[10] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, ‘BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding’, arXiv:1810.04805 [cs], May 2019, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1810.04805.
[11] S. J. Pan and Q. Yang, ‘A Survey on Transfer Learning’, IEEE Trans. Knowl. Data Eng., vol. 22, no. 10, pp. 1345–1359, Oct. 2010, doi: 10.1109/TKDE.2009.191.
[12] M. Hu and B. Liu, ‘Mining and Summarizing Customer Reviews’, p. 10.
[13] J. Barnes, R. Klinger, and S. S. im Walde, ‘Assessing State-of-the-Art Sentiment Models on State-of-the-Art Sentiment Datasets’, arXiv:1709.04219 [cs], Sep. 2017, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1709.04219.
[14] B. Felbo, A. Mislove, A. Søgaard, I. Rahwan, and S. Lehmann, ‘Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm’, in Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pp. 1615–1625, 2017, doi: 10.18653/v1/D17-1169.
[15] D. Bahdanau, K. Cho, and Y. Bengio, ‘Neural Machine Translation by Jointly Learning to Align and Translate’, arXiv:1409.0473 [cs, stat], May 2016, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1409.0473.
[16] I. Sutskever, O. Vinyals, and Q. V. Le, ‘Sequence to Sequence Learning with Neural Networks’, in Advances in Neural Information Processing Systems 27, Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, and K. Q. Weinberger, Eds. Curran Associates, Inc., 2014, pp. 3104–3112.
[17] B. Shin, T. Lee, and J. D. Choi, ‘Lexicon Integrated CNN Models with Attention for Sentiment Analysis’, arXiv:1610.06272 [cs], Aug. 2017, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1610.06272.
[18] J. Yoon and H. Kim, ‘Multi-Channel Lexicon Integrated CNN-BiLSTM Models for Sentiment Analysis’, p. 10.
[19] R. Socher et al., ‘Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank’, in Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, Seattle, Washington, USA, Oct. 2013, pp. 1631–1642, Accessed: Aug. 09, 2020. [Online]. Available: https://www.aclweb.org/anthology/D13-1170.
[20] T. Wolf et al., ‘HuggingFace’s Transformers: State-of-the-art Natural Language Processing’, arXiv:1910.03771 [cs], Jul. 2020, Accessed: Aug. 09, 2020. [Online]. Available: http://arxiv.org/abs/1910.03771.
[21] A. Paszke et al., ‘PyTorch: An Imperative Style, High-Performance Deep Learning Library’, in Advances in Neural Information Processing Systems 32, H. Wallach, H. Larochelle, A. Beygelzimer, F. d’Alché-Buc, E. Fox, and R. Garnett, Eds. Curran Associates, Inc., 2019, pp. 8026–8037.
[22] M. Munikar, S. Shakya, and A. Shrestha, ‘Fine-grained Sentiment Classification using BERT’, in 2019 Artificial Intelligence for Transforming Business and Society (AITB), Kathmandu, Nepal, Nov. 2019, pp. 1–5, doi: 10.1109/AITB48515.2019.8947435.