Experimental Study of Transformer Language Models for Question Answering in Russian-Language Text
- Authors: Galeev D.T., Panishchev V.S.
- Affiliations: Southwest State University (SWSU)
- Issue: Vol 21, No 3 (2022)
- Pages: 521-542
- Section: Artificial intelligence, knowledge and data engineering
- URL: https://journals.rcsi.science/2713-3192/article/view/266351
- DOI: https://doi.org/10.15622/ia.21.3.3
- ID: 266351
About the authors
D. T. Galeev
Southwest State University (SWSU)
Email: ra3wvw@mail.ru
50 let Oktyabrya St. 94
V. S. Panishchev
Southwest State University (SWSU)
Email: gskunk@yandex.ru
50 let Oktyabrya St. 94
