The Conceptual Modeling System Based on the Metagraph Approach

Abstract

The article presents an approach to building a conceptual modeling system that covers both parsing text into a conceptual structure and generating text from a conceptual structure. A metagraph is used as the conceptual structure. The architecture of the conceptual modeling system is proposed, the metagraph model is considered as the data model for conceptual modeling, and the main ideas behind the text parsing module and the text generation module are described.
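The abstract itself gives no formal definitions, but in the authors' earlier work the metagraph is defined as MG = (V, MV, E), where V is a set of vertices, MV a set of metavertices, and E a set of edges, and a metavertex may itself contain vertices and edges (the emergence principle). Below is a minimal, purely illustrative Python sketch of such a data model; all class and method names are hypothetical and are not taken from the article:

    from dataclasses import dataclass, field


    @dataclass(frozen=True)
    class Vertex:
        # Ordinary metagraph vertex with an optional attribute set.
        name: str
        attributes: frozenset = frozenset()


    @dataclass(frozen=True)
    class Edge:
        # Directed edge between two elements, referenced by name.
        source: str
        target: str
        label: str = ""


    @dataclass
    class Metavertex:
        # A metavertex may contain vertices and edges, so the wrapped
        # fragment behaves as a single higher-level concept.
        name: str
        vertex_names: set = field(default_factory=set)
        edges: list = field(default_factory=list)


    class Metagraph:
        # MG = (V, MV, E): flat registries of vertices, metavertices, edges.
        def __init__(self):
            self.vertices = {}
            self.metavertices = {}
            self.edges = []

        def add_vertex(self, name, attributes=()):
            self.vertices[name] = Vertex(name, frozenset(attributes))

        def add_edge(self, source, target, label=""):
            edge = Edge(source, target, label)
            self.edges.append(edge)
            return edge

        def add_metavertex(self, name, vertex_names=(), edges=()):
            self.metavertices[name] = Metavertex(name, set(vertex_names),
                                                 list(edges))


    # Hypothetical usage: a structure for "the company sells equipment".
    mg = Metagraph()
    mg.add_vertex("company")
    mg.add_vertex("equipment")
    sells = mg.add_edge("company", "equipment", label="sells")
    mg.add_metavertex("sale_situation", {"company", "equipment"}, [sells])

The metavertex is what distinguishes this model from an ordinary graph: under the assumptions above, a text parsing module could wrap a recognized situation into a metavertex, and a text generation module could unfold it back into a sentence.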

About the authors

N. D. Todosiev

Bauman Moscow State Technical University

Author for correspondence.
Email: todosievnik@gmail.com

Graduate student

Russian Federation, Moscow

V. I. Yankovsky

Bauman Moscow State Technical University

Email: lucker1005000@gmail.com

Graduate student

Russian Federation, Moscow

Yu. E. Gapanyuk

Bauman Moscow State Technical University

Email: gapyu@bmstu.ru

Associate Professor

Russian Federation, Moscow

A. M. Andreev

Bauman Moscow State Technical University

Email: arkandreev@gmail.com

Associate Professor

Russian Federation, Moscow

