<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.2 20190208//EN" "JATS-journalpublishing1.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ali="http://www.niso.org/schemas/ali/1.0/" article-type="research-article" dtd-version="1.2" xml:lang="en"><front><journal-meta><journal-id journal-id-type="publisher-id">ARTIFICIAL INTELLIGENCE AND DECISION MAKING</journal-id><journal-title-group><journal-title xml:lang="en">ARTIFICIAL INTELLIGENCE AND DECISION MAKING</journal-title><trans-title-group xml:lang="ru"><trans-title>Искусственный интеллект и принятие решений</trans-title></trans-title-group></journal-title-group><issn publication-format="print">2071-8594</issn></journal-meta><article-meta><article-id pub-id-type="publisher-id">269436</article-id><article-id pub-id-type="doi">10.14357/20718594230207</article-id><article-categories><subj-group subj-group-type="toc-heading" xml:lang="en"><subject>Machine Learning, Neural Networks</subject></subj-group><subj-group subj-group-type="toc-heading" xml:lang="ru"><subject>Машинное обучение, нейронные сети</subject></subj-group><subj-group subj-group-type="article-type"><subject>Research Article</subject></subj-group></article-categories><title-group><article-title xml:lang="en">Parabola as an Activation Function of Artificial Neural Networks</article-title><trans-title-group xml:lang="ru"><trans-title>Парабола как функция активации искусственных нейронных сетей</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author"><name-alternatives><name xml:lang="en"><surname>Khachumov</surname><given-names>Mikhail V.</given-names></name><name xml:lang="ru"><surname>Хачумов</surname><given-names>Михаил Вячеславович</given-names></name></name-alternatives><address><country country="RU">Russian Federation</country></address><bio xml:lang="en"><p>Candidate of Physical and Mathematical Sciences, Senior Researcher; Senior Researcher; Associate Professor</p></bio><bio 
xml:lang="ru"><p>кандидат физико-математических наук, старший научный сотрудник; старший научный сотрудник; доцент</p></bio><email>khmike@inbox.ru</email><xref ref-type="aff" rid="aff1"/><xref ref-type="aff" rid="aff2"/><xref ref-type="aff" rid="aff3"/></contrib><contrib contrib-type="author"><name-alternatives><name xml:lang="en"><surname>Emelyanova</surname><given-names>Yulia G.</given-names></name><name xml:lang="ru"><surname>Емельянова</surname><given-names>Юлия Геннадиевна</given-names></name></name-alternatives><address><country country="RU">Russian Federation</country></address><bio xml:lang="en"><p>Candidate of Technical Sciences, Researcher</p></bio><bio xml:lang="ru"><p>кандидат технических наук, научный сотрудник</p></bio><email>yuliya.emelyanowa2015@yandex.ru</email><xref ref-type="aff" rid="aff1"/></contrib></contrib-group><aff-alternatives id="aff1"><aff><institution xml:lang="en">Ailamazyan Program Systems Institute of the Russian Academy of Sciences</institution></aff><aff><institution xml:lang="ru">Институт программных систем им. А. К. 
Айламазяна РАН</institution></aff></aff-alternatives><aff-alternatives id="aff2"><aff><institution xml:lang="en">Computer Science and Control Federal Research Center of the Russian Academy of Sciences</institution></aff><aff><institution xml:lang="ru">Федеральный исследовательский центр «Информатика и управление» РАН</institution></aff></aff-alternatives><aff-alternatives id="aff3"><aff><institution xml:lang="en">Peoples' Friendship University of Russia</institution></aff><aff><institution xml:lang="ru">Российский университет дружбы народов</institution></aff></aff-alternatives><pub-date date-type="pub" iso-8601-date="2023-04-15" publication-format="electronic"><day>15</day><month>04</month><year>2023</year></pub-date><issue>2</issue><issue-title xml:lang="en"/><issue-title xml:lang="ru"/><fpage>89</fpage><lpage>97</lpage><history><date date-type="received" iso-8601-date="2024-11-11"><day>11</day><month>11</month><year>2024</year></date><date date-type="accepted" iso-8601-date="2024-11-11"><day>11</day><month>11</month><year>2024</year></date></history><permissions><copyright-statement xml:lang="en">Copyright © 2023, ФИЦ ИУ РАН</copyright-statement><copyright-statement xml:lang="ru">Copyright © 2023, ФИЦ ИУ РАН</copyright-statement><copyright-year>2023</copyright-year><copyright-holder xml:lang="en">ФИЦ ИУ РАН</copyright-holder></permissions><self-uri xlink:href="https://journals.rcsi.science/2071-8594/article/view/269436">https://journals.rcsi.science/2071-8594/article/view/269436</self-uri><abstract xml:lang="en"><p>The use of a parabola and its branches as a nonlinearity that expands the logical capabilities of artificial neurons is considered. In particular, the applicability of parabola branches for constructing an s-shaped activation function suitable for training a neural network by the error backpropagation method is established. The implementation of the XOR function on two and three neurons using the proposed approach is demonstrated.
The main advantage of the parabola over the sigmoid is its simpler implementation, which speeds up the operation of artificial neural networks.</p></abstract><trans-abstract xml:lang="ru"><p>Рассматриваются вопросы применения параболы и ее ветвей как нелинейности, расширяющей логические возможности искусственных нейронов. В частности, обусловлена применимость ветвей параболы для построения s-образной функции активации, пригодной для настройки нейронной сети методом обратного распространения ошибки. Продемонстрирована реализация функции XOR на двух и трех нейронах с применением предложенного подхода. Основное преимущество параболы перед сигмоидом – более простая реализация, что ускоряет работу искусственных нейронных сетей.</p></trans-abstract><kwd-group xml:lang="en"><kwd>sigmoid</kwd><kwd>parabola</kwd><kwd>s-shaped activation function</kwd><kwd>neuron</kwd><kwd>neural network</kwd><kwd>XOR problem</kwd><kwd>tuning rate</kwd></kwd-group><kwd-group xml:lang="ru"><kwd>сигмоид</kwd><kwd>парабола</kwd><kwd>s-образная функция активации</kwd><kwd>нейрон</kwd><kwd>нейронная сеть</kwd><kwd>XOR</kwd><kwd>скорость настройки</kwd></kwd-group><funding-group><funding-statement xml:lang="en">The research was supported by Russian Science Foundation grant No. 21-71-10056.</funding-statement><funding-statement xml:lang="ru">Исследование выполнено за счет гранта Российского научного фонда № 21-71-10056</funding-statement></funding-group></article-meta></front><body></body><back><ref-list><ref id="B1"><label>1.</label><citation-alternatives><mixed-citation xml:lang="en">Zakharov A.V., Khachumov V.M. Bit-parallel Representation of Activation Functions for Fast Neural Networks // Proceedings of the 7th International Conference on Pattern Recognition and Image Analysis. 2014. V. 2. P. 568-571.</mixed-citation><mixed-citation xml:lang="ru">Zakharov A.V., Khachumov V.M. 
Bit-parallel Representation of Activation Functions for Fast Neural Networks // Proceedings of the 7th International Conference on Pattern Recognition and Image Analysis. 2014. V. 2. P. 568-571.</mixed-citation></citation-alternatives></ref><ref id="B2"><label>2.</label><mixed-citation>Arce F., Zamora E., Sossa H., Barrón R. Differential evolution training algorithm for dendrite morphological neural networks // Applied Soft Computing. 2018. V. 68. P. 303-313.</mixed-citation></ref><ref id="B3"><label>3.</label><mixed-citation>Dimitriadis N., Maragos P. Advances in the training, pruning and enforcement of shape constraints of Morphological Neural Networks using Tropical Algebra // IEEE International Conference on Acoustics, Speech and Signal Processing. 2021. P. 3825-3829.</mixed-citation></ref><ref id="B4"><label>4.</label><citation-alternatives><mixed-citation xml:lang="en">Limonova E.E., Nikolaev D.P., Alfonso D.M., Arlazarov V.V. Bipolar Morphological Neural Networks: Gate-Efficient Architecture for Computer Vision // IEEE Access. 2021. V. 9. P. 97569-97581.</mixed-citation><mixed-citation xml:lang="ru">Limonova E.E., Nikolaev D.P., Alfonso D.M., Arlazarov V.V. Bipolar Morphological Neural Networks: Gate-Efficient Architecture for Computer Vision // IEEE Access. 2021. V. 9. P. 97569-97581.</mixed-citation></citation-alternatives></ref><ref id="B5"><label>5.</label><mixed-citation>Limonova E.E., Nikolaev D.P., Arlazarov V.V. Bipolar Morphological U-Net for Document Binarization // Thirteenth International Conference on Machine Vision. 2021. P. 1-9.</mixed-citation></ref><ref id="B6"><label>6.</label><citation-alternatives><mixed-citation xml:lang="en">Limonova E.E., Nikolaev D.P., Alfonso D., Arlazarov V.V. ResNet-like Architecture with Low Hardware Requirements // 25th International Conference on Pattern Recognition. 2021. P. 6204-6211.</mixed-citation><mixed-citation xml:lang="ru">Limonova E.E., Nikolaev D.P., Alfonso D., Arlazarov V.V. 
ResNet-like Architecture with Low Hardware Requirements // 25th International Conference on Pattern Recognition. 2021. P. 6204-6211.</mixed-citation></citation-alternatives></ref><ref id="B7"><label>7.</label><mixed-citation>Limonova E., Matveev D., Nikolaev D., Arlazarov V. Bipolar morphological neural networks: convolution without multiplication // Twelfth International Conference on Machine Vision. 2020. V. 11433. P. 1-18.</mixed-citation></ref><ref id="B8"><label>8.</label><citation-alternatives><mixed-citation xml:lang="en">Khachumov V.M. Logicheskie elementy na neyronah [Logical elements on neurons] // Trudy IX Mezhdunarodnoy Konferentsii “Intellektualnye sistemy i komp’yuternye nauki” [Proceedings of the 9th International Conference “Intelligent Systems and Computer Science”]. Moscow, 2006. V. 1. P. 297-300.</mixed-citation><mixed-citation xml:lang="ru">Хачумов В.М. Логические элементы на нейронах // Труды IX международной конференции «Интеллектуальные системы и компьютерные науки». 2006. T.1. Ч.2. C. 297-300.</mixed-citation></citation-alternatives></ref><ref id="B9"><label>9.</label><citation-alternatives><mixed-citation xml:lang="en">Kruglov V.V., Borisov V.V. Iskusstvennye neyronnye seti. Teoriya i praktika [Artificial neural networks. Theory and practice]. Moscow: Hotline-Telecom, 2002.</mixed-citation><mixed-citation xml:lang="ru">Круглов В.В., Борисов В.В. Искусственные нейронные сети. Теория и практика. М.: Горячая линия-Телеком. 2002.</mixed-citation></citation-alternatives></ref><ref id="B10"><label>10.</label><citation-alternatives><mixed-citation xml:lang="en">Haykin S. Neyronnye seti: Polnyy kurs [Neural Networks: The Full Course]. Moscow: Williams. 2006.</mixed-citation><mixed-citation xml:lang="ru">Хайкин С. Нейронные сети: полный курс. М.: Вильямс. 2006.</mixed-citation></citation-alternatives></ref><ref id="B11"><label>11.</label><citation-alternatives><mixed-citation xml:lang="en">Callan R. 
Osnovnye koncepcii neyronnykh setej [The Essence of Neural Networks]. Moscow: Williams. 2001.</mixed-citation><mixed-citation xml:lang="ru">Каллан Р. Основные концепции нейронных сетей. Пер. с англ. М.: Вильямс. 2001.</mixed-citation></citation-alternatives></ref></ref-list></back></article>
