Neuromorphic Systems: Devices, Architecture, and Algorithms

Abstract

Applying the structure and operating principles of the human brain opens up great opportunities for building artificial systems based on silicon technology. The energy efficiency and performance of such a brain-inspired architecture can be significantly higher than those of the traditional von Neumann architecture. This paper reviews the most promising artificial neural network (ANN) and spiking neural network (SNN) architectures for brain-inspired systems, called neuromorphic systems. Devices for such systems, in particular memristors and ferroelectric transistors, are considered as artificial synapses, since they determine which neuromorphic architectures can be realized; methods and rules for training these structures to operate correctly while mimicking biological learning rules, such as long-term synaptic plasticity, are also examined. Problems hindering the implementation of brain-inspired systems and examples of architectures that have been implemented in practice are discussed.
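
To illustrate the long-term synaptic plasticity mentioned in the abstract, the sketch below shows a pair-based spike-timing-dependent plasticity (STDP) update applied to a synaptic weight mapped to a normalized device conductance. This is a minimal illustration in Python, not code from the paper; the amplitudes A_PLUS/A_MINUS and time constants TAU_PLUS/TAU_MINUS are assumed, illustrative values.

    import numpy as np

    # Pair-based STDP: the weight change depends on the timing difference
    # dt = t_post - t_pre between a postsynaptic and a presynaptic spike.
    A_PLUS, A_MINUS = 0.01, 0.012     # potentiation/depression amplitudes (assumed)
    TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants in ms (assumed)

    def stdp_dw(dt_ms):
        """Weight update for a single pre/post spike pair."""
        if dt_ms > 0:   # post fires after pre -> long-term potentiation
            return A_PLUS * np.exp(-dt_ms / TAU_PLUS)
        else:           # post fires before pre -> long-term depression
            return -A_MINUS * np.exp(dt_ms / TAU_MINUS)

    # Synaptic weight mapped to a normalized device conductance in [0, 1].
    w = 0.5
    for dt in (+5.0, +15.0, -10.0):   # example spike-pair timings in ms
        w = np.clip(w + stdp_dw(dt), 0.0, 1.0)
        print(f"dt = {dt:+5.1f} ms -> w = {w:.4f}")

Repeated pre-before-post pairings drive the weight (device conductance) up, while post-before-pre pairings drive it down, which is the qualitative behavior expected from memristive or ferroelectric artificial synapses emulating long-term potentiation and depression.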

About the authors

K. A. Fetisenkova

Valiev Institute of Physics and Technology, Russian Academy of Sciences; Moscow Institute of Physics and Technology (State University)

Email: fetisenkova@ftian.ru
Moscow, 117218 Russia; Dolgoprudny, Moscow Oblast, 141701 Russia

A. E. Rogozhin

Valiev Institute of Physics and Technology, Russian Academy of Sciences

Author for correspondence.
Email: rogozhin@ftian.ru
Moscow, 117218 Russia


Copyright (c) 2023 K. A. Fetisenkova, A. E. Rogozhin
