Definition of information in computer science

Abstract

The purpose of this study is to formulate a working definition of information that meets the needs of computer science. There is currently no rigorous definition of this term, which creates a methodological contradiction: the development and application of information technologies demand accuracy and rigor, yet that development rests on a vague, intuitive concept.

Materials and methods. The materials for the study are the existing classical approaches to understanding information, and the main method is the analysis of these approaches. The proposed definition is constructed with regard to two mathematical transformations: the selection of a subset and the mapping between sets. The apparatus of fuzzy sets is used to formalize the selection procedure.

Results. Information is defined as the result of a mapping in which the selection of a subset from a set of prototypes leads to the selection of the corresponding subset from a set of images. If the selected subset is understood as fuzzy, an equivalent definition is obtained: information is the result of a mapping in which an increase in the heterogeneity of the distribution of the presence indicator on the set of prototypes leads to an increase in the heterogeneity of the distribution of the corresponding indicator on the set of images. The essence of the new definition is demonstrated on models of population dynamics in discrete time, and its significance for information technology is illustrated by a numerical method of multi-extremal optimization. It is shown that the proposed definition makes it possible to formulate effective stopping conditions for a stochastic optimization method that guarantee that a given amount of information is obtained.

Conclusion. The proposed understanding of information overcomes the shortcomings of previous approaches to the essence of information, retains the advantages of the classical approach, and is consistent with other well-known approaches in computer science. The definition can be used to improve numerical optimization methods as well as other information technology tools.
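
The mapping-based definition can be made concrete with a small worked example. The Python sketch below is not taken from the article: the four-element prototype set X, the two-element image set Y, the mapping f, the choice of heterogeneity measure (the complement of normalized Shannon entropy), and the supremum-based transfer of memberships are all assumptions made only for illustration. The sketch shows that concentrating a fuzzy presence indicator on a subset of prototypes raises heterogeneity on X and, through f, induces a corresponding rise in heterogeneity on Y, which is the effect the proposed definition identifies with information.

import math

def heterogeneity(weights):
    # 1 minus the normalized Shannon entropy of a non-negative weight vector:
    # 0 for a uniform distribution, approaching 1 as the weight concentrates
    # on a single element (an illustrative measure, not the article's formula).
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    if len(probs) <= 1:
        return 1.0
    entropy = -sum(p * math.log(p) for p in probs)
    return 1.0 - entropy / math.log(len(weights))

# Toy prototype set X, image set Y, and mapping f (all hypothetical).
X = ["x1", "x2", "x3", "x4"]
Y = ["y1", "y2"]
f = {"x1": "y1", "x2": "y1", "x3": "y2", "x4": "y2"}

def push_forward(mu):
    # Transfer a fuzzy presence indicator mu on X to Y through f, taking the
    # supremum of memberships over each preimage (Zadeh's extension principle).
    nu = {y: 0.0 for y in Y}
    for x, m in mu.items():
        nu[f[x]] = max(nu[f[x]], m)
    return nu

# A uniform indicator selects nothing; heterogeneity on X is zero.
mu_uniform = {x: 1.0 for x in X}
# An indicator concentrated on {x1, x2} selects a fuzzy subset of prototypes.
mu_selected = {"x1": 1.0, "x2": 1.0, "x3": 0.1, "x4": 0.1}

for label, mu in (("uniform", mu_uniform), ("selected", mu_selected)):
    nu = push_forward(mu)
    print(label,
          "heterogeneity on X:", round(heterogeneity(list(mu.values())), 3),
          "heterogeneity on Y:", round(heterogeneity(list(nu.values())), 3))

In this toy run the uniform indicator yields zero heterogeneity on both sets, while the selected indicator raises heterogeneity on X to about 0.28 and, through f, on Y to about 0.56; under the proposed definition, it is this induced increase on the image set that carries information.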

About the author

Oleg Anatolevich Kuzenkov

Lobachevsky State University of Nizhny Novgorod

ORCID iD: 0000-0001-9407-0517
Scopus Author ID: 6508182979
ResearcherId: G-9720-2017
603950 Nizhny Novgorod, Gagarin Avenue, 23
