Software & Systems

The scientific journal Software & Systems has been published since 1988.

Editor-in-Chief: Gennady Ivanovich Savin, Academician of the Russian Academy of Sciences (Moscow, Russia)

Founder: V.P. Kupriyanov

Publisher: Research Institute “Centrprogrammsystem” (Tver).

ISSN 0236-235X (print version),

ISSN 2311-2735 (online version)

DOI 10.15827/0236-235X

The journal was registered with Roskomnadzor on March 3, 2020. Registration certificate ПИ № ФС 77-77843.

The journal Software & Systems promotes the generalization and dissemination of experience in developing and applying computer hardware and software, acquaints readers with the work of researchers and specialists from Russia and abroad, seeks to answer practical questions on the use of computing technology across all sectors of the economy, and analyzes trends and forecasts in the development of software and hardware.

Current Issue

Vol 38, No 1 (2025)

Articles

Methods and means of term extraction from texts for terminological tasks
Bolshakova E.I., Semak V.V.
Abstract

The paper describes the current state of the field of automatic term extraction from specialized natural language texts, including scientific and technical documents. Practical applications of term extraction methods and tools include creating terminological dictionaries, thesauri, and glossaries for problem-oriented domains, as well as extracting keywords and constructing subject indexes for specialized documents. The paper overviews approaches to the automatic recognition and extraction of terminological words and phrases, covering traditional statistical methods and machine learning methods based either on term features or on modern neural network transformer-based language models. A comparison of the approaches is presented, including quality assessments for term recognition and term extraction. The best-known software tools for automating term extraction within the statistical and feature-based learning approaches are indicated. The authors' studies on term recognition based on neural network language models, applied to Russian scientific texts on mathematics and programming, are described. The dataset with terminological annotations created for training term recognition models is briefly characterized; it covers data from seven related domains. The term recognition models were developed on the basis of the pre-trained neural network model BERT by additional training (fine-tuning) in two ways: as a binary classifier of candidate terms previously extracted from texts, and as a classifier for sequence labeling of words in texts. For the developed models, the quality of term recognition was experimentally evaluated and compared with the statistical approach. The best quality is demonstrated by the binary classification models, which significantly surpass the other considered approaches. The experiments also show that the trained models are applicable to texts in close scientific domains.

Software & Systems. 2025;38(1):5-16
Automated identification of hype technologies: Semantic analysis
Loginova I.V., Piekalnits A.S., Sokolov A.V.
Abstract

The research focuses on inflated public expectations of new technologies, or hypes. The paper presents the results of developing and testing an automated methodology for identifying hypes among technological topics based on their textual trace in the digital technology field. The number of new technological developments in the world is constantly growing; however, their real potential for practical application varies greatly. It is therefore important to identify reliable factors that distinguish trends from hypes. Industry and technology experts typically suggest the following possible signs of a hype: the absence of a stable business model, an unformed or obviously limited consumer market, and a large number of more effective alternatives. Identifying hypes in the technology agenda remains a difficult analytical task due to terminological inconsistency, the expert nature of the task, insufficiently developed methodological approaches, and the lack of specific technical tools. The method described in this paper extracts terms referring to technologies using natural language processing and computational linguistics techniques. These terms are extracted from tens of millions of text documents of different types, such as scientific publications, patents, and market analytics. The method also includes calculating an objective measure of each technology's “hype” and constructing a visual map of the technology landscape that allows separating sustainable trends from potential hypes. Decision makers can use such hype maps in conjunction with other analytical results to identify priority development areas, to analyze current and forecast future trends, and in risk management.

Software & Systems. 2025;38(1):17-26
Genetic algorithm for placing requirements in a flow-type production process planning problem
Kibzun A.I., Rasskazova V.A.
Abstract

The paper discusses the problem of planning flow-type production processes. In terms of a cascade scheme, the complete solution covers the stage of assigning preparatory units and the subsequent stage of forming detailed technological routes to fulfill a given set of requirements on time, taking into account the constraints on permissible processing durations at each stage. This scheme is a part of a problem-oriented computing complex. However, for a number of natural reasons, the problem may become inconsistent already at the stage of assigning preparatory units. One way to overcome these difficulties is to develop and implement penalty function algorithms that find maximum joint subsystems in inconsistent optimization problems. The paper proposes an ideologically different approach: a preliminary requirement placement stage is introduced in such a way that the subsequent stages of the solving process are guaranteed to be solvable. Requirement placement is formalized as a search for an optimal mapping that minimizes the “potential” workload on preparatory units during the planning period. To solve this problem, the authors developed a genetic algorithm, which gives a significant speed advantage over fundamental mathematical programming approaches (for example, integer linear programming models). To reduce the risk of population extinction, at each iteration of the genetic algorithm the authors apply the rule of unconditional migration of the representative with the lowest criterion value. This approach also provides effective convergence of the algorithm in terms of the number of iterations without significant improvement of the objective function. The developed genetic algorithm is implemented as a stand-alone module of a computing system for solving process manufacturing scheduling problems. The authors conducted a computational experiment using this module, comparing the solution quality of the initial complex problem.

Software & Systems. 2025;38(1):27-38
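The elitism rule mentioned in the abstract, unconditionally carrying the representative with the lowest criterion value into the next generation, can be sketched generically. This is an illustrative toy on bit strings, not the authors' implementation; the population size, mutation rate and selection scheme are all assumptions.

```python
import random

def genetic_min(fitness, length, pop_size=30, gens=100, seed=0):
    """Toy genetic algorithm minimizing `fitness` over bit strings,
    with elitist migration: the best individual always survives."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[0][:]                    # unconditional migration of the best
        nxt = [elite]
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)  # parents from the top half
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if rng.random() < 0.1:                     # occasional single-bit mutation
                i = rng.randrange(length)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

# Example: minimize the number of ones (optimum is the all-zero string).
best = genetic_min(fitness=sum, length=20)
```

Because the elite individual is copied verbatim, the best objective value is monotonically non-increasing across generations, which is exactly the anti-extinction property described in the abstract.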
Extracting structured data from the Chronicle of the Life and Work of A.S. Pushkin: Hybrid approach
Kokorin P.P., Kotov A.A., Kuleshov S.V., Zaytseva A.A.
Abstract

The paper discusses the problem of creating a software infrastructure for the systematization, annotation, storage, search and publication of manuscripts and other digital materials. The research focuses on materials related to the life and work of A.S. Pushkin, which form an important part of the scientific and educational resource “Pushkin Digital”. The problem is relevant due to the need to preserve the Russian author's heritage under the conditions of the digital transformation of philological, source and bibliographic studies of his works, which is a part of the national projects of the Russian Federation “Education”, “Culture”, and “Science and Universities”. It is especially important to extract structured text from bitmap images of pages of the volumes of A.S. Pushkin's Chronicle of Life and Work, for use in systems under development for the storage, systematization and publication of library, archival, museum, phonographic and other funds and collections, and for the partial automation of philological, source and bibliographic research. The paper proposes a hybrid approach based on a priori data about the structure of page layout elements, OCR technologies (text recognition based on the Tesseract library) and verification methods. The peculiarity of the developed verification methods is the use of regular expressions for extracting structured data from pre-recognized text and an automated text processing pipeline in the GitLab build system. The paper demonstrates satisfactory results of the proposed hybrid approach, which minimizes the manual post-processing of the obtained data to proofreading the results posted on the resource. The results are useful not only in the Pushkin Digital resource under development, but also in other projects that require the recognition and automated processing of large volumes of digitized author's texts, archival materials and other paper documents.

Software & Systems. 2025;38(1):39-46
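The regex-based verification step described in the abstract can be illustrated with a minimal sketch. The entry pattern below ("year, month day. event text") is a hypothetical layout invented for the example; the actual structure rules of the Chronicle pages are not given in the abstract.

```python
import re

# Hypothetical pattern for chronicle-style entries in OCR output,
# e.g. "1830, May 5. Departure from Moscow." The real "Pushkin Digital"
# layout rules may differ.
ENTRY = re.compile(
    r"(?P<year>\d{4}),\s*(?P<date>[A-Za-z]+\s+\d{1,2})\.\s*(?P<event>.+)"
)

def extract_entries(ocr_text):
    """Return structured records from lines of pre-recognized text;
    lines that do not match the pattern (OCR noise) are skipped."""
    records = []
    for line in ocr_text.splitlines():
        m = ENTRY.match(line.strip())
        if m:
            records.append(m.groupdict())
    return records

sample = (
    "1830, May 5. Departure from Moscow.\n"
    "illegible OCR noise\n"
    "1830, May 9. Arrival."
)
rows = extract_entries(sample)
```

Skipping non-matching lines is what makes such a pass double as verification: any page whose recognized text yields too few matches can be flagged for manual proofreading.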
Modeling the reliability of software components of cyber-physical systems
Privalov A.N., Larkin E.V., Bogomolov A.V.
Abstract

The relevance of the research is due to the fact that the reliability of software components of cyber-physical systems is a key component of their effective functioning, and its adequate mathematical modeling is essential for the digitalization of the economy. The paper aims to eliminate the disadvantages of known approaches to modeling the reliability of software components, in which reliability estimates are based on empirical data about errors detected during program testing. In such approaches, the testing results largely depend on the test duration and on how completely the processed data area is covered by the sub-area of data generated during testing, which reduces the efficiency of reliability estimation. The research focuses on reliability modeling methods for software components of cyber-physical systems, where reliability is characterized by the delay time in the feedback loop between components. The authors used methods of software engineering, reliability theory, probability theory and Markov processes. The main result is mathematical reliability models of software components of cyber-physical systems that combine semi-Markov models of software components with the generation of their faults and failures. The developed mathematical models are based on a structural-parametric semi-Markov model of software faults and failures, whose parameters are determined by the computational complexity and the requirements for the software, taking into account its functional purpose. The authors obtained formalized descriptions of Poisson flows of faults and failures of software components of a cyber-physical system. The practical relevance of the paper lies in its application to determining the reliability of software components at all stages of a cyber-physical system life cycle, where elements interact, self-adjust and adapt to changes using standard software-implemented protocols.

Software & Systems. 2025;38(1):47-54
Planning computations in real-time systems: Efficient algorithms for constructing optimal schedules
Kononov D.A., Furugyan M.G.
Abstract

The paper discusses issues related to developing one of the main blocks of a real-time computing system, specifically the computation scheduling block. The authors propose algorithms for constructing optimal schedules for different cases depending on the number of processors and the characteristics of jobs and computing system resources. For the single-processor case with interruptions and directive intervals, they improved the relative urgency algorithm by using a heap for data storage, which lowered the algorithm's computational complexity. The authors also developed an algorithm for a problem with a partial order of job execution. It is based on pre-correcting ready times and directive deadlines and on reducing the original task to a task without precedence relations. For the multiprocessor case with interruptions and directive intervals, the authors proposed an approximate algorithm based on a generalization of the single-processor relative urgency algorithm to the multiprocessor case, and performed a comparative analysis with the exact flow algorithm. They proved that the problem is NP-hard when interruption and switching time costs are taken into account. For the multiprocessor case without interruptions and switches, with a common directive interval for all jobs and identical processors, the authors developed a pseudo-polynomial algorithm based on a limited enumeration of options. The authors also created an approximate algorithm for a system with renewable and non-renewable resources, as well as for a complex with a mixed set of jobs (both continuous and allowing interruptions and switching). The algorithm is based on network modeling and on reducing the problem under study to the search for a flow with certain properties in a special network.

Software & Systems. 2025;38(1):55-64
A system of verifiable software component specifications with embedding and extraction
Shapkin P.A.
Abstract

This paper focuses on the specification and verification of software systems and their components. It investigates a unified specification language that correlates with both random testing systems and static verification tools based on type systems. A variety of programming languages, configuration systems, deployment and other tools require developers to make efforts to integrate them; verifiable component specifications help to simplify this task. The paper proposes an approach to a unified specification representation integrated with systems for both static type checking and dynamic testing. The solution relies on methods of applicative computing and type theory and serves as a conceptual framework for building specifications embedded in various software environments. The lack of static verification capabilities due to limited type systems is compensated, to some extent, by dynamic testing. The author implements testing by interpreting specifications into definitions for property-based random testing systems. The practical significance of the proposed approach is the automation of constructing typed wrappers, or facades, which are essential for using components from less typed environments in programming languages with more expressive type systems. The approach automates both the verification of such wrappers and the methods of their construction by defining specification refinement operations. In practice, this allows detecting typing errors in third-party components at early development stages. The paper gives examples of specifications of programs with side effects. Category theory formalizations serve as a basis for the specifications. The author also analyzes approaches to translating specifications into other representations and to iteratively improving specifications by transforming them.

Software & Systems. 2025;38(1):65-76
Simulation modeling of physical protection systems in AKIM environment
Senichenkov Y.B., Sharkov I.K.
Abstract

The paper discusses a methodology for building simulation models in the domestic software package AKIM. The models focus on analyzing the security level of existing and designed physical protection systems of facilities and use statistical experiments to estimate the effectiveness of such systems. The authors review existing modern approaches to similar problems. Most approaches apply Markov chains to search for vulnerable paths, as well as attack and defense graphs, to assess system effectiveness. Alternatively, it is suggested to build a simulation model without building an attack and defense graph, relying only on a physical protection system plan. The model in the AKIM environment consists of base class instances that model real elements of a physical protection system. As a result, models of intruders and guards move across the facility plan, simulating real attacks. The approach allows describing in detail the functions, reactions and capabilities of the system at the level of its elements, and specifying the actual parameters of intruders and guards. This ensures the accuracy and completeness of the analysis without simplifying or excluding important details. Demonstration examples show that the effectiveness estimates obtained for protection system models with the AKIM software package are close to those of models built using Markov chains. At the same time, the considered method of building simulation models overcomes the difficulties associated with Markov chains: the need for expert estimates of transition matrix coefficients, large matrices, and the complexity of model modification.

Software & Systems. 2025;38(1):77-88
Author's metric for assessing proximity of programs: Application for vulnerability search using genetic de-evolution
Buynevich M.V., Izrailov K.E.
Abstract

The paper is relevant due to information security tasks that require comparing programs in their different representations, for example, as textual assembly code (e.g., for vulnerability search or authorship verification). The paper presents a proximity metric for two texts in the form of lists of character strings, which develops the author's previous version. The main result of the current study (a part of the main study aimed at the genetic de-evolution of programs) is the metric itself, as well as its characteristics and peculiarities revealed through experiments. The paper presents the metric in analytical form, implemented in Python. The metric takes as input two lists of character strings to compare, along with coefficients that account for an element's position from the beginning of the list and for the character sequence. The calculation result is a numeric value in the range from 0 to 1. The metric's novelty is in a sufficiently accurate and sensitive assessment of the proximity of two texts regardless of data representation formats. The current metric version differs from the previous one by taking the mentioned coefficients into account. The theoretical significance lies in developing comparison methods for arbitrary texts represented as lists of character strings whose information appears sequentially according to a certain logic (which requires considering position). Besides the general purpose of such comparison tools, the metric is practically relevant because it can determine the proximity of two programs given in binary machine code that is pre-transformed into a textual assembly code representation.

Software & Systems. 2025;38(1):89-99
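The abstract describes the metric only in general terms. A minimal sketch of what a position-aware proximity measure between two string lists could look like is given below; the blending formula and weights are assumptions for illustration, not the authors' published metric.

```python
from difflib import SequenceMatcher

def proximity(a, b, pos_weight=0.5, seq_weight=0.5):
    """Toy proximity of two lists of strings in [0, 1]: blends
    per-position character similarity with order-insensitive overlap.
    Illustrative only; the authors' metric is defined differently."""
    if not a and not b:
        return 1.0
    n = max(len(a), len(b))
    # positional term: compare strings at matching indices
    pos = sum(
        SequenceMatcher(None, x, y).ratio() for x, y in zip(a, b)
    ) / n
    # content term: overlap of the sets of lines, ignoring order
    seq = len(set(a) & set(b)) / n
    return pos_weight * pos + seq_weight * seq

score = proximity(["mov eax, 1", "ret"], ["mov eax, 1", "ret"])
```

Since the weights sum to one and each term lies in [0, 1], the result stays in [0, 1], matching the range the abstract specifies; identical lists score 1.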
A framework for automating equipment remaining life prediction when building proactive decision support systems
Zadiran K.S., Volkova D.A., Shcherbakov M.V.
Abstract

The paper describes a framework for automating research in designing proactive decision support systems. In particular, it investigates the problem of time series analysis and prediction in order to create tools for automating the prediction of various processes in asset management systems, including maintenance and repair. The authors identify the role of automation processes in asset management in these systems and highlight the main factors influencing the choice of software for implementing a predictive analytics system. The authors propose an algorithm for predicting remaining useful life based on analyzing production asset data using artificial intelligence components. The proposed software solution is based on the CRISP-DM methodology. It is not a separate software product and can be embedded in existing software to support the possibility of modifying methods. The framework loads and preprocesses data, builds predictive models, forecasts time series and evaluates the forecast. The developed framework has a flexible modular architecture for adding new analysis and prediction methods. The ability to redefine and implement one's own data sources, preprocessing stages, forecasting models and metrics on the basis of existing base classes extends its variability and increases its efficiency. The paper gives an example of using the framework to analyze time series and determine equipment remaining useful life, demonstrating the efficiency of the developed product in the field of data exploration and artificial intelligence.

Software & Systems. 2025;38(1):100-107
Solving approximation problems: Implementing a neuro-fuzzy network model based on a Bayesian logical-probabilistic approach
Khamchichev G.A., Kozhomberdieva G.I.
Abstract

The paper describes the implementation of a neuro-fuzzy network (NFN) model based on a Bayesian logical-probabilistic fuzzy inference model as a specialized software tool. In addition to the NFN model itself, the tool implements adapted algorithms for constructing an NFN (a grid partitioning algorithm for generating fuzzy rules) and training it (an error backpropagation algorithm and a hybrid algorithm), known from the ANFIS network in MATLAB. The designed software tool is written in Java. It serves for solving practical problems and investigating the effectiveness and applications of NFNs based on the Bayesian logical-probabilistic fuzzy inference model. The authors discuss the experience of building and training the proposed NFN type using the developed toolkit and consider examples of creating and training NFNs designed to solve specific problems of approximating multivariable functions based on real open and synthetic data sets. The paper compares the results obtained using the developed tool and the ANFIS tool from MATLAB. The authors confirmed that the proposed NFN model can be a universal approximator of complex functional dependencies, which confirms its efficiency and the possibility of using it in different fields. The inclusion of various approximation quality metrics in the program allows comprehensively evaluating network training quality, accuracy, stability, and adaptability to new data. Access restrictions for Russian users to foreign commercial software enhance the practical significance of the developed software tool based on the original NFN model, making the development relevant and useful for a wide range of users.

Software & Systems. 2025;38(1):108-121
Design principles of a domestic platform for scientific dataset exchange
Garev K.V.
Abstract

The paper considers prerequisites and proposals for establishing a domestic platform for scientific dataset exchange in the context of global trends related to the development of open science and the spread of FAIR principles. The author analyzes foreign initiatives (EUDAT, EOSC, DataONE, Dryad, Zenodo) to identify key problems that hinder the effective use, preservation and reuse of scientific data. These problems include the lack of uniform regulations for dataset description, disparate infrastructure solutions, insufficient cross-platform interoperability, and the difficulty of ensuring the reproducibility of research. The author focuses on the role of the professional community and the importance of creating an environment for sharing experience, conducting interdisciplinary projects, and improving skills in working with large datasets. The paper emphasizes the need to systematize work with scientific data and to unify the requirements for their collection, storage, processing and presentation, which will increase the transparency of research processes. The paper substantiates the feasibility of implementing distributed storage mechanisms, federated authentication and high-performance computing resources capable of meeting the needs of the domestic scientific community. Finally, it outlines proposals for designing a unified platform for scientific dataset exchange: from developing methodological regulations and standards for interaction with external systems to principles of integrating analytical tools and ensuring reliable data protection.

Software & Systems. 2025;38(1):122-133
Applying specialized software packages to automate engineering equipment calculation
Osipov E.V., Khomenko A.A., Osipova L.E.
Abstract

The purpose of this paper is to develop an approach in which engineering methods of equipment calculation are implemented as separate programs integrated with simulation CAD systems, and their results serve as input data for a graphical CAD system. The research focuses on settling toluene from a water-methanol mixture, with particular emphasis on the non-standard technological equipment of the settling process (a settler). The authors present technological calculations of material and energy balances in a universal simulation program and determine the geometric dimensions of the non-standard equipment using the developed program. A 3D model of the equipment is formed in a graphical CAD system. The approach is implemented as a program in the VBA language connected with the model of the settling process; VBA is native to the software used and widespread for automating calculations within the system. Since the program database does not contain a settler calculation module, the authors developed this module using the Import User Model tool. A parametric analysis was conducted using the Case Study add-on. The authors identified the maximum allowable methanol content in the initial mixture (up to 0.3 mass fraction) at which the density of the water-methanol mixture exceeds the density of the toluene phase. The main dimensions of the equipment were calculated using engineering methods implemented as a separate program linked to the model in the simulation CAD system. The authors developed a 3D model of the equipment body in a graphical CAD system using variables; it serves as the basis of the settler assembly, supplemented with standard elements from the program database. As a result, the developed program determines settler dimensions and links the simulation and graphical CAD systems, allowing engineering methods of energy-efficient equipment calculation to be automated.

Software & Systems. 2025;38(1):134-142
Temperature field modeling in additive manufacturing of metal products
Kakorin D.D., Margolis B.I.
Abstract

The paper substantiates the necessity of studying the nature of temperature distribution in the additive manufacturing of metal products. It considers the peculiarities of modeling the temperature field that arises during the layer-by-layer electric arc welding of flat geometric metal parts, taking into account asymmetric convection-radiation heat exchange between the surface and the environment. The authors describe in detail the calculation of the temperature field in two-dimensional spatial coordinates based on numerical finite-difference methods. They take into account the possibility of shifting the starting point of the welded layer from the base edge, changing the holding time between welded layers, and using forced air cooling of the metal. The authors consider the mechanism of determining the temperature at the boundary points of the structure that are in direct contact with the molten metal. They also developed a MATLAB program that models the temperature field in the product based on the specified thermophysical characteristics of the clad metal, the parameters of convective-radiation heat exchange, and the geometric characteristics of the welded structure. The paper gives the texts of the TempSurfacing function for modeling the temperature field and the TempDepend function for taking into account the dependence of the thermophysical properties of the metal on its temperature before surfacing a new layer. The authors checked the performance of the program on the example of surfacing a single metal layer 2 mm high onto a metal base 5 mm high. The paper shows the program window for inputting initial data and the results of modeling the temperature field in two-dimensional spatial coordinates in text and graphical forms. The obtained temperature field model takes into account the addition of new metal portions along the welded layer length, as well as the systematic increase in the structure height due to layer-by-layer metal welding. The developed model serves to establish the optimal thermal cycle of the layer-by-layer arc welding process and to identify the heat transfer conditions, taking into account changes in the operating parameters of the additive manufacturing process.

Software & Systems. 2025;38(1):143-149
Optimization of technological processes for nesting parts: Using a database of sheet cutting cost parameters
Tavaeva A.F., Petunin A.A.
Abstract

The paper considers an integrated nesting and routing task that combines the problems of nesting optimization and cost minimization of the cutting process on CNC (computer numerical control) equipment. Solving the problem and developing appropriate optimization algorithms requires scientifically substantiated data, specifically the values of the cost parameters of the sheet cutting process on CNC technological equipment for different grades and thicknesses of processed materials. The paper describes a database of such cost parameters, including its diagram, basic structure and organization. The authors chose a relational model to store the data on cost parameters. The database consists of eight tables and contains information on the grades and thicknesses of processed materials, their cost and density values, the cost per unit of cutting tool path at working and idle speeds, and the cost of one piercing point. Convenient work with the data (visualization, adding and deleting records, calculating cost parameters for new material grades and thicknesses, changing parameter values) is provided by software designed in Python. The authors propose using it either as a separate product or together with software for the automatic design of control programs for CNC sheet cutting equipment to solve practical problems. The authors demonstrate the significance of this work on a model example of designing nesting and tool routing for one type of laser cutting machine.

Software & Systems. 2025;38(1):150-156
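A relational store of cutting-cost parameters like the one the abstract describes can be sketched in miniature with Python's built-in sqlite3. The two tables and all column names below are invented for illustration; the actual eight-table schema is not given in the abstract.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Two hypothetical tables: material grades and per-thickness cost parameters.
cur.executescript("""
CREATE TABLE material (
    id INTEGER PRIMARY KEY,
    grade TEXT NOT NULL,
    density REAL               -- kg/m^3
);
CREATE TABLE cutting_cost (
    material_id INTEGER REFERENCES material(id),
    thickness_mm REAL,
    cost_per_m_working REAL,   -- cost per metre of tool path at working speed
    cost_per_m_idle REAL,      -- cost per metre of tool path at idle speed
    cost_piercing REAL         -- cost of one piercing point
);
""")
cur.execute("INSERT INTO material VALUES (1, 'St3', 7850)")
cur.execute("INSERT INTO cutting_cost VALUES (1, 4.0, 0.9, 0.1, 0.5)")
row = cur.execute(
    "SELECT m.grade, c.cost_piercing FROM material m "
    "JOIN cutting_cost c ON c.material_id = m.id WHERE c.thickness_mm = 4.0"
).fetchone()
```

Keying cost rows by material and thickness lets a routing optimizer look up the exact path and piercing costs for the sheet being processed with a single join.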
Improving ergonomic parameters of head-mounted display in augmented reality glasses for civil aviation
Greshnikov I.I., Davydov D.A., Gonchar B.I., Sokolov A.V., Konovalova D.V.
Abstract

The paper analyzes existing solutions in the field of head-mounted displays. The authors offer their own concept of presenting flight and navigation information in augmented reality glasses for use in civil aviation. The main purpose of the developed display is to provide the pilot with the necessary flight and navigation information at the target flight phase. The research method consists of analyzing existing solutions, identifying their shortcomings, and adjusting the developed indication based on the recommendations of experienced pilots. The authors identified the main features of flight and navigation information presentation that improve ergonomic performance, for example, using independent graphical layers of flight and navigation information objects, changing the image layout depending on the flight phase, using several colors, and voice control to display the required display layer. The authors present software that projects graphic images of flight and navigation mnemonic symbols into the cockpit's external space simulated in the Unity environment. These display objects are placed according to the following parameters or their combination: the position of the glasses, the path velocity vector, the airplane construction axis, and the cockpit's external space. The developed software also allows adding graphic layers from synthesized and enhanced vision systems. The paper presents a head-mounted display demonstrator based on this concept; the authors review its architecture and functionality, including the images visible to the pilot in the augmented reality glasses. The practical significance of this work is in improving the ergonomic parameters of the head-mounted display in comparison with existing variants of displaying flight and navigation information.

Software & Systems. 2025;38(1):157-165
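The display concept summarized above (independent graphical layers anchored to different reference frames, a phase-dependent layout, and voice-controlled layer toggling) can be sketched as follows. This is a minimal illustration only: all class names, the `PHASE_LAYOUT` table, and its contents are hypothetical and do not reproduce the authors' actual implementation.

```python
from dataclasses import dataclass, field
from enum import Enum

class Anchor(Enum):
    """Reference frame a symbol layer is attached to (per the abstract)."""
    GLASSES_POSITION = "glasses"   # follows the pilot's head
    VELOCITY_VECTOR = "velocity"   # path velocity vector
    AIRCRAFT_AXIS = "aircraft"     # airplane construction axis
    EXTERNAL_SPACE = "world"       # fixed in the cockpit external space

# Hypothetical phase-dependent layout: which flight phases show each layer.
# Layers absent from this table are treated as always visible.
PHASE_LAYOUT = {
    "glide_slope": {"approach", "landing"},
    "runway_outline": {"takeoff", "landing"},
    "horizon_line": {"takeoff", "cruise", "approach", "landing"},
}

@dataclass
class SymbolLayer:
    name: str
    anchor: Anchor
    visible: bool = True

@dataclass
class HudScene:
    layers: list = field(default_factory=list)

    def set_phase(self, phase: str) -> None:
        """Change the image layout depending on the flight phase."""
        for layer in self.layers:
            layer.visible = phase in PHASE_LAYOUT.get(layer.name, {phase})

    def voice_command(self, layer_name: str, show: bool) -> None:
        """Voice control toggles an individual display layer."""
        for layer in self.layers:
            if layer.name == layer_name:
                layer.visible = show

scene = HudScene([SymbolLayer("glide_slope", Anchor.EXTERNAL_SPACE),
                  SymbolLayer("horizon_line", Anchor.VELOCITY_VECTOR)])
scene.set_phase("cruise")   # glide_slope hidden in cruise, horizon_line shown
```

Keeping each layer's anchor separate from its visibility is what lets the same mnemonic symbols be re-anchored (to the glasses, the velocity vector, or the external space) without touching the phase logic.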
Database on microbiological wastewater and soil treatment processes: An effective tool for data and knowledge representation
Dosaev A.A., Skichko A.S., Menshutina N.V.
Abstract

The research focuses on a system analysis of the subject area, namely the processes of microbiological treatment of wastewater and soil, and on the development of a corresponding database. The work is motivated by the problem of large volumes of unstructured, heterogeneous incoming information in this area. The authors analyze the literature describing existing databases on microbiological treatment, indicating their advantages and disadvantages, and justify the relevance of developing a database that integrates all key components of microbiological wastewater and soil treatment processes. Based on a systematic analysis of the subject area, they constructed a data storage architecture, noted the advantages of the developed system, and showed examples of search query execution. The developed Microbiological Purification database contains extensive information on pollutants and microorganisms, with descriptions of microbiological cleaning processes. The proposed data storage system is useful for researchers whose scientific interests include microbiological cleaning processes, microbiology, chemistry, and chemical technology: it reduces the time spent on information retrieval during research work.

Software & Systems. 2025;38(1):166-173
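The kind of integration the abstract describes (pollutants, microorganisms, and the treatment processes linking them, queried together) can be sketched as a small relational schema. The table names, columns, and sample rows below are hypothetical illustrations, not the actual schema of the Microbiological Purification database.

```python
import sqlite3

# In-memory sketch of a schema linking pollutants, microorganisms,
# and the treatment processes that connect them.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE pollutant     (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE microorganism (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE process (
        id               INTEGER PRIMARY KEY,
        pollutant_id     INTEGER REFERENCES pollutant(id),
        microorganism_id INTEGER REFERENCES microorganism(id),
        medium           TEXT,   -- 'wastewater' or 'soil'
        description      TEXT
    );
""")
con.execute("INSERT INTO pollutant VALUES (1, 'phenol')")
con.execute("INSERT INTO microorganism VALUES (1, 'Pseudomonas putida')")
con.execute("INSERT INTO process VALUES (1, 1, 1, 'wastewater', 'aerobic degradation')")

# Example search query: which microorganisms treat a given pollutant, and how?
rows = con.execute("""
    SELECT m.name, p.medium, p.description
    FROM process p
    JOIN microorganism m ON m.id  = p.microorganism_id
    JOIN pollutant pol   ON pol.id = p.pollutant_id
    WHERE pol.name = ?
""", ("phenol",)).fetchall()
print(rows)  # [('Pseudomonas putida', 'wastewater', 'aerobic degradation')]
```

A junction table such as `process` is what makes the search queries mentioned in the abstract possible in both directions: from a pollutant to candidate microorganisms, or from a microorganism to the pollutants it can degrade.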