Entropy Dimension Reduction Method for Randomized Machine Learning Problems



Abstract

The direct and inverse projection (DIP) method is proposed for reducing the feature space to a given dimension in randomized machine learning problems. It is based on a procedure of “direct” and “inverse” projection, where the “projector” matrices are determined by maximizing the relative entropy. The information losses are estimated by the absolute error calculated with the Kullback–Leibler function (the SRC method). An example illustrating these methods is given.
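The abstract's pipeline (direct projection to a lower dimension, inverse projection back, and a Kullback–Leibler loss estimate) can be sketched as follows. This is a minimal illustration, not the paper's method: the actual DIP projectors are obtained by maximizing relative entropy, whereas here a random projector and its pseudoinverse stand in, and the per-feature mass distributions used for the KL comparison are an assumed proxy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples with m features, to be reduced to r dimensions.
n, m, r = 200, 10, 3
X = rng.random((n, m)) + 0.1  # strictly positive for the KL proxy below

# "Direct" projection m -> r. A random projector is used purely for
# illustration; the paper derives it by maximizing relative entropy.
P = rng.random((m, r))
Y = X @ P

# "Inverse" projection r -> m via the pseudoinverse of P.
X_rec = Y @ np.linalg.pinv(P)

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence between two normalized distributions."""
    p = np.clip(p, eps, None) / np.clip(p, eps, None).sum()
    q = np.clip(q, eps, None) / np.clip(q, eps, None).sum()
    return float(np.sum(p * np.log(p / q)))

# Estimate information loss: KL divergence between the per-feature mass
# of the original data and of its reconstruction (an assumed proxy for
# the paper's absolute-error estimate).
loss = kl_divergence(X.sum(axis=0), np.abs(X_rec).sum(axis=0))
print(f"reduced shape: {Y.shape}, KL information loss: {loss:.4f}")
```

The key point the sketch preserves is the structure of the error estimate: dimension reduction is judged not by geometric reconstruction error alone but by a Kullback–Leibler comparison of distributions before and after the round trip.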

About the authors

Yu. S. Popkov

Institute for Systems Analysis, Russian Academy of Sciences; Braude College of Haifa University; National Research University “Higher School of Economics”

Author for correspondence.
Email: popkov@isa.ru
Russian Federation, Moscow; Carmiel; Moscow

Yu. A. Dubnov

Institute for Systems Analysis, Russian Academy of Sciences; National Research University “Higher School of Economics”; Moscow Institute of Physics and Technology

Email: popkov@isa.ru
Russian Federation, Moscow; Moscow; Moscow

A. Yu. Popkov

Institute for Systems Analysis, Russian Academy of Sciences; Peoples’ Friendship University

Email: popkov@isa.ru
Russian Federation, Moscow; Moscow

Supplementary files

1. JATS XML

Copyright (c) 2018 Pleiades Publishing, Ltd.