Entropy Dimension Reduction Method for Randomized Machine Learning Problems



The direct and inverse projections (DIP) method is proposed for reducing the feature space to a given dimension in randomized machine learning problems; it is based on a procedure of "direct" and "inverse" projection. The "projector" matrices are determined by maximizing the relative entropy. It is suggested to estimate the information losses by the absolute error calculated with the Kullback–Leibler function (the SRC method). An example illustrating these methods is given.
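The abstract's scheme of a "direct" projection, an "inverse" projection, and a Kullback–Leibler loss estimate can be sketched in code. The sketch below is only an illustration of that structure, not the authors' entropy-optimal DIP method: the projector here is taken from an SVD rather than from relative-entropy maximization, and the function name `dip_sketch` and the normalization of the data into distributions are assumptions.

```python
import numpy as np

def dip_sketch(X, k, eps=1e-12):
    """Illustrative stand-in for a direct/inverse projection scheme.

    Projects X (n x m) down to k dimensions and back, then estimates
    the information loss with a Kullback-Leibler divergence between
    normalized magnitude distributions of X and its reconstruction.
    The SVD-based projector is a placeholder, not the entropy-optimal
    projector of the DIP method.
    """
    Xc = X - X.mean(axis=0)
    # "Direct" projection: top-k right singular vectors as the projector.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                      # m x k direct projector
    Z = Xc @ P                        # reduced representation
    # "Inverse" projection back to the original feature space.
    X_rec = Z @ P.T + X.mean(axis=0)
    # KL-based loss estimate between normalized magnitude distributions
    # (eps keeps both distributions strictly positive).
    p = np.abs(X).ravel();     p = p / (p.sum() + eps) + eps
    q = np.abs(X_rec).ravel(); q = q / (q.sum() + eps) + eps
    kl = float(np.sum(p * np.log(p / q)))
    return X_rec, kl
```

For data that is exactly rank k, the inverse projection recovers the original matrix and the KL loss estimate is (numerically) zero; for higher-rank data the KL value grows with the information discarded by the reduction.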

About the authors

Yu. Popkov

Institute for Systems Analysis, Russian Academy of Sciences; Braude College of Haifa University; National Research University "Higher School of Economics"

Author for correspondence.
Email: popkov@isa.ru
Russian Federation, Moscow; Carmiel; Moscow

Yu. Dubnov

Institute for Systems Analysis, Russian Academy of Sciences; National Research University "Higher School of Economics"; Moscow Institute of Physics and Technology

Email: popkov@isa.ru
Russian Federation, Moscow; Moscow; Moscow

A. Popkov

Institute for Systems Analysis, Russian Academy of Sciences; Peoples’ Friendship University

Email: popkov@isa.ru
Russian Federation, Moscow; Moscow


版权所有 © Pleiades Publishing, Ltd., 2018