International Business Machines Corporation
Mutual Information Neural Estimation with Eta-Trick

Abstract:

A computing device receives data X and Y, each having N samples. A function f(x, y) is defined as a trainable neural network over the data X and the data Y. A permuted copy of the data Y is created. A mean loss is computed from the trainable neural network f(x, y), the permuted copy of the data Y, and a trainable scalar variable η. The loss is minimized with respect to the scalar variable η and the trainable neural network. Upon determining that the loss is at or below a predetermined threshold, the mutual information (MI) between test data X_T and Y_T is estimated. If the estimated MI is above a predetermined threshold, the test data X_T and Y_T are deemed dependent; otherwise, they are deemed independent.
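The procedure in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes the η-trick bound log m ≤ m/η + log η − 1 applied to the partition term of the Donsker-Varadhan MI bound, and it replaces the neural network f(x, y) with a deliberately tiny one-parameter critic f(x, y) = w·x·y so the sketch stays dependency-free; all hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_mi(x, y, steps=500, lr=0.02):
    """Train a critic with the eta-trick loss, then report the
    Donsker-Varadhan MI estimate. The critic f(x, y) = w * x * y
    (a single trainable scalar w) is an illustrative stand-in for
    the trainable neural network in the abstract."""
    y_perm = rng.permutation(y)  # permuted copy of Y breaks the pairing
    w, eta = 0.0, 1.0            # trainable critic weight and scalar eta
    for _ in range(steps):
        e = np.exp(w * x * y_perm)
        # loss = -E[f(x, y)] + E[exp(f(x, y_perm))] / eta + log(eta) - 1
        grad_w = -np.mean(x * y) + np.mean(e * x * y_perm) / eta
        grad_eta = -np.mean(e) / eta**2 + 1.0 / eta
        w -= lr * grad_w
        eta -= lr * grad_eta
    # plug the trained critic into the Donsker-Varadhan bound:
    # MI >= E[f(x, y)] - log E[exp(f(x, y_perm))]
    return np.mean(w * x * y) - np.log(np.mean(np.exp(w * x * y_perm)))

n = 4000
x = rng.standard_normal(n)
y_dep = 0.8 * x + 0.6 * rng.standard_normal(n)  # dependent on x
y_ind = rng.standard_normal(n)                  # independent of x

mi_dep = estimate_mi(x, y_dep)
mi_ind = estimate_mi(x, y_ind)
print(mi_dep, mi_ind)
```

On the dependent pair the estimate comes out clearly positive, while on the independent pair it stays near zero, so thresholding the estimate separates the two cases as the abstract describes. Minimizing the loss over η tightens the bound, since the optimal η equals the mean of exp(f) over the permuted pairs.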

Status: Application
Type: Utility
Filing date: 9 Mar 2020
Issue date: 16 Sep 2021