Rényi, Nagyterem

Description

Abstract:

Shared information has been proposed as a potential
measure of mutual dependence among multiple jointly
distributed discrete random variables. For two random
variables, it particularizes to Shannon's celebrated
and enormously consequential mutual information.
Properties of shared information and specific operational
meanings will be described. This talk is based on
joint works over the years with Imre Csiszár,
Sirin Nitinawarat, Himanshu Tyagi and Sagnik Bhattacharya.
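As a concrete reference point for the two-variable specialization mentioned in the abstract, Shannon's mutual information can be computed from a joint pmf via I(X;Y) = H(X) + H(Y) - H(X,Y). The following minimal Python sketch illustrates this identity; the function names are illustrative and not from the talk, which concerns the multi-variable generalization (shared information).

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability mass function given as a list."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint pmf as a 2-D list."""
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    pxy = [x for row in joint for x in row]   # flattened joint pmf
    return entropy(px) + entropy(py) - entropy(pxy)

# Perfectly correlated fair bits: I(X;Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # → 1.0
# Independent fair bits: I(X;Y) = 0 bits.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # → 0.0
```

For two variables this quantity coincides with shared information; the talk's results concern what replaces it when more than two jointly distributed random variables are involved.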