Program

Below you can see the schedule of the conference. Abstracts can be viewed by clicking on the talk titles. Below the schedule, the talks are listed once more in alphabetical order of the lecturers' names.

Schedule

Monday, June 4

9:00 Registration

9:30 Welcome and Introduction

9:45 - 10:25 Sergio Verdú: Imre Csiszár and Information Measures

10:30 - 11:10 Elisabeth Gassiat: Information Theory, Order Estimation and Likelihoods

11:10 - 11:30 Coffee break

11:30 - 12:10 Flemming Topsøe: Information and Games, Some Philosophical and Technical Aspects

12:15 - 12:55 Igal Sason: On Csiszár's f-Divergences and Informativities with Applications

12:55 - 14:45 Lunch break

14:45 - 15:25 János Körner: Learning from Csiszár

15:30 - 16:10 Michelle Effros: To Give a Bit of Information May Make a Real Difference

16:10 - 16:45 Coffee break

16:45 - 17:25 Péter Gács: Information-theoretic Relations in Algorithmic Information Theory

17:30 - 18:10 László Csirmaz: Fero Matúš' Work on the Shape of the Entropy Region


19:00 Banquet


Tuesday, June 5

9:30 - 10:10 Andrew Barron: Approximating Multi-layer Learning Networks

10:15 - 10:55 Thomas Breuer: Csiszár on Risk

10:55 - 11:20 Coffee break

11:20 - 12:00 Katalin Marton: Strong Entropy Contraction Property of Markov Kernels and its Application to Gibbs Samplers

12:05 - 12:45 Milán Mosonyi: Generalized Cutoff Rates in Quantum Information Theory

12:45 - 14:30 Lunch break

14:30 - 15:10 Prakash Narayan: Sampling Rate Distortion

15:15 - 15:55 Gábor Tusnády: Where Does Information Come from?

16:00 - 16:40 Raymond Yeung: Information Diagrams for Markov Random Fields



List of talks (in alphabetical order by lecturer)

Andrew Barron: Approximating Multi-layer Learning Networks

Thomas Breuer: Csiszár on Risk

László Csirmaz: Fero Matúš' Work on the Shape of the Entropy Region

Michelle Effros: To Give a Bit of Information May Make a Real Difference

Péter Gács: Information-theoretic Relations in Algorithmic Information Theory

Elisabeth Gassiat: Information Theory, Order Estimation and Likelihoods

János Körner: Learning from Csiszár

Katalin Marton: Strong Entropy Contraction Property of Markov Kernels and its Application to Gibbs Samplers

Milán Mosonyi: Generalized Cutoff Rates in Quantum Information Theory

Prakash Narayan: Sampling Rate Distortion

Igal Sason: On Csiszár's f-Divergences and Informativities with Applications

Flemming Topsøe: Information and Games, Some Philosophical and Technical Aspects

Gábor Tusnády: Where Does Information Come from?

Sergio Verdú: Imre Csiszár and Information Measures

Raymond Yeung: Information Diagrams for Markov Random Fields

