Statistical Laboratory

Publications

Entropy in Data Compression, Additive Combinatorics and Probability
L Gavalakis
(2022)
Information in probability: Another information-theoretic proof of a finite de Finetti theorem
L Gavalakis, I Kontoyiannis
(2022)
The Entropic Central Limit Theorem for Discrete Random Variables
L Gavalakis, I Kontoyiannis
– 2022 IEEE International Symposium on Information Theory (ISIT), vol. 2022-June, p. 708
(2022)
Information-theoretic de Finetti-style theorems
L Gavalakis, I Kontoyiannis
– 2022 IEEE Information Theory Workshop (ITW), p. 71
(2022)
An information-theoretic proof of a finite de Finetti theorem
L Gavalakis, I Kontoyiannis
– Electronic Communications in Probability, vol. 26, p. 1
(2021)
Entropy and the Discrete Central Limit Theorem
L Gavalakis, I Kontoyiannis
(2021)
An Information-Theoretic Proof of a Finite de Finetti Theorem
L Gavalakis, I Kontoyiannis
(2021)
Fundamental Limits of Lossless Data Compression With Side Information
L Gavalakis, I Kontoyiannis
– IEEE Transactions on Information Theory, vol. 67, p. 2680
(2021)
Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information
L Gavalakis, I Kontoyiannis
– Entropy (Basel), vol. 22, E705
(2020)
Lossless Data Compression with Side Information: Nonasymptotics and Dispersion
L Gavalakis, I Kontoyiannis
– IEEE International Symposium on Information Theory - Proceedings, p. 2179
(2020)

Frontpage talks

13 Mar, 16:00 - 17:00: Title to be confirmed
Peter Whittle Lecture

Research Group: Statistical Laboratory
Room: D2.06