ITW 2015


Andrew R. Barron, Yale Univ., Statistics Dept.

This presentation explores information theory and statistics themes in topics selected from among the following: penalized likelihood concentration and risk bounds derivable from information theory inequalities; greedy algorithms for vertex selection for projection onto convex hulls (and associated algorithms for term selection in regression and neural nets); information-theoretic characterization of minimax risk asymptotics; provably fast capacity-achieving codes for the additive white Gaussian noise channel; general entropy power inequalities and entropic central limit theorems; and monotonicity of relative entropy in Markov chains, with speculations concerning a discrete-time, collision-based theory of entropy in statistical mechanics.