A proof of Shannon's source coding theorem is given using results from large deviation theory. In particular, Sanov's theorem on convergence rates for empirical distributions is invoked to obtain the key large deviation result. This result is used directly to prove the source coding theorem for discrete memoryless sources. It is then shown how the theorem extends to ergodic Polish-space-valued sources and continuous distortion measures.
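The large deviation rate behind the abstract can be illustrated numerically: Sanov's theorem implies that for i.i.d. Bernoulli(p) samples, the probability that the empirical frequency reaches q > p decays like exp(-n D(q||p)), where D is the Kullback-Leibler divergence. The sketch below (a hand-rolled illustration, not the paper's construction; the Bernoulli parameters p = 0.3 and q = 0.5 are arbitrary) computes the exact binomial tail probability and checks that its normalized log approaches D(q||p):

```python
import math

def kl(q, p):
    # Kullback-Leibler divergence D(Bernoulli(q) || Bernoulli(p))
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def log_binom_tail(n, p, q):
    # log P(S_n >= q*n) for S_n ~ Binomial(n, p), via a log-sum-exp
    # over the exact binomial point masses (lgamma gives log binomials)
    k0 = math.ceil(q * n)
    logs = [
        math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        + k * math.log(p) + (n - k) * math.log(1 - p)
        for k in range(k0, n + 1)
    ]
    m = max(logs)
    return m + math.log(sum(math.exp(x - m) for x in logs))

p, q = 0.3, 0.5
for n in (100, 1000, 5000):
    rate = -log_binom_tail(n, p, q) / n
    # the empirical rate should converge to D(q||p) ~ 0.0872 as n grows
    print(n, rate, kl(q, p))
```

By the Chernoff bound the finite-n rate always lies at or above D(q||p), and the gap shrinks on the order of (log n)/n, which the printed values exhibit.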