Dr Richard Earl, Worcester College, University of Oxford
How much information is there in being told the roll of a die or the colour of a person's eyes? A mathematical theory of information dates back to Claude Shannon's seminal 1948 paper. "Shannon entropy" measures how much information is being conveyed (e.g. by a person speaking) and can be shown to be a lower bound on how quickly that information can be encoded and transmitted. The talk also discusses some of the main issues of coding: optimal codes, instantaneous codes, error-correcting codes, and a little cryptography.
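As a small illustration of the quantity the talk is built around (a sketch, not material from the talk itself), the Shannon entropy of a distribution is H = -Σ p·log₂(p), measured in bits. A fair die gives log₂(6) ≈ 2.585 bits per roll, while a skewed distribution, such as illustrative (made-up) eye-colour frequencies, carries less:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair six-sided die: each of the 6 outcomes has probability 1/6.
fair_die = [1 / 6] * 6
print(shannon_entropy(fair_die))  # log2(6), about 2.585 bits

# Hypothetical eye-colour frequencies (illustrative numbers, not real data):
# a skewed distribution conveys fewer bits per observation than a uniform one.
eye_colours = [0.5, 0.3, 0.15, 0.05]
print(shannon_entropy(eye_colours))
```

The entropy value is the average number of bits needed per symbol, which is why it lower-bounds the rate of any encoding scheme.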