01_The philosophy behind information theory
Carpe Tu Black Whistle

Preface

  1. Lectures: the online course by Po-Ning Chen (陈伯宁), National Chiao Tung University, Taiwan.
  2. Main proof technique: the law of large numbers.
  3. Textbook: An Introduction to Single-User Information Theory, Fady Alajaji and Po-Ning Chen.

Shannon’s theorems

  • Shannon's first theorem -> source coding theorem
  • Shannon's second theorem -> channel coding theorem
  • Shannon's third theorem -> rate-distortion theorem
  • Information transmission theorem (the centerpiece) == joint source-channel coding theorem

Overview

Channel Coding

This module covers the Channel Coding Theorem (the second theorem).

Claude Shannon gave a conclusive result.

Channel capacity: transmission over a communication channel has an upper limit; that capacity cannot be exceeded, no matter how technology advances.
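For reference, the usual formal definition of channel capacity (the figure that presumably contained it is missing from these notes) is the largest mutual information between channel input and output over all input distributions:

$$ C \;=\; \max_{P_X} I(X;Y). $$

Reliable transmission is possible at any rate below $C$ and impossible at any rate above it.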


Source Entropy


The limit of lossless compression of a random source is the source entropy.

In Shannon's original paper this result is labeled Theorem 2, but it is what is now called Shannon's First Theorem.

Rate-Distortion


Information compression may have to incur some loss so that the transmitted entropy stays below the channel capacity; the relationship between the loss (distortion) and the transmission rate is the rate-distortion function.
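This trade-off has a standard formalization, the rate-distortion function (not shown in the missing figure): for a distortion measure $d$ and a tolerated average distortion $D$,

$$ R(D) \;=\; \min_{P_{\hat{Z}\mid Z}:\ \mathbb{E}[d(Z,\hat{Z})]\le D} I(Z;\hat{Z}), $$

the minimum rate at which the source can be described while keeping the expected distortion at most $D$.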

Random Coding Argument

  1. A brand-new style of proof introduced by Shannon.
  2. It only proves the existence of an optimal transmission scheme; it does not construct one.
  3. If the expected error probability over a random ensemble of codes approaches 0, then there must exist a specific code whose error probability -> 0 (see the inequality below).
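Item 3 rests on a simple observation, stated here for completeness: the best codebook can never do worse than the ensemble average, so

$$ \min_{\mathcal{C}} P_e(\mathcal{C}) \;\le\; \mathbb{E}_{\mathcal{C}}\big[P_e(\mathcal{C})\big] \;\longrightarrow\; 0, $$

and hence some specific code also has vanishing error probability.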

Introduction

Information

  • What is information?
    • Uncertainty
      • Information is a message that is previously uncertain to its receiver.
  • Representation of information
    • After obtaining information, one may wish to store it or convey it; this raises the question:

How to represent information for ease of storing it or for ease of conveying it?

Representation of information


Code: like a language, it is an arrangement of symbols drawn from a finite symbol set.
Different codes are analogous to different human languages.

Dictionary and Codebook

Sometimes the "book" is dropped, i.e., "codebook" is shortened to "code".

  • Assumptions made by both transmitter and receiver about the symbolized information:
    • All “possible symbols” of the conveyed information are a priori known.
    • The receiver is only uncertain about which symbol is going to be received.
  • Example. In a conversation using English,
    • it is a priori known that one of the vocabulary items in an English dictionary is going to be spoken;
    • one just cannot tell which one before its reception.
  • Example. In coded digital communications,
    • the codebook (or simply code), i.e., the collection of all possible concatenations of pre-defined symbols, is always a priori known (to the receiver).
    • The receiver is only uncertain about which codeword is going to be received.

Compactness

"Compactness" is a rather classical term; it corresponds to what is now called compression.


How to find the most compact code?


  • Derive the minimum achievable average codeword length over all possible codes, then construct a code that attains this minimum (see the sketch below).
  • This requires a way to measure how much information is to be transmitted.
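The notes do not name a construction here; Huffman's algorithm (not part of the original text) is the classical way to build a binary prefix code with minimum average codeword length for a known distribution. The sketch below uses hypothetical symbol probabilities, chosen to be dyadic so the optimum is met exactly.

```python
import heapq

def huffman_code(probs):
    """Build a binary prefix code with minimum average length for a known distribution."""
    # Each heap entry is (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # two least likely subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}   # prepend a bit on merge
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next_id, merged))
        next_id += 1
    return heap[0][2]

# Hypothetical dyadic source, so the code length matches the entropy exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code)      # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(avg_len)   # 1.75 bits per source symbol
```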

How to measure information

Information is a function of probability.


Axioms

  1. Probability dependence: the less likely an event is, the more information it carries.
  2. Additivity: the information carried by independent events equals the sum of their individual information.
  3. Continuity: a small change in an event's probability causes only a small change in its information content.

The only “measure” satisfying these axioms is:


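The measure in the missing image is, in its standard form, the self-information of an event (unique up to the choice of logarithm base, which only fixes the unit):

$$ I(E) \;=\; -\log_b P(E). $$

Additivity for independent events is immediate, since $-\log_b P(A)P(B) = -\log_b P(A) - \log_b P(B)$; averaging $I$ over the outcomes of a random variable $X$ gives the entropy $H(X) = -\sum_x P_X(x)\log_b P_X(x)$.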

  • If Shannon's statement is true, then the two definitions of information content below are equivalent:
    • (Engineering view) The average codeword length of the most compact code representing the information.
    • (Probabilistic view) The entropy of the information.
  • In 1948 Shannon proved that the above two views are indeed equivalent (under some constraints), i.e., the minimum average codeword length of a source-descriptive code equals the entropy of the source.

One can compute the entropy of a source and be assured that, if the average codeword length of a code equals the source entropy, the code is optimal.
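A quick worked check of this criterion, using a hypothetical dyadic source (the same probabilities as in the Huffman sketch above):

$$ P = \left(\tfrac12,\tfrac14,\tfrac18,\tfrac18\right), \qquad H = \tfrac12\cdot 1 + \tfrac14\cdot 2 + \tfrac18\cdot 3 + \tfrac18\cdot 3 = 1.75 \text{ bits.} $$

The prefix code $\{0, 10, 110, 111\}$ also has average length $1.75$ bits, matching the entropy, so by the statement above it is optimal.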

Contribution of Shannon

  • Shannon's work laid the foundation for the field of information theory.
  • His work indicates that the mathematical results of information theory can serve as a guide for the development of information manipulation systems.


  1. The measure of information is defined based on the definition of compactness.
  2. The definition of the measure of code compactness may be application-dependent.


Although Shannon gave a conclusive result, there are still many application-dependent cases.


  • To combat noise, coding redundancy is introduced.
  • If 3 bits are transmitted for every information bit, the transmission efficiency drops to 1/3 information symbol per channel usage (see the sketch below).
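One simple way to realize the 3-bits-per-information-bit example is a rate-1/3 repetition code with majority-vote decoding; the notes do not specify the code, so the repetition code and the crossover probability 0.1 below are assumptions made only for illustration.

```python
import random

def encode(bits):
    """Repetition code: transmit each information bit 3 times (rate = 1/3)."""
    return [b for b in bits for _ in range(3)]

def channel(bits, p=0.1):
    """Binary symmetric channel: each transmitted bit is flipped with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote over each group of 3 received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(0)
msg = [random.randint(0, 1) for _ in range(10_000)]
decoded = decode(channel(encode(msg)))
bit_error_rate = sum(m != d for m, d in zip(msg, decoded)) / len(msg)
# Uncoded, the error rate would be about p = 0.1; with the repetition code it
# drops to roughly 3p^2 - 2p^3 = 0.028, at the cost of 3 channel uses per bit.
print(bit_error_rate)
```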

Information transmission

The "coder" blocks in the transmission diagram are all, strictly speaking, encoders.

  • The source encoder compresses away redundant information, ideally all the way down to the entropy.
  • When the channel encoder and the source encoder are designed together as one unit, the combination is called a joint source-channel encoder.

Shannon’s Separate Design

When the two encoders are each optimal on their own, the overall joint encoder is optimal.

  • Source encoder
    • Find the most compact representation of the informative message.
  • Channel encoder
    • According to the noise pattern, add redundancy so that the source code bits can be reliably transmitted.


The source encoder compresses the data, squeezing out redundant information.
The channel encoder adds structured redundancy to combat noise.

compression rate

For the source encoder, the system designer wishes to minimize the number of U's required to represent one Z, i.e., to minimize the compression rate defined below.
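The displayed definition is missing from the notes; presumably it is the ratio of code symbols to source symbols, written here with $m$ source symbols $Z$ represented by $n$ code symbols $U$:

$$ \text{compression rate} \;=\; \frac{n}{m} \quad \text{(code symbols per source symbol).} $$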

The lower the compression rate, the better.

  • Shannon tells us that (for i.i.d. Z's):
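The displayed bound is also missing from the notes; the standard statement of the lossless source coding theorem for i.i.d. Z's that the following remark refers to is presumably

$$ \frac{n}{m} \;\ge\; H_{|\mathcal{U}|}(Z) \;=\; -\sum_{z} P_Z(z)\,\log_{|\mathcal{U}|} P_Z(z). $$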

In the formula above, the base of the logarithm is the size of the alphabet of the code symbols U; for the alphabet {0,1} the base is 2.

  1. Under the best compression method, the entropy of the compressed message equals 1 per code symbol.
  2. The output of the best compression code is uniformly distributed.


  • Conversely, if the output of a code is not uniformly distributed, it cannot be the best compression code (its entropy is < 1).

transmission rate

The output of the channel encoder is a sequence of channel symbols X.


The channel code rate is the number of U's represented by each X.
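In symbols (a standard definition, consistent with the earlier 1/3-information-symbol-per-channel-usage example): if $k$ code symbols $U$ are carried by $n$ channel symbols $X$, then

$$ \text{channel code rate} \;=\; \frac{k}{n} \quad \text{(information symbols per channel usage).} $$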