TY - GEN

T1 - On the entropy rate of word-valued sources

AU - Timo, R.

AU - Blackmore, K.

AU - Hanlen, L.

PY - 2007

Y1 - 2007

N2 - A word-valued source Y is a discrete finite-alphabet random process created by encoding a discrete random process X with a symbol-to-word function f. In information theory (in particular source coding), it is of interest to know which word-valued sources possess an entropy rate H̄(Y). Nishiara and Morita showed that if X is independent and identically distributed and f is prefix free, then H̄(Y) exists and equals H̄(X) divided by the expected codeword length. This "conservation of entropy" result was later extended by Goto, Matsushima and Hirasawa to include stationary and ergodic X. In this paper, we extend these results to ergodic and Asymptotically Mean Stationary (AMS) X: if X is AMS, then H̄(Y) equals the expectation of the entropy rate of each stationary ergodic sub-source of X divided by the expected codeword length of that sub-source. The second result of this paper solves an open problem concerning the existence of H̄(Y) when f is not prefix free: if X is AMS and f is not prefix free, then H̄(Y) exists and is upper bounded by the expectation of the entropy rate of each stationary ergodic sub-source of X divided by the expected codeword length of that sub-source. The theoretical results presented in this paper may be applied to problems in source coding, telecommunications and networking.

AB - A word-valued source Y is a discrete finite-alphabet random process created by encoding a discrete random process X with a symbol-to-word function f. In information theory (in particular source coding), it is of interest to know which word-valued sources possess an entropy rate H̄(Y). Nishiara and Morita showed that if X is independent and identically distributed and f is prefix free, then H̄(Y) exists and equals H̄(X) divided by the expected codeword length. This "conservation of entropy" result was later extended by Goto, Matsushima and Hirasawa to include stationary and ergodic X. In this paper, we extend these results to ergodic and Asymptotically Mean Stationary (AMS) X: if X is AMS, then H̄(Y) equals the expectation of the entropy rate of each stationary ergodic sub-source of X divided by the expected codeword length of that sub-source. The second result of this paper solves an open problem concerning the existence of H̄(Y) when f is not prefix free: if X is AMS and f is not prefix free, then H̄(Y) exists and is upper bounded by the expectation of the entropy rate of each stationary ergodic sub-source of X divided by the expected codeword length of that sub-source. The theoretical results presented in this paper may be applied to problems in source coding, telecommunications and networking.

UR - http://www.scopus.com/inward/record.url?scp=58149178179&partnerID=8YFLogxK

U2 - 10.1109/ATNAC.2007.4665292

DO - 10.1109/ATNAC.2007.4665292

M3 - Conference contribution

SN - 1424415578

SN - 9781424415571

T3 - 2007 Australasian Telecommunication Networks and Applications Conference, ATNAC 2007

SP - 377

EP - 382

BT - 2007 Australasian Telecommunication Networks and Applications Conference, ATNAC 2007

PB - IEEE Computer Society

T2 - 2007 Australasian Telecommunication Networks and Applications Conference, ATNAC 2007

Y2 - 2 December 2007 through 5 December 2007

ER -