TY - GEN
T1 - Implicit channel estimation for ML sequence detection over finite-state Markov communication channels
AU - Krusevac, Zarko B.
AU - Kennedy, Rodney A.
AU - Rapajic, Predrag B.
PY - 2006
Y1 - 2006
N2 - This paper shows the existence of optimal training, in terms of achievable mutual information rate, for an output feedback implicit estimator for finite-state Markov communication channels. Implicit (blind) estimation is based on a measure of how much the input distribution is modified when filtered by the channel transfer function, and it is shown that an input distribution with maximum entropy rate undergoes no modification. Reducing the input signal entropy rate enables implicit (blind) channel process estimation, but decreases the information transmission rate. The optimal input entropy rate (optimal implicit training rate), which achieves the maximum mutual information rate, is found.
AB - This paper shows the existence of optimal training, in terms of achievable mutual information rate, for an output feedback implicit estimator for finite-state Markov communication channels. Implicit (blind) estimation is based on a measure of how much the input distribution is modified when filtered by the channel transfer function, and it is shown that an input distribution with maximum entropy rate undergoes no modification. Reducing the input signal entropy rate enables implicit (blind) channel process estimation, but decreases the information transmission rate. The optimal input entropy rate (optimal implicit training rate), which achieves the maximum mutual information rate, is found.
UR - http://www.scopus.com/inward/record.url?scp=33750943015&partnerID=8YFLogxK
M3 - Conference contribution
SN - 1424402131
SN - 9781424402137
T3 - Proceedings - 7th Australian Communications Theory Workshop, 2006
SP - 130
EP - 136
BT - Proceedings - 7th Australian Communications Theory Workshop, 2006
T2 - 7th Australian Communications Theory Workshop, 2006
Y2 - 1 February 2006 through 3 February 2006
ER -