Implicit channel estimation for ML sequence detection over finite-state Markov communication channels

Zarko B. Krusevac*, Rodney A. Kennedy, Predrag B. Rapajic

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)

    Abstract

    This paper shows the existence of optimal training, in terms of the achievable mutual information rate, for an output-feedback implicit estimator over finite-state Markov communication channels. Implicit (blind) estimation is based on a measure of how much the input distribution is modified when filtered by the channel transfer function, and it is shown that an input distribution with maximum entropy rate undergoes no modification. Reducing the input signal entropy rate enables implicit (blind) channel process estimation, but decreases the information transmission rate. The optimal input entropy rate (the optimal implicit training rate), which achieves the maximum mutual information rate, is found.
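    A minimal sketch of the abstract's central observation, using a memoryless binary symmetric channel as a hypothetical stand-in for the finite-state Markov channels studied in the paper (the function name and parameter values are illustrative assumptions, not the paper's estimator): a maximum-entropy (uniform) input leaves the output distribution unchanged for every crossover probability p, so the output reveals nothing about the channel, whereas a biased (reduced-entropy) input makes the output distribution depend on p, which is what makes implicit (blind) estimation possible.

    ```python
    def output_prob_one(q: float, p: float) -> float:
        """P(Y=1) when P(X=1) = q and the channel flips each bit with probability p."""
        return q * (1.0 - p) + (1.0 - q) * p

    # Maximum-entropy input (q = 0.5): the output distribution is identical
    # for every crossover probability, so it carries no information about p.
    uniform = [output_prob_one(0.5, p) for p in (0.05, 0.1, 0.3)]
    print(uniform)  # [0.5, 0.5, 0.5]

    # Reduced-entropy input (q = 0.8): the output distribution now varies
    # with p, enabling blind estimation at the cost of transmission rate.
    biased = [output_prob_one(0.8, p) for p in (0.05, 0.1, 0.3)]
    print(biased)  # [0.77, 0.74, 0.62]
    ```

    The tradeoff the paper optimizes follows directly: the further q moves from 0.5, the more the output distribution separates across channel states (better estimation), but the lower the input entropy rate (less information transmitted).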

    Original language: English
    Title of host publication: Proceedings - 7th Australian Communications Theory Workshop, 2006
    Pages: 130-136
    Number of pages: 7
    Publication status: Published - 2006
    Event: 7th Australian Communications Theory Workshop, 2006 - Perth, Australia
    Duration: 1 Feb 2006 - 3 Feb 2006

    Publication series

    Name: Proceedings - 7th Australian Communications Theory Workshop, 2006
    Volume: 2006

    Conference

    Conference: 7th Australian Communications Theory Workshop, 2006
    Country/Territory: Australia
    City: Perth
    Period: 1/02/06 - 3/02/06
