Channel equalization and the Bayes point machine

Edward Harrington*, Jyrki Kivinen, Robert C. Williamson

*Corresponding author for this work

    Research output: Contribution to journal › Conference article › peer-review

    Abstract

    Equalizers trained with a large margin are better able to handle noise in unseen data and drift in the target solution. We present a method of approximating the Bayes optimal strategy that yields a large-margin equalizer, the Bayes point equalizer. The Bayes point is estimated by averaging N equalizers, each run on an independently chosen subset of the data. To improve this estimate, we investigate two methods of creating diversity among the N equalizers. We show experimentally that, for appropriately large step sizes, the Bayes point equalizer improves on LMS and LMA in the presence of channel noise and training-sequence errors, allowing shorter training sequences at the cost of higher computational requirements.
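
    The averaging scheme described in the abstract can be illustrated with a short sketch: train N LMS-style linear equalizers on independently chosen segments of the training data and average their weight vectors to approximate the Bayes point. The filter length, step size, subset-selection rule, and the omission of a decision delay below are illustrative assumptions, not the paper's exact configuration.

    ```python
    # Minimal sketch of the averaging idea: N LMS equalizers trained on
    # independently chosen subsets, averaged to approximate the Bayes point.
    # All parameter values here are assumptions for illustration only.
    import numpy as np

    def lms_equalizer(x, d, taps=11, step=0.05):
        """Train a linear transversal equalizer with the LMS update rule."""
        w = np.zeros(taps)
        for n in range(taps, len(x)):
            u = x[n - taps:n][::-1]   # regressor, most recent sample first
            e = d[n] - w @ u          # error against the training symbol
            w += step * e * u         # LMS weight update
        return w

    def bayes_point_equalizer(x, d, n_members=20, subset_frac=0.7, seed=None):
        """Average N equalizers, each trained on an independent random subset."""
        rng = np.random.default_rng(seed)
        seg_len = int(subset_frac * len(x))
        weights = []
        for _ in range(n_members):
            # pick a contiguous random segment as this member's training subset
            start = rng.integers(0, len(x) - seg_len)
            w = lms_equalizer(x[start:start + seg_len], d[start:start + seg_len])
            weights.append(w)
        return np.mean(weights, axis=0)  # averaged weights approximate the Bayes point
    ```

    The averaged weight vector is then used as a single equalizer at test time; diversity among the N members comes here only from the random choice of training segments, whereas the paper investigates two dedicated methods for creating that diversity.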

    Original language: English
    Pages (from-to): 493-496
    Number of pages: 4
    Journal: Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing
    Volume: 4
    Publication status: Published - 2003
    Event: 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing - Hong Kong, Hong Kong
    Duration: 6 Apr 2003 - 10 Apr 2003
