Abstract
Equalizers trained with a large margin better handle noise in unseen data and drift in the target solution. We present a method of approximating the Bayes optimal strategy that yields a large-margin equalizer, the Bayes point equalizer. The Bayes point is estimated by averaging N equalizers, each run on an independently chosen subset of the data. To better estimate the Bayes point, we investigate two methods for creating diversity amongst the N equalizers. We show experimentally that, for appropriately large step sizes, the Bayes point equalizer improves on LMS and LMA in the presence of channel noise and training sequence errors. This allows for shorter training sequences, albeit with higher computational requirements.
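The averaging scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a linear tap-delay-line equalizer trained by standard LMS, with diversity created by training each of the N equalizers on an independently chosen contiguous segment of the training sequence (the hypothetical `subset_frac` parameter controls segment length); the averaged weight vector serves as the Bayes point estimate.

```python
import numpy as np

def lms_equalizer(x, d, n_taps=5, mu=0.05):
    """Train a linear equalizer on received samples x against the
    known training symbols d using the LMS algorithm."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]  # tap-delay-line input
        e = d[n] - w @ u                      # instantaneous error
        w += mu * e * u                       # LMS weight update
    return w

def bayes_point_equalizer(x, d, n_models=10, subset_frac=0.5,
                          n_taps=5, rng=None):
    """Estimate the Bayes point by averaging n_models LMS equalizers,
    each trained on an independently chosen segment of the data."""
    rng = np.random.default_rng(rng)
    m = int(subset_frac * len(x))
    weights = []
    for _ in range(n_models):
        # independently chosen contiguous subset (keeps x and d aligned)
        start = rng.integers(0, len(x) - m)
        weights.append(lms_equalizer(x[start:start + m],
                                     d[start:start + m], n_taps))
    return np.mean(weights, axis=0)  # averaged (Bayes point) weights
```

As a usage sketch, BPSK training symbols can be passed through a mild ISI channel and the averaged equalizer applied via convolution; averaging smooths out the run-to-run variation of the individual LMS solutions.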
Original language | English |
---|---|
Pages (from-to) | 493-496 |
Number of pages | 4 |
Journal | Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing |
Volume | 4 |
Publication status | Published - 2003 |
Event | 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing - Hong Kong, Hong Kong Duration: 6 Apr 2003 → 10 Apr 2003 |