Abstract
Orthogonal frequency division multiplexing (OFDM) schemes are being considered for both optical fibre and wireless optical communications, mainly because of their ability to cope with the high inter-symbol interference levels associated with dispersive channels. DC-biased optical OFDM (DCO-OFDM) is a promising technique that meets the requirements of intensity-modulated OFDM schemes. To reduce the required DC component, OFDM signals need to be subjected to asymmetric clipping. However, the resulting non-linear distortion effects can lead to significant performance degradation and/or undesirable out-of-band radiation. Therefore, it is important to evaluate the impact of asymmetric clipping on DC-biased optical OFDM signals. In this work, the authors take advantage of the Gaussian-like nature of OFDM signals to study analytically the impact of an asymmetric clipping operation on the performance of DCO-OFDM schemes. The authors present accurate theoretical expressions for the power spectral density and the non-linear distortion at the subcarrier level, which can be used to obtain the bit error rate of conventional receivers. They also study the optimum performance of these non-linear OFDM schemes and show that, with optimum maximum-likelihood detection, the non-linear distortion, instead of degrading performance, can actually improve it, even allowing these schemes to outperform conventional, linear OFDM schemes.
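As context for the clipping operation the abstract refers to, the following is a minimal sketch (not taken from the paper) of how a DCO-OFDM symbol is typically formed: QPSK data are mapped with Hermitian symmetry so the IFFT output is real-valued, a DC bias is added, and any samples that remain negative are clipped at zero. The function name, the 64-subcarrier default, the QPSK mapping and the bias-in-dB convention are illustrative assumptions, not the authors' exact system.

```python
import numpy as np

def dco_ofdm_symbol(n_subcarriers=64, bias_db=7.0, rng=None):
    """Generate one DCO-OFDM time-domain symbol with asymmetric (zero-level) clipping.

    Hypothetical sketch: parameter names and conventions are assumptions,
    not the system model used in the paper.
    """
    rng = np.random.default_rng() if rng is None else rng

    # QPSK data on the usable subcarriers (half of the band minus DC/Nyquist),
    # mirrored with Hermitian symmetry so the IFFT output is real-valued.
    n_data = n_subcarriers // 2 - 1
    bits = rng.integers(0, 2, size=(n_data, 2))
    qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

    spectrum = np.zeros(n_subcarriers, dtype=complex)
    spectrum[1:n_data + 1] = qpsk
    spectrum[-n_data:] = np.conj(qpsk[::-1])            # Hermitian symmetry

    # For large N the time-domain samples are approximately Gaussian,
    # which is the property the analytical study exploits.
    x = np.fft.ifft(spectrum).real * np.sqrt(n_subcarriers)

    # DC bias proportional to the signal's standard deviation (bias expressed
    # in dB, a common DCO-OFDM convention), then clip residual negative samples.
    sigma = np.std(x)
    bias = sigma * np.sqrt(10 ** (bias_db / 10) - 1)
    x_dco = np.clip(x + bias, 0.0, None)                # asymmetric (zero-level) clipping
    return x_dco

if __name__ == "__main__":
    sym = dco_ofdm_symbol()
    print(f"min={sym.min():.3f}, mean={sym.mean():.3f}")  # non-negative intensity signal
```

A smaller bias (lower `bias_db`) reduces the optical power penalty but clips more samples, which is exactly the non-linear distortion whose spectral and subcarrier-level effects the paper characterises analytically.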
| Original language | English |
| --- | --- |
| Pages (from-to) | 969-974 |
| Number of pages | 6 |
| Journal | IET Communications |
| Volume | 9 |
| Issue number | 7 |
| DOIs | |
| Publication status | Published - 7 May 2015 |
Keywords
- OFDM SIGNALS
- SYSTEMS
- COMMUNICATION
- CHANNELS