The required block length in orthogonal frequency division multiplexing (OFDM) schemes can be very large for severely time-dispersive channels. This means that the channel variations from block to block, or even within a given block, can be substantial. In this paper we consider OFDM systems with fast-varying channels where the variations are due to phase/frequency errors between the local oscillators at the transmitter and the receiver, as well as to the effects associated with non-constant Doppler drifts. The overall phase error can be approximated by a low-degree Taylor polynomial. Suitable training sequences are multiplexed with the data blocks and used to estimate the Taylor coefficients of the phase error. These coefficients are then used to predict the evolution of the phase error over subsequent data blocks, and the predicted error is compensated before detection. Our performance results show that the proposed phase-prediction method can yield significant gains in the presence of strong, nonlinear phase errors, while reducing the required frequency of training blocks.
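The prediction-and-compensation idea can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes a hypothetical quadratic phase-error model (constant offset, frequency error, and a Doppler-drift term), fits its coefficients from noisy phase estimates at a few training blocks, extrapolates the fit to later data blocks, and de-rotates the received samples before detection. All numerical values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed phase-error model: theta(t) = c0 + c1*t + c2*t^2, i.e. a
# degree-2 Taylor polynomial (c1 ~ frequency offset, c2 ~ Doppler drift).
c_true = np.array([0.30, 0.10, 0.005])  # hypothetical true coefficients

def phase_error(t):
    """Phase error (radians) at block time t under the assumed model."""
    return c_true[0] + c_true[1] * t + c_true[2] * t**2

# Training blocks at times 0..4 provide noisy phase estimates.
t_train = np.arange(5, dtype=float)
theta_est = phase_error(t_train) + 0.001 * rng.standard_normal(5)

# Fit the low-degree Taylor polynomial to the training estimates.
coeffs = np.polyfit(t_train, theta_est, deg=2)

# Predict the phase error for the subsequent data blocks (times 5..9)
# and compensate it by de-rotating the received samples before detection.
t_data = np.arange(5, 10, dtype=float)
theta_pred = np.polyval(coeffs, t_data)

symbols = np.ones(5, dtype=complex)  # placeholder data symbols
received = symbols * np.exp(1j * phase_error(t_data))
compensated = received * np.exp(-1j * theta_pred)

# Residual phase after compensation should be small.
residual = np.angle(compensated / symbols)
print("max residual phase (rad):", np.max(np.abs(residual)))
```

Because the polynomial is extrapolated beyond the training interval, the residual grows with the prediction horizon, which is why the training-block spacing trades off against the achievable compensation accuracy.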