Calibration methodologies for multi-giga sample per second ADCs
Time-interleaving analog-to-digital converters allows for increased sampling rates at the cost of added signal distortion. This article considers a system in which timing jitter is counteracted by a track-and-hold module, and investigates existing calibration methods for offset and gain mismatch that either make use of an additive calibration sequence, calculate and compare subchannel energies, or bandlimit the input signal. Modifications are made to each of the algorithms to allow more versatile systems or higher performance.

The performance of a calibration method is judged by its average estimation error, its possible worst-case outcomes, and their impact on the system's stability. A number of modifications that aim to reduce the mean square error or improve the estimator's diversity are introduced. The various systems are tested using both white-noise inputs and inputs containing several sinusoidal components.

The numerical simulations illustrate some of the difficulties involved in designing a blind or partially blind gain calibration system compared to one relying on a training signal. For the method using a training signal, an algorithm designed to resemble a maximum a posteriori estimator is shown to offer increased performance. The modifications to the gain calibration method are shown to be moderately beneficial, depending on the situation. The calibration system that relies on oversampling is modified so that it can be extended to calibrate gain mismatch for any even number of ADCs, which is shown to work under certain circumstances.
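To make the mismatch model concrete, the following is a minimal sketch (not the article's actual algorithms) of a hypothetical two-channel time-interleaved ADC with per-channel offset and gain errors, driven by a white-noise input. The offset estimate is simply each subchannel's sample mean, and the gain ratio is estimated by comparing subchannel energies, which is the general idea behind the energy-comparison class of methods mentioned above. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed mismatch values for a 2-channel time-interleaved ADC (illustrative only).
N = 1 << 16
x = rng.standard_normal(N)        # white-noise input: zero mean, unit power

gains = np.array([1.0, 1.02])     # per-channel gain mismatch
offsets = np.array([0.0, 0.01])   # per-channel offset mismatch

# Interleave: even samples go through channel 0, odd samples through channel 1.
y = np.empty_like(x)
for ch in range(2):
    y[ch::2] = gains[ch] * x[ch::2] + offsets[ch]

# Offset estimate: per-channel sample mean (valid since the input is zero-mean).
off_est = np.array([y[ch::2].mean() for ch in range(2)])

# Gain-ratio estimate: compare subchannel energies after offset removal.
energy = np.array([np.mean((y[ch::2] - off_est[ch]) ** 2) for ch in range(2)])
gain_ratio_est = np.sqrt(energy[1] / energy[0])

print(off_est)         # close to [0.0, 0.01]
print(gain_ratio_est)  # close to 1.02
```

Note that this blind estimate relies on the two subchannels seeing statistically identical inputs; as the abstract indicates, inputs with strong sinusoidal components make such blind gain estimation considerably harder than the training-signal approach.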