Enhancing Nutt-Based Time-to-Digital Converter Performance with Internal Systematic Averaging

When time intervals need to be measured precisely, a time-to-digital converter (TDC) often relies on sophisticated, multilevel, sub-gate-delay structures. Improving the resolution pays off, however, only until integral nonlinearity (INL) and random jitter begin to limit the measurement performance. INL can then be minimized with calibration techniques and result postprocessing. A TDC architecture based on a counter and timing-signal interpolation (the Nutt method) makes it possible to measure long time intervals precisely. It also offers an effective means of improving precision by averaging. Traditional averaging, however, demands several successive measurements, which increases the measurement time and power consumption. It is shown here that by using several interpolators that are sampled homogeneously over the clock period, the effects of limited resolution, interpolation nonlinearities and random noise can be markedly reduced. The designed CMOS TDC, which utilizes this internal systematic sampling technique, achieves 3.0 ps rms single-shot precision without any additional calibration or nonlinearity correction.
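For orientation, the underlying relations can be sketched as follows (the notation here is ours, not taken from the abstract): in the Nutt method the measured interval T is split into an integer number N of reference clock periods T_0 plus two interpolated fractions, t_start from the start pulse to a following clock edge and t_stop from the stop pulse to a following clock edge. With M interpolator channels whose sampling phases are spread homogeneously over the clock period, a single shot already yields M results that can be averaged:

\[
T = N\,T_0 + t_{\mathrm{start}} - t_{\mathrm{stop}},
\qquad
\hat{T} = \frac{1}{M}\sum_{m=1}^{M}\bigl(N_m\,T_0 + t_{\mathrm{start},m} - t_{\mathrm{stop},m}\bigr).
\]

For uncorrelated error sources, quantization error and random jitter in \(\hat{T}\) shrink roughly as \(1/\sqrt{M}\), and because the M samples cover the clock period systematically, interpolator INL also tends to average out within one measurement rather than over many successive ones.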