Total Jitter Measurement at Low Probability Levels, Using Optimized BERT Scan Method

Application Note

Jitter, in the context of high-speed digital data transmission, is usually defined as the deviation of a digital signal's decision-threshold crossing time from its ideal value. Jitter describes a timing uncertainty and therefore has to be considered in the timing budget of a design. In one sense, jitter is just another component that makes part of the bit period unusable for sampling, much like setup and hold times. However, unlike setup and hold times, which are usually thoroughly specified for logic families and can be taken from data sheets, jitter is a function of the design and has to be measured.
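
To illustrate this definition, the short Python sketch below computes per-edge jitter as the deviation of measured threshold-crossing times from an ideal bit-period grid. The bit period and crossing times are hypothetical example values chosen for illustration, not results from any particular measurement.

    # Minimal sketch: jitter as the deviation of measured threshold-crossing
    # times from their ideal positions on a bit-period grid.
    # All numbers below are assumed example values.

    bit_period = 100e-12  # 100 ps unit interval (hypothetical)

    # Hypothetical measured crossing times of successive edges, in seconds.
    measured_crossings = [0.0e-12, 101.5e-12, 199.0e-12, 302.2e-12]

    # Ideal crossings fall exactly on integer multiples of the bit period.
    ideal_crossings = [n * bit_period for n in range(len(measured_crossings))]

    # Jitter of each edge: measured crossing time minus ideal crossing time.
    jitter = [m - i for m, i in zip(measured_crossings, ideal_crossings)]

    peak_to_peak = max(jitter) - min(jitter)
    print("per-edge jitter (s):", jitter)
    print("peak-to-peak jitter (s):", peak_to_peak)

The per-edge deviations are the timing uncertainty described above; their spread is what eats into the portion of the bit period available for sampling.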