Digital signals are made up of stepped waveforms whose transitions
are spaced at a fixed interval called the "timing sequence."
Receivers must detect and track this timing in order to read the
signal accurately. Transmission lines can introduce shifts into the
waveform, which should intrinsically maintain this fixed interval,
causing signal distortion. Such shifts along the timing axis of the
transmitted and received waveforms are referred to as "jitter."
Increasing amounts of jitter distortion can lead to bit errors,
which may result in picture deterioration or horizontal noise
interference.
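As a rough illustration of how timing jitter turns into bit errors, the sketch below uses a hypothetical model (not a procedure from this text): each bit is sampled at its nominal centre plus a Gaussian timing offset, measured in unit intervals (UI), and whenever the offset exceeds half a UI the receiver reads the neighbouring bit instead. The function name and parameters are invented for the example.

```python
import random

def jitter_bit_error_rate(bits, jitter_ui, trials=1000, seed=0):
    """Estimate the bit-error rate for a given RMS timing jitter.

    Hypothetical model: the sampling instant for each bit is displaced
    by a Gaussian offset (standard deviation `jitter_ui`, in unit
    intervals); an offset beyond +/-0.5 UI lands on the adjacent bit.
    """
    rng = random.Random(seed)
    errors = 0
    total = 0
    for _ in range(trials):
        for i, bit in enumerate(bits):
            offset = rng.gauss(0.0, jitter_ui)
            # Decide which bit cell the displaced sampling instant hits.
            j = i + (1 if offset > 0.5 else -1 if offset < -0.5 else 0)
            read = bits[j] if 0 <= j < len(bits) else bit
            errors += (read != bit)
            total += 1
    return errors / total

rng = random.Random(1)
bits = [rng.randrange(2) for _ in range(64)]
low = jitter_bit_error_rate(bits, 0.05)   # small jitter: errors vanish
high = jitter_bit_error_rate(bits, 0.5)   # heavy jitter: frequent errors
```

In this toy model the error rate stays essentially zero until the jitter becomes comparable to the unit interval, then rises sharply, which matches the qualitative point above: bit errors appear only once jitter distortion grows large enough.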
One tool regularly used to measure jitter is the oscilloscope.
Owing to its distinctive shape, the measured waveform is called an
"eye pattern," and the jitter is expressed as the width of the zone
where the rising and falling edges of the overlaid waveforms cross
one another. The permissible jitter value is specified in the ARIB
(SMPTE) standard shown in the accompanying table.
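The width of that crossing zone can also be read numerically from a sampled waveform. The sketch below is an illustrative calculation, not a method defined by the ARIB/SMPTE standard; the function names and test waveforms are invented for the example. It interpolates the instants where the signal crosses the decision threshold and reports the peak-to-peak spread of their deviation from the ideal timing grid:

```python
import math

def crossing_times(samples, dt, threshold=0.0):
    """Linearly interpolate the instants where the waveform crosses the
    decision threshold (both rising and falling edges)."""
    times = []
    for i in range(1, len(samples)):
        a, b = samples[i - 1], samples[i]
        if a == threshold:
            times.append((i - 1) * dt)
        elif (a - threshold) * (b - threshold) < 0:
            frac = (threshold - a) / (b - a)
            times.append((i - 1 + frac) * dt)
    return times

def peak_to_peak_jitter(samples, dt, ui):
    """Peak-to-peak jitter: spread of the crossings' deviations from
    their nearest ideal instants (integer multiples of the unit
    interval), i.e. the width of the crossing zone on an eye pattern."""
    devs = [t - round(t / ui) * ui for t in crossing_times(samples, dt)]
    return max(devs) - min(devs) if devs else 0.0

# Invented test waveforms: a clean tone crossing zero exactly on the
# timing grid, and a phase-modulated one whose crossings wander.
ui, dt, n = 1.0, 0.01, 2000
clean = [math.sin(math.pi * i * dt / ui) for i in range(n)]
wobbly = [math.sin(math.pi * i * dt / ui
                   + 0.2 * math.sin(2 * math.pi * i * dt / (20 * ui)))
          for i in range(n)]
```

On the clean waveform the measured jitter is essentially zero, while the phase-modulated one shows a clearly widened crossing zone, which is exactly what the oscilloscope's eye pattern makes visible at a glance.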