When A/D conversion is performed on image signals and jitter-induced noise
is reduced by adjusting the phase of the sampling clocks, this noise can
be reliably reduced even when the input waveform has considerable
distortion, such as a triangular wave. Input
analog image signals are converted into digital image data using sampling
clocks from a PLL circuit by A/D conversion means. Next, image data that
has been delayed by a one-clock delay circuit is subtracted from the
digital data by a subtracter. The maximum value of the subtracted output
over one screen is then determined, and 5 is subtracted from it to provide a
threshold value. A comparator compares the subtracted output and the
threshold value, and outputs a signal when the subtracted output is
greater than the threshold value. A counter then supplies the count of
these signals to a CPU, and the CPU controls the phase of the sampling
clocks using a switch. The CPU can thus also control the frequency
characteristics of the input image signals by operating a switch in a
frequency correction circuit in accordance with the count value.
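The delay, subtraction, thresholding, and counting chain described above can be sketched in software as follows. This is a minimal illustrative simulation, not the patented hardware: the function name, the use of NumPy arrays for one screen of digitized samples, and the absolute-value difference are all assumptions; only the "maximum over one screen minus 5" threshold and the exceedance count come from the description.

```python
import numpy as np

def jitter_noise_count(samples: np.ndarray, margin: int = 5) -> int:
    """Count clock-to-clock differences that exceed a screen-derived
    threshold, mimicking the delay/subtract/compare/count chain.
    (Function name and margin default of 5 follow the description;
    all other details are illustrative assumptions.)"""
    # One-clock delay circuit + subtracter: difference between
    # adjacent digitized samples of one screen.
    diff = np.abs(np.diff(samples.astype(int)))
    # Threshold = maximum subtracted output over the screen, minus 5.
    threshold = diff.max() - margin
    # Comparator + counter: number of subtracted outputs that
    # exceed the threshold; this count is what the CPU would use
    # to adjust the sampling-clock phase.
    return int(np.sum(diff > threshold))
```

With a clean edge the count stays small, while jitter spreads near-maximal differences across many samples, raising the count the CPU sees.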