[argyllcms] Re: Measuring display input lag with colorimeters

  • From: János, Tóth F. <janos666@xxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Mon, 26 Nov 2012 02:57:54 +0100

I want to measure the time between [the video card sending a picture to the
display] and [the display showing that picture], so I don't see any
limitation here other than the sensor's capabilities.

With a camera, you can't tell whether you took the picture right before,
right after, or in the middle of a refresh, and a refresh period is usually
about 16.7 ms (at 60 Hz).

With software running on the source device and a light sensor attached to
it, you know when the source device sent the picture and you can see when it
appeared on the screen.
At least in theory, if the sensor is fast and accurate enough. Is it...?
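
Something like this is what I have in mind (just a rough Python sketch;
flip_to_white() and read_luminance() are hypothetical placeholders for the
display flip and the instrument read, not any real ArgyllCMS API):

    import time

    def measure_lag(flip_to_white, read_luminance, threshold=0.5, timeout=1.0):
        # Timestamp the moment the new frame is handed to the video card,
        # then poll the sensor until the patch brightens past `threshold`.
        t_sent = time.perf_counter()
        flip_to_white()                        # swap a black frame for a white one
        while time.perf_counter() - t_sent < timeout:
            if read_luminance() > threshold:   # display actually changed
                return time.perf_counter() - t_sent
        return None                            # no change seen within timeout

Of course the result would really be display lag plus the sensor's own
integration/readout time (about 10 msec for the i1d3, as you say), quantized
by however fast the readings can be polled.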


2012/11/26 Graeme Gill <graeme@xxxxxxxxxxxxx>

> János, Tóth F. wrote:
> > Is this information (delay between digital and analog screen refreshes)
> > included in the verbose output?
>
> Hi,
>         yes it is. But it may vary if you run it several times,
> and it is only reading at about 10msec resolution.
>
> > Taking pictures of the displays with running clocks is also far from
> > accurate because the sampling rate is limited to the screen refresh rate,
> > so you don't even have a chance to spot a delay lower than the screen
> > refresh period (and you always get a value rounded to a whole multiple of
> > the refresh period...).
>
> That's going to be the case here too - the i1d3 will only notice
> the update when the image changes.
>
> Graeme Gill.
>
