[visionegg] Re: swap_buffers() returns immediately for 2D Textures in OpenGL 2.0

  • From: "Sol Simpson" <sol@xxxxxxxxxxxxxxx>
  • To: <visionegg@xxxxxxxxxxxxx>
  • Date: Mon, 28 May 2007 15:38:40 -0400

This is similar to what we have always found as well, though that may be
because we have always used OpenGL 2.0 (which is an assumption I am
making here). I posted our findings on Jan 29th, 2006 under the subject
"OpenGL latency", because they explained why some people had reported a
mysterious extra one frame / retrace delay when testing with a light key.

We have tried Windows, OS X, and Linux. On Windows we have tried both nVidia
and ATI cards. All showed the same pattern.

Thanks,

Sol 

-----Original Message-----
From: visionegg-bounce@xxxxxxxxxxxxx [mailto:visionegg-bounce@xxxxxxxxxxxxx]
On Behalf Of Martin Spacek
Sent: May 28, 2007 3:27 PM
To: visionegg@xxxxxxxxxxxxx
Subject: [visionegg] swap_buffers() returns immediately for 2D Textures in
OpenGL 2.0

Hello,

With VISIONEGG_SYNC_SWAP and VISIONEGG_DOUBLE_BUFFER both turned on and
apparently working, the swap_buffers() call should wait until the next 
vertical sync before returning. This was always the case on older 
hardware (ATI Radeon 9800) with drivers that reported up to OpenGL 1.5 
capability.
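
These settings can also be made programmatically before the screen is
opened, as an alternative to the VisionEgg.cfg file. This is from
memory, so treat the exact attribute names as a sketch:

    import VisionEgg
    VisionEgg.config.VISIONEGG_SYNC_SWAP = 1
    VisionEgg.config.VISIONEGG_DOUBLE_BUFFER = 1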

Now we have new hardware (Nvidia 7600 GS PCI-Express) with drivers that
report OpenGL 2.0 capability. When dealing with a 2D texture with these
drivers, the swap_buffers() call returns immediately, and the *next* gl
call of any kind that follows waits until the next vsync before
returning. It doesn't seem to matter what this next gl call is
(gl.glFlush(), or gl.glGetString(), or anything). So as a workaround,
immediately following a swap_buffers() call for a viewport with a 2D
texture in it, I've simply added a gl.glFlush() call.
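
In case it helps anyone reproduce this, here's a minimal sketch of the
workaround (API names from memory, and the stimulus setup is just a
placeholder, not our actual experiment code):

    import OpenGL.GL as gl
    import VisionEgg.Core
    from VisionEgg.Textures import TextureStimulus

    screen = VisionEgg.Core.get_default_screen()
    stimulus = TextureStimulus()  # any 2D texture shows the behaviour
    viewport = VisionEgg.Core.Viewport(screen=screen, stimuli=[stimulus])

    for frame in range(60):
        screen.clear()
        viewport.draw()
        VisionEgg.Core.swap_buffers()  # returns immediately on GL 2.0 drivers
        gl.glFlush()                   # this is the call that blocks until vsync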

This doesn't seem to happen for Target or Gratings stimuli, even though
Gratings is a Texture, albeit only a 1D texture. For these stimuli,
swap_buffers() waits for the vsync before returning, as it should.

Does this sound like a driver bug, or is this part of the OpenGL 2.0
spec? I've tried the oldest and the newest (WinXP ForceWare 94.24)
nvidia drivers for our card, with no change. Completely uninstalling
them reverts visionegg to using Microsoft's OpenGL 1.1.0 GDI generic 
drivers, which don't exhibit the problem, but are otherwise obviously 
undesirable. I've tried PyOpenGL 2.0.x, 3.0.x, and Pygame 1.7 and 1.8, 
with no change. I'm using the latest svn visionegg. Also, whether I use 
texture data from a numpy, Numeric, or numarray array makes no
difference.

Is it possible to somehow make visionegg/PyOpenGL limit itself to OpenGL
1.5?
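
As a sanity check, the version string the drivers report can be queried
once a GL context exists. Something like this (just a rough sketch):

    import pygame
    import OpenGL.GL as gl

    pygame.init()
    pygame.display.set_mode((640, 480), pygame.OPENGL | pygame.DOUBLEBUF)
    print gl.glGetString(gl.GL_VERSION)   # '2.0.x' with the new drivers
    print gl.glGetString(gl.GL_RENDERER)

That at least confirms which driver is actually in use.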

I've attached a test script. It would help if anyone with OpenGL 2.0 or 
greater drivers, and/or anyone with recent nvidia drivers, could try 
running it and report the results.
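
If the attachment doesn't come through, here's a rough illustration of
the kind of timing the script does (this is not the attached script,
just a sketch):

    import time
    import pygame
    import OpenGL.GL as gl

    pygame.init()
    pygame.display.set_mode((640, 480), pygame.OPENGL | pygame.DOUBLEBUF)

    for i in range(60):
        gl.glClear(gl.GL_COLOR_BUFFER_BIT)
        # ... draw a quad with a 2D texture here ...
        t0 = time.clock()
        pygame.display.flip()  # what swap_buffers() ultimately calls
        t1 = time.clock()
        gl.glFlush()           # on the GL 2.0 drivers, this is what blocks
        t2 = time.clock()
        print 'swap %.2f ms, flush %.2f ms' % ((t1 - t0) * 1e3, (t2 - t1) * 1e3)

With sync-to-vblank working normally, the swap time should be close to
the frame period; on the new drivers the swap time is near zero and the
flush time is close to the frame period instead.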

Cheers,

Martin

======================================
The Vision Egg mailing list
Archives: http://www.freelists.org/archives/visionegg
Website: http://www.visionegg.org/mailinglist.html
