[visionegg] Re: WARNINGs in the log file

  • From: Andrew Straw <andrew.straw@xxxxxxxxxxxxxxx>
  • To: visionegg@xxxxxxxxxxxxx
  • Date: Thu, 28 Nov 2002 09:03:23 +1030

Hi Joyca,

It appears that whatever is happening in your "go loop" is taking a 
significant amount of time.  Try turning "synchronize swap buffers" off 
in the Vision Egg startup GUI, and also make sure it's not enabled by 
default in your video drivers (in the OpenGL properties section of 
nVidia's control panel). This will let you benchmark the base 
performance of your program.
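As a generic illustration (not Vision Egg-specific code), this is the kind of measurement that becomes meaningful once buffer-swap synchronization is off; the `draw` callable here is a hypothetical stand-in for whatever your per-frame work is:

```python
import time

def benchmark_loop(draw, n_frames=200):
    """Time a bare draw loop to estimate the base per-frame cost.

    With "synchronize swap buffers" off, the loop is no longer
    throttled to the monitor refresh, so the measured time reflects
    your program's own per-frame cost.
    """
    start = time.perf_counter()
    for _ in range(n_frames):
        draw()
    elapsed = time.perf_counter() - start
    return elapsed / n_frames  # mean seconds per frame

# Trivial stand-in for the per-frame work:
mean_frame = benchmark_loop(lambda: sum(range(1000)))
# At a 100 Hz refresh you have 10 ms per frame; if the mean frame
# time exceeds 0.010 s, frames will inevitably be dropped.
print("mean seconds per frame:", mean_frame)
```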

Without seeing your code, I can only guess at the causes of your 
program's slow performance.

If your code is the second version of what I supplied, I suspect the 
lack of speed is due to the video card not having enough memory to 
keep all textures resident simultaneously.  You could try disabling 
mipmapping (pass "mipmaps_enabled=0" as a parameter to your stimulus 
class constructor), although this would only save the roughly 33% 
that the mipmaps add on top of each base texture. Older versions of 
the Vision Egg had an option to use texture compression, but I was 
concerned about its effects on image statistics in vision research 
experiments. Maintaining it therefore wasn't a high priority, and I 
got rid of it.
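The ~33% figure follows from the geometry of a mipmap chain: each level has half the width and height of the one before, so a quarter of the pixels, and the levels sum to a geometric series approaching one third of the base texture. A quick check of that arithmetic:

```python
# Memory cost of a full mipmap chain relative to the base texture.
# Each level has a quarter of the previous level's pixels, so the
# chain sums to 1/4 + 1/16 + 1/64 + ... -> 1/3 of the base size.

def mipmap_overhead(width, height, bytes_per_texel=4):
    """Return (base_bytes, extra_bytes_for_mipmaps)."""
    base = width * height * bytes_per_texel
    extra = 0
    w, h = width, height
    while w > 1 or h > 1:
        w, h = max(w // 2, 1), max(h // 2, 1)
        extra += w * h * bytes_per_texel
    return base, extra

base, extra = mipmap_overhead(512, 512)
print(extra / base)  # ~0.333: mipmaps add about a third on top
```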

A solution which I will implement in the Vision Egg is a "pseudo-blit" 
whereby any image or numeric array in system RAM can be copied into an 
(already resident) texture.  In your case, only one texture would need 
to be resident on the video card at a time, thus greatly reducing the 
video memory requirements. Also, this ability is very useful in a 
number of other situations.  I'll make an announcement on the mailing 
list when that code is incorporated.
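To sketch the idea (this is an illustrative analogy, not the eventual Vision Egg API): the point of a pseudo-blit is that the texture's storage is allocated once, and each new frame is copied into that existing storage. In OpenGL terms this would typically mean updating a resident texture with glTexSubImage2D rather than re-creating it with glTexImage2D. A NumPy model of the same pattern:

```python
import numpy as np

# Hypothetical frame size for illustration.
HEIGHT, WIDTH = 64, 64

# One resident "texture": allocated once, reused for every frame.
resident = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)

def pseudo_blit(frame):
    """Copy a system-RAM frame into the already allocated buffer."""
    np.copyto(resident, frame)  # in place: no new allocation

frames = [np.full((HEIGHT, WIDTH, 3), v, dtype=np.uint8) for v in (10, 20)]
for frame in frames:
    before = id(resident)
    pseudo_blit(frame)
    assert id(resident) == before  # same storage, new contents
```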

In the meantime, would it be possible to get a video card with more 
video memory?

On Thursday, November 28, 2002, at 02:39  AM, Joyca Lacroix wrote:

> Dear Andrew,
>
> Thank you for the code you sent me last week! I have a question
> concerning the use of this code (see below, second code part). When I
> run the program with this code and set duration_per_image to 0.1 sec,
> I can see that the duration of display differs per image. Also,
> several WARNINGs occur in the Vision Egg log file (see here):
>
> My video card is an nVidia GeForce MX 100/200 and the refresh rate is
> set at 100 Hz. I work with Windows 2000.
>
> Could you please explain why I still get these warnings?

The Vision Egg mailing list
Archives: http://www.freelists.org/archives/visionegg
Website: http://www.visionegg.org/mailinglist.html
