[visionegg] Re: making a stimulus flicker at a frequency

  • From: Andrew Straw <andrew.straw@xxxxxxxxxxxxxxx>
  • To: visionegg@xxxxxxxxxxxxx
  • Date: Tue, 12 Aug 2003 12:08:29 +0200

Hi Andrew,

I've just moved from a 667 MHz machine down to a 500 MHz one, and the controller I wrote to flicker the
stimulus is starting to stutter. It flickers just fine, but then does a little noticeable hiccup about once
a second (I have a feeling that "once a second" should tell me something, but it's eluding me).

I wrote a controller like so:

def on_or_off(t):
    return int((t - int(t)) * 16.0) % 2

which returns either a 1 or 0. [*footnote]

(The controller proper being something like:

flicker_controller = FunctionController(during_go_func=on_or_off)

)

I'm feeling this is a bit of a kludge, but I haven't thought of any alternatives.

I'm trying to figure out if there's a more accurate or efficient way of flickering a stimulus at a particular
frequency than this (a way which still uses the Presentation class). Any thoughts and/or pointers in the
right direction?



(* footnote: If you want to know, this strips off the whole seconds from the time (t - int(t)) (i.e. 2.223242 s ->
0.223242) and then multiplies the resulting fraction by 16 to give a number between 0 and 15 (after the int).
The mod tells us whether that number is even or odd, which should return evenly temporally spaced
0's and 1's -- eight 1's and eight 0's in a second. At least, that's the idea.)
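As a sanity check of the footnote's arithmetic (this is just an illustration, not part of the original post), sampling on_or_off at the sixteen segment boundaries of one second does produce alternating, evenly spaced 0's and 1's:

```python
def on_or_off(t):
    # Mark's controller: fractional second scaled to 0-15, parity gives 0 or 1
    return int((t - int(t)) * 16.0) % 2

# sample at the 16 segment boundaries within one second
values = [on_or_off(i / 16.0) for i in range(16)]
print(values)  # [0, 1, 0, 1, ...] -- eight 0's and eight 1's
```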

Hi Mark,

A few points:

1) Your suspicion about your flicker function seems worth investigating. I think there may be wrap-around effects from your subtraction. Perhaps try something simpler:

flicker_rate = 16  # cycles per second
def flicker(t):
    return int(t * flicker_rate * 2.0) % 2
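For what it's worth (an editorial aside, not from the original thread): with flicker_rate set to 8, the suggested function agrees with Mark's on_or_off at every frame time, since Mark's version produces eight full on/off cycles per second. The comparison below assumes a nominal 60 Hz frame rate, which is an illustration only:

```python
def on_or_off(t):
    # Mark's original: fractional second scaled to 0-15, parity gives 0 or 1
    return int((t - int(t)) * 16.0) % 2

flicker_rate = 8  # full on/off cycles per second -- chosen here to match
                  # Mark's "eight 1's and eight 0's in a second"; use a
                  # different value for a different flicker frequency
def flicker(t):
    return int(t * flicker_rate * 2.0) % 2

# check agreement over five seconds of hypothetical 60 Hz frame times
ts = [i / 60.0 for i in range(300)]
print(all(on_or_off(t) == flicker(t) for t in ts))  # True
```

The simpler version avoids the t - int(t) subtraction entirely: since int(t) * 16 is always even, dropping it never changes the parity, and there is one less place for floating-point trouble near segment boundaries.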

2) Synchronizing buffer swaps with vsync does not work on OS X in non-fullscreen mode (an OS X issue, with an attempted workaround in SDL). Maybe you're getting tearing?

3) Have you tried raising the process priority, particularly by using the "realtime task method", setting the realtime period denominator to exactly your frame rate, and checking "do not preempt"? This ensures your script gets the CPU once per frame.

Note that only suggestion #3 really has anything to do with CPU speed. What does the frame count histogram indicate? Are you skipping frames?
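To illustrate the kind of check the frame count histogram performs (this sketch is not Vision Egg's own API -- the timestamps and threshold below are hypothetical), one can bucket inter-frame intervals and flag any that are much longer than the nominal frame period:

```python
# hypothetical buffer-swap timestamps at a nominal 60 Hz frame rate
frame_times = [i / 60.0 for i in range(120)]
frame_times[59] += 1.0 / 60.0  # simulate one skipped frame

# inter-frame intervals; a skip shows up as an interval near 2x nominal
intervals = [b - a for a, b in zip(frame_times, frame_times[1:])]
nominal = 1.0 / 60.0
skips = sum(1 for dt in intervals if dt > 1.5 * nominal)
print("skipped frames:", skips)  # skipped frames: 1
```

A hiccup once a second, as described above, would show up as roughly one such long interval per second of presentation.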


The Vision Egg mailing list
Archives: //www.freelists.org/archives/visionegg
Website: http://www.visionegg.org/mailinglist.html
