I believe there is a significant performance issue with the way aravissrc
handles Gigabit Ethernet packets when more than one camera is streaming
images simultaneously. I think this causes the cameras/network to operate
well below their maximum capability. There have been a few posts in the
past that mentioned this, and it was always dismissed as a bandwidth
issue. But I don't believe that's the case.
Our team has two applications. The first uses 6 USB BlackFly cameras from
Point Grey for a spherical camera system. The cameras are configured as
follows: 1920x1200, Grey8, at 10 Hz. We are able to use some fairly
elaborate pipelines (encoding H.264, using a compositor, outputting the
compositor image to the network) with hardly any issues. Video is smooth
and displays well when viewed locally on the computer.
As a comparison, the second application uses two BlackFly Gigabit cameras
from Point Grey. The cameras are configured as follows: 1920x1200, Grey8,
at 5 Hz. This application operates well within the bandwidth limits of a
gigabit network/camera system, yet I'm seeing ~70+ packet resend errors
per second spread between the two cameras. Most packet resends request
more than one Ethernet packet, which I think is causing significant
performance issues in how the video is encoded and displayed locally. I
used Wireshark to capture the traffic and confirm the packet resends.
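To back up the claim that this is not a bandwidth problem, here is a quick back-of-the-envelope calculation of the raw image payload (my own arithmetic, ignoring the few percent of GVSP/UDP/IP header overhead):

```python
def payload_mbit_per_s(width, height, bytes_per_pixel, fps):
    """Raw image payload rate in Mbit/s (protocol overhead not included)."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# Two GigE BlackFly cameras: 1920x1200, Grey8 (1 byte/pixel), 5 Hz
per_camera = payload_mbit_per_s(1920, 1200, 1, 5)   # ~92 Mbit/s
total = 2 * per_camera                              # ~184 Mbit/s

print(f"{per_camera:.1f} Mbit/s per camera, {total:.1f} Mbit/s total")
```

Even with both cameras streaming, that is under 20% of a 1000 Mbit/s link, so the resends should not be caused by a saturated network.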
In both scenarios, the same processor is being used. Also, in both
applications the cameras are being simultaneously externally triggered.
I also ran some other scenarios where I reduced the image size (600x480)
and kept the same frame rate. This scenario worked great, but as I recall,
only a few megabytes of network bandwidth were being used.
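For reference, the reduced-resolution scenario works out to roughly the bandwidth I remember seeing (again my own raw-payload arithmetic, not a measurement):

```python
def payload_mbyte_per_s(width, height, bytes_per_pixel, fps):
    """Raw image payload rate in MB/s (protocol overhead not included)."""
    return width * height * bytes_per_pixel * fps / 1e6

# Both cameras at the reduced size: 600x480, Grey8, 5 Hz
reduced_total = 2 * payload_mbyte_per_s(600, 480, 1, 5)

print(f"{reduced_total:.2f} MB/s total")  # a few MB/s, matching my recollection
```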
If you have any suggestions, I would appreciate them.