[aravis] Re: Video Stream

  • From: Charl Wentzel <charl.wentzel@xxxxxxxxxxxxxx>
  • To: aravis@xxxxxxxxxxxxx
  • Date: Thu, 18 Aug 2016 22:50:58 +0200

On 16/08/2016 23:14, Emmanuel Pacaud wrote:

On Tue, 16 Aug 2016 at 22:28, Charl Wentzel <charl.wentzel@xxxxxxxxxxxxxx> wrote:
I'm working on a project in which I'm trying to achieve this:

[camera] --(video stream)--> [ffmpeg] --(modified video stream)--> [flash server] --(rtmp)--> [web browser]

However, I'm still not where I want to be. Can anybody tell me how to get from Aravis to a video stream I can process with ffmpeg? Maybe using gstreamer somehow?

Two solutions:

- either you write a piece of software that gets images from Aravis and feeds them to ffmpeg. See tests/arvcameratest.c, which just gets images but does nothing with them. (A sketch of this approach follows the pipeline example below.)
- or you use gstreamer with the included gstreamer plugin. For example, this pipeline displays the camera video stream on screen:

gst-launch-1.0 aravissrc ! videoconvert ! xvimagesink
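
For reference, the first option can be quite small. Here is a minimal sketch, assuming the Aravis 0.4-era C API that was current at the time of this thread, with error handling trimmed. It writes the raw pixel data of each frame to stdout, so the output can be piped straight into ffmpeg:

/* Grab frames with Aravis and write raw frame data to stdout.
   Minimal sketch, error handling trimmed for brevity. */
#include <arv.h>
#include <stdio.h>

int main (void)
{
    ArvCamera *camera = arv_camera_new (NULL);   /* first camera found */
    if (camera == NULL)
        return 1;

    guint payload = arv_camera_get_payload (camera);
    ArvStream *stream = arv_camera_create_stream (camera, NULL, NULL);

    /* Queue a few buffers for the stream thread to fill. */
    for (int i = 0; i < 8; i++)
        arv_stream_push_buffer (stream, arv_buffer_new (payload, NULL));

    arv_camera_start_acquisition (camera);

    for (;;) {
        ArvBuffer *buffer = arv_stream_timeout_pop_buffer (stream, 2000000);
        if (buffer == NULL)
            break;                               /* timeout, give up */
        if (arv_buffer_get_status (buffer) == ARV_BUFFER_STATUS_SUCCESS) {
            size_t size;
            const void *data = arv_buffer_get_data (buffer, &size);
            fwrite (data, 1, size, stdout);      /* raw frame to the pipe */
        }
        arv_stream_push_buffer (stream, buffer); /* recycle the buffer */
    }

    arv_camera_stop_acquisition (camera);
    g_object_unref (stream);
    g_object_unref (camera);
    return 0;
}

You would then run something like "./arvgrab | ffmpeg -f rawvideo -pix_fmt gray8 -s 1280x960 -i - out.mp4", where the program name, pixel format and frame size are just placeholders that must match the camera's actual settings.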

Thanks for the advice. I managed to find a workable solution. I had tested piping on the command line before, e.g. "myfeedprogram | ffmpeg -i - [lots of parameters]", and that worked. The problem was that I needed up to three simultaneous streams: live video, video recording and snapshots. That couldn't work, because I couldn't have three instances of the same software trying to access the one camera.

The solution was simple: create a pipe, fork the process, connect the output end of the pipe to stdin of the child process, replace the child process with ffmpeg (exec()), then feed raw frames into the pipe. This can be done as many times as necessary, so I have one program capturing frames from the camera and "piping" them to three separate instances of ffmpeg.
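
For anyone finding this in the archives, a minimal sketch of that pipe/fork/exec pattern looks like this (the ffmpeg arguments shown are placeholders; the real ones depend on your pixel format, frame size and outputs):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/types.h>

/* Spawn one ffmpeg child that reads raw frames from its stdin.
   Returns the write end of the pipe. */
static int spawn_ffmpeg (char *const argv[])
{
    int fd[2];
    pid_t pid;

    if (pipe (fd) < 0) { perror ("pipe"); exit (1); }

    pid = fork ();
    if (pid < 0) { perror ("fork"); exit (1); }

    if (pid == 0) {                     /* child */
        dup2 (fd[0], STDIN_FILENO);     /* read end of pipe becomes stdin */
        close (fd[0]);
        close (fd[1]);
        execvp ("ffmpeg", argv);
        perror ("execvp");              /* only reached if exec fails */
        _exit (1);
    }

    close (fd[0]);                      /* parent keeps only the write end */
    return fd[1];
}

int main (void)
{
    /* Placeholder arguments: adjust -pix_fmt/-s to the camera format. */
    char *live_argv[] = { "ffmpeg", "-f", "rawvideo", "-pix_fmt", "gray8",
                          "-s", "1280x960", "-i", "-",
                          "-f", "null", "/dev/null", NULL };
    int live_fd = spawn_ffmpeg (live_argv);

    /* Call spawn_ffmpeg() again for the recording and snapshot
       instances, then for every captured frame do
           write (fd, frame_data, frame_size);
       on each of the three pipe file descriptors. */
    close (live_fd);
    return 0;
}

One caveat worth mentioning: if an ffmpeg child exits, the next write() to its pipe raises SIGPIPE, so it's worth ignoring or handling that signal in the capture process.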

Thanks again.
Charl
