[haiku-development] Re: MediaPlayer and the Media Kit

  • From: Stephan Aßmus <superstippi@xxxxxx>
  • To: haiku-development@xxxxxxxxxxxxx
  • Date: Tue, 27 Nov 2012 11:31:51 +0100

Hi Julian,

On 26.11.2012 23:57, Julian Harnath wrote:
Since I'd like to do some work on MediaPlayer, I've been reading into
its source code and into the depths of the Media Kit in general. While
doing this, a few questions about MediaPlayer in particular and the
Media Kit in general arose.

About Media Player:

* Why does it come with its own classes for audio mixdown/resampling/format
conversion instead of leaving these things to the system audio mixer?

The simple reason is that the framework stems from Clockwerk (and originally eXposer). In those apps, there would be a single audio media node linked to the system mixer, and within the application, a multitude of different files had to be mixed to the format of that node connection.

If you look a bit closer, you should see that it actually doesn't use its own remixing capabilities and connects to the system mixer with the native format of the file it is playing. I hope that's correct; I have a faint memory of having implemented it that way.
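To illustrate what that means in practice, here is a minimal sketch of handing the system mixer audio in the file's native format. It uses the public BSoundPlayer convenience class rather than MediaPlayer's actual playback classes, and the format values and the FillBuffer callback are made up for the example:

#include <Application.h>
#include <MediaDefs.h>
#include <OS.h>
#include <SoundPlayer.h>
#include <string.h>

// Callback invoked whenever the connection to the system mixer needs
// more audio; a real player would decode the next chunk of the file here.
static void
FillBuffer(void* cookie, void* buffer, size_t size,
    const media_raw_audio_format& format)
{
    memset(buffer, 0, size);    // silence, for the sake of the example
}

int
main()
{
    BApplication app("application/x-vnd.example-native-format");

    // Describe the file's native format instead of remixing inside the
    // application; the system mixer performs any needed conversion.
    media_raw_audio_format format = media_raw_audio_format::wildcard;
    format.frame_rate = 44100.0f;
    format.channel_count = 2;
    format.format = media_raw_audio_format::B_AUDIO_FLOAT;
    format.byte_order = B_MEDIA_LITTLE_ENDIAN;
    format.buffer_size = 2048 * sizeof(float) * 2;

    BSoundPlayer player(&format, "example player", FillBuffer);
    if (player.InitCheck() != B_OK)
        return 1;

    player.Start();
    player.SetHasData(true);
    snooze(2000000);    // "play" for two seconds
    player.Stop();
    return 0;
}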


* Even if these things should be done internally for some reason,
wouldn't it make more sense to implement them as Media Kit filter nodes
instead of having its own interfaces for defining a media graph
(AudioReader etc.)?

I don't know if it's still used in some fallback case, but if not, the code could perhaps even be removed. I would just keep it, though; its design isn't bad, maybe even better than what's used in the system mixer, and the latter could be exchanged for it.
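For readers who haven't looked at that code: the interfaces in question form a small pull-based graph. The following is only an illustrative sketch of that style of design, not MediaPlayer's actual AudioReader class; the names and signatures here are invented for the example:

#include <MediaDefs.h>
#include <SupportDefs.h>

// Hypothetical pull-based reader interface, loosely in the spirit of
// MediaPlayer's internal audio classes; not the real API.
class ExampleAudioReader {
public:
    ExampleAudioReader(const media_format& format) : fFormat(format) {}
    virtual ~ExampleAudioReader() {}

    // Pull `frames` frames starting at `position` into `buffer`.
    virtual status_t Read(void* buffer, int64 position, int64 frames) = 0;

    const media_format& Format() const { return fFormat; }

protected:
    media_format fFormat;
};

// A filter wraps another reader and pulls from it, so readers can be
// chained into a processing graph without any Media Kit node overhead.
class ExampleGainFilter : public ExampleAudioReader {
public:
    ExampleGainFilter(ExampleAudioReader* source, float gain)
        : ExampleAudioReader(source->Format()), fSource(source), fGain(gain) {}

    virtual status_t Read(void* buffer, int64 position, int64 frames)
    {
        status_t ret = fSource->Read(buffer, position, frames);
        if (ret != B_OK)
            return ret;
        // Assumes B_AUDIO_FLOAT samples for brevity.
        float* samples = (float*)buffer;
        int64 count = frames * fFormat.u.raw_audio.channel_count;
        for (int64 i = 0; i < count; i++)
            samples[i] *= fGain;
        return B_OK;
    }

private:
    ExampleAudioReader* fSource;
    float               fGain;
};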

* In the same way, wouldn't it make sense to convert the subtitle
renderer into a video filter node, then other applications could benefit
from it, too?

I am wondering about the performance hit this would entail. In particular, I remember having optimized this heavily, so that slower computers still manage to render subtitles even in full-screen. Compositing the entire bitmap one more time and transferring it to another node would probably make the process too slow, or at the very least introduce more latency.
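To put a rough number on that extra copy, a quick back-of-envelope calculation (the frame size and rate are just illustrative assumptions, not measurements):

#include <stdio.h>

int
main()
{
    // Cost of one additional full-frame composite/copy per frame,
    // assuming 1920x1080 32-bit frames at 30 fps; illustrative only.
    const double bytesPerFrame = 1920.0 * 1080 * 4;
    const double fps = 30.0;
    const double mib = 1024.0 * 1024.0;

    printf("%.1f MiB per frame, about %.0f MiB/s of extra memory traffic\n",
        bytesPerFrame / mib, bytesPerFrame * fps / mib);
    return 0;
}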

About the Media Kit:

* There seem to be two kinds of Media Kit add-ons: the .media_addons and
the plug-ins. Why the separation, and did it already exist in BeOS?
(I'm asking because I can't find anything about the plug-ins in the
BeBook, and I currently don't have a BeOS install to check.) It seems
to me the plug-ins simply do the same thing that would be done by an
add-on node implementing BFileInterface, which looks like a not-so-nice
redundancy.

Yes, in BeOS the separation was the same, but the plug-in API for codecs was not documented. The design and separation can stay as they are, if I am not missing anything; just the naming may be confusing. In particular, "add-ons" could depend on the low-level codec "plug-ins". For example, a Decoder add-on which implements a Filter node could offer decoding of all formats available via the low-level codec plug-ins.

* On the topic of BFileInterface: there are currently no nodes shipping
with normal Haiku builds at all that implement it, so BMediaRoster::
SniffRef() can never find anything. The BeBook notes in the Media Kit
introduction that BMediaRoster::SniffRef() can be used to get the nodes
which would be used by the more convenient BMediaFile/BMediaTrack
classes. In Haiku, these classes use the plug-ins instead of nodes,
though (this discrepancy leads me to suspect that these plug-ins didn't
exist in BeOS)...

A file reading node should be implemented. But note that the node framework is simply more computationally expensive than using the plug-ins directly, like MediaPlayer does. And that overhead really does count.
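For comparison, the direct plug-in path that BMediaFile/BMediaTrack provide, and that MediaPlayer builds on, looks roughly like the sketch below; the file path is a placeholder and error handling is reduced to the bare minimum:

#include <Entry.h>
#include <MediaDefs.h>
#include <MediaFile.h>
#include <MediaTrack.h>
#include <stdlib.h>
#include <string.h>

int
main()
{
    // Placeholder path; a real application would get an entry_ref
    // from a file panel or a refs-received message.
    entry_ref ref;
    if (get_ref_for_path("/boot/home/example.avi", &ref) != B_OK)
        return 1;

    // BMediaFile/BMediaTrack talk to the Reader/Decoder plug-ins
    // directly; no media nodes are involved.
    BMediaFile file(&ref);
    if (file.InitCheck() != B_OK)
        return 1;

    for (int32 i = 0; i < file.CountTracks(); i++) {
        BMediaTrack* track = file.TrackAt(i);

        media_format format;
        memset(&format, 0, sizeof(format));
        format.type = B_MEDIA_RAW_AUDIO;    // ask for decoded audio

        if (track->DecodedFormat(&format) == B_OK
            && format.type == B_MEDIA_RAW_AUDIO) {
            void* buffer = malloc(format.u.raw_audio.buffer_size);
            int64 frameCount = 0;
            media_header header;
            // Each ReadFrames() call pulls decoded data straight
            // from the Decoder plug-in.
            while (track->ReadFrames(buffer, &frameCount, &header) == B_OK
                && frameCount > 0) {
                // ... hand the decoded frames to playback ...
            }
            free(buffer);
        }
        file.ReleaseTrack(track);
    }
    return 0;
}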

Best regards,
-Stephan

