Why is this change needed?
Things like hw acceleration à la stagefright, and audio over IP, don't
belong in the media_kit.
Audio over IP may be implemented as a media node, right? And
acceleration as a media plugin, just like ffmpeg? Or would we most
likely just use ffmpeg for that?
Additionally, the codec kit will be low level in the sense that it will
take care of formats and conversions, while media2 will not have to
care about that beyond respecting the constraints of the codec kit.
Ok, this I can understand. I guess it means a "generic" media kit
(handling either completely generic "streams" without caring what is
transported, or handling audio and video in uncompressed form with a
single framerate and colorspace?), and a separate codec kit to take care
of encoding and decoding. This may make sense, but I can't tell without
thinking more about the design. It would be great if you could share
your vision here so we all see where you want to go.
What are the technical reasons for splitting them into a separate
library?
Should I read "splitting them into a separate kit"? If not, I already
stated in a previous post that I may in the end use the same library
for the codec and media2 kit.
My question stands then. Why are they now separate libraries? I fear
that things previously compiled against libmedia may now encounter
undefined symbols. I may be overcautious with this. I can live with it
if there is a valid reason for making a separate library.