[haiku-appserver] Re: nVidia hardware known options for new features (R2?)

  • From: Stephan Assmus <superstippi@xxxxxx>
  • To: haiku-appserver@xxxxxxxxxxxxx
  • Date: Sat, 10 Dec 2005 13:56:55 +0100

Hi Rudolf,

> > thanks for the detailed email! Since drivers usually need to be
> > recompiled for Haiku anyways, I see no technical reason to delay some
> > of this until R2 if it can be done now (given you/we have enough
> > time). The accelerant API would not care about new exported calls, no?
> 
> Indeed: I was thinking about that too. Another thing I'd like to add
> is saturation/intensity controls for video overlay, for example ;-)

Most welcome. :-)

> > I would also export new calls for the blitting functionality. It
> > seems too cumbersome to figure out the "virtual" (off)screen location
> > of a bitmap. Simply call the accelerant function and tell it the
> > starting address, rect size for source and possibly colorspace, and
> > rect for onscreen destination.
> Yeah, of course. A virtual offscreen location == a hack to keep the
> current interface :-)
> I could do this:
> - set up the scaled blit for the current interface,
> - also set up a new version of it for offscreen bitmaps.

Makes 100% sense. :-) I guess for the memory manager, we could start off 
from what is currently contained in the different drivers supporting 
overlay bitmaps, and see what is common among them to get an idea of 
possible constraints.

> BTW: Does anyone know if getting a connection in the B_YCbCr422
> colorspace is supposed to work? Currently I can only get RGB spaces
> going... There's of course a (bigger) chance that this is because of
> my still very limited knowledge about the node subject...

It is definitely supposed to work. The relevant code should be in the 
stampTV sources, which were once available. I think I lost my copy; 
maybe you can still get it from somewhere.

> (Is for instance an MPEG2 decoder just a node that has an input and an
> output? Would a mediaplayer connect to that node? Would the consumer
> node in turn get connected to the MPEG2 decoder? I have no idea yet
> how/if this exists... Any hints?)

Yes. That's how it would work. I'm using a node called "Decoder", which 
you need to instantiate from the dormant nodes (because it is usually not 
already active in the system), to connect to the DV producer if there is 
one. The DV producer doesn't output raw video, and that's why I plug the 
Decoder node in between the DV producer and my consumer. There should 
also be code for this in Cortex, which lets you make these connections 
manually from its GUI.
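For the archives, the dormant-node dance described above looks roughly like this in Media Kit calls (a sketch only, not buildable outside BeOS/Haiku; error checking omitted, and `producer`/`consumer` are assumed to be media_nodes you already hold):

```
// Find a dormant node that can decode the producer's encoded video
// into raw video, instantiate it, and wire it in between.
BMediaRoster *roster = BMediaRoster::Roster();

media_format encoded, raw;
encoded.type = B_MEDIA_ENCODED_VIDEO;
raw.type = B_MEDIA_RAW_VIDEO;

dormant_node_info info;
int32 count = 1;
roster->GetDormantNodes(&info, &count, &encoded, &raw);

// Instantiate it -- it is normally not already running in the system.
media_node decoder;
roster->InstantiateDormantNode(info, &decoder);

// producer -> decoder
media_output out;  media_input in;  int32 n;
roster->GetFreeOutputsFor(producer, &out, 1, &n, B_MEDIA_ENCODED_VIDEO);
roster->GetFreeInputsFor(decoder, &in, 1, &n, B_MEDIA_ENCODED_VIDEO);
media_format fmt = out.format;
roster->Connect(out.source, in.destination, &fmt, &out, &in);
// ...then connect the decoder's raw-video output to the consumer the
// same way, set the time source, and Start() all three nodes.
```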

> > Personally, I am on ATI, so I would not benefit from any improvements
> > you do to the nVidia accelerant for now... :-(
> 
> Well, just get a card for a desktop system.

Which desktop system? :-)

> OTOH: I am getting more and more requests to 'take over' ATI dev. ;-)

If Thomas can't maintain the driver because he has other stuff to do (did 
someone manage to contact him recently?), I guess the ATI driver would be 
in good hands if you took over. That is, if Thomas doesn't mind.

Best regards,
-Stephan

