[haiku-development] Re: nvidia driver trunk to alpha sync needed...

  • From: Rudolf Cornelissen <rudolf.cornelissen@xxxxxxxxx>
  • To: haiku-development@xxxxxxxxxxxxx
  • Date: Wed, 9 Sep 2009 06:56:25 +0000

Hi Andre,
This is not quite what I had in mind when I wrote my mail. My intention was
to indicate that the current alpha version isn't a good choice to keep as-is:
it should either be upgraded to the current trunk version, or downgraded to
a version I would have to look up. If the latter happens I'd like to know
why, since I have received no bug reports indicating that the current trunk
version is faulty. (It isn't.)

Anyhow: to answer your questions:

a/ No. It's a lot of work to set up 'just' that, and I don't have the time to
do it, certainly not within a few days(!). There is no functional code in that
driver yet, since I am concentrating on the pre-NV80 driver.
BTW: acceleration isn't used at all currently, so you won't miss it if it
isn't there, even with a real driver.

b/ No. I would say this amounts to much the same as (a): programming the
RAMDAC *and* the CRTC is what's needed. The only thing (a) additionally
requires is initializing just enough of the card to not depend on a VESA
'preset', which amounts to (almost) nothing if all goes right.

c/ It's not doable. For modelines to work you indeed need a working driver:
a custom modeline is just a custom set of programming instructions for the
CRTC/DAC, so it needs (a).
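To illustrate what a modeline actually encodes: it is nothing more than explicit CRTC/DAC timing numbers plus a pixel clock, and the clock follows directly from the horizontal/vertical totals and the refresh rate. A minimal sketch (the timing numbers are illustrative, in the style of `gtf` output, not taken from the Haiku driver):

```python
# Hedged sketch: what an X11-style modeline boils down to. A real
# modeline would come from a tool like `gtf` or from EDID data.
def modeline_pixel_clock_mhz(htotal, vtotal, refresh_hz):
    """Pixel clock = total pixels per frame * frames per second."""
    return htotal * vtotal * refresh_hz / 1e6

# Hypothetical 1024x768 mode: 1024 active pixels padded out to an
# htotal of 1376, 768 active lines padded out to a vtotal of 807,
# targeting an 85 Hz refresh rate on a CRT.
clock = modeline_pixel_clock_mhz(1376, 807, 85)
print(round(clock, 2))  # prints 94.39
```

Without a driver that can program those totals and that clock into the CRTC and DAC, the numbers have nowhere to go, which is why (c) collapses into (a).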

d/ Gallium3D needs a 2D driver beneath it to function. In our case this will
be the nouveau project driver, which I am, BTW, looking at a lot these days;
the latest updates in our 2D driver come from there. (XFree86 is pretty much
frozen these days; X.org I could still look at, but that will probably
switch over to nouveau too, if it hasn't already.)

I wouldn't call it a single function, BTW: a gfx driver consists of what you
ask for, plus acceleration. The hard part of acceleration is getting the
engine initialized (that cost me four months of full-time work back then on
the pre-GF8xxx driver). Once init is done, the (old BeOS and XFree86 style)
2D accel functions are a relative breeze.


And now for your small request. If you want dithering on a GF8xxx then I
can't do it, since there's no driver yet. If you want it for older cards
supported by the current driver, then please post a bug report and we can
continue there.

Thanks,

Rudolf.


2009/9/8 André Braga <meianoite@xxxxxxxxx>

> On 08/09/2009, at 18:33, Rudolf Cornelissen
> <rudolf.cornelissen@xxxxxxxxx> wrote:
> > Can someone update it?
>
> On that note: since the GeForce 8xxx+ series there's no accelerated 2D
> interface for the "GPGPU" cards. No biggie since VESA is quite fast
> under Haiku... except that VESA is hardcoded to a set of resolutions
> and refresh rates.
>
> In a world where LCD panels are predominant this is once again a non-
> issue. But for the rest of the world, where the majority of computer
> monitors are CRTs, this *sucks*.
>
> So here are my questions:
>
> a) is it possible to enable just enough of the GPGPU driver to allow
> changing the resolution and refresh rates to match the EDID instead of
> the VESA ROM, and then keep using the VESA drawing routines while the
> proper acceleration interfaces aren't ready?
> b) going the opposite way, is it possible to override VESA timings
> with GTF ones, programming the RAMDAC directly, like on X11 with
> modelines?
> c) is (b) doable before the alpha1 deadline? Is this doable at all? Am
> I mixing things up with the X11 modelines and in fact I always had X
> loading the right drivers behind my back?
> d) is it possible to lift just enough of the Gallium3D or traditional
> Xorg drivers to enable this single function?
>
> Now, a small request with nothing to do with timing or refresh rates:
> would you please enable dithering if a flat panel is detected? Or at
> least make it a config file option? I intended to do this myself and
> even found the places in the xorg driver that deal with that, but the
> Haiku video card state struct layout differs from the xorg one quite a
> bit and the offsets are quite different between them, and I wasn't at
> all sure of what address to poke in order to turn on that bit, and to
> be honest quite scared to miscalculate something, flip the wrong bit
> and blow up the GPU or the RAM or what else have you that could go up
> in smoke and flames. O_O
> The laptop in question isn't even mine, and even if I owned one myself
> I'd be scared to death to mess with such sensitive hardware directly.
> On the desktop the videocard is pretty much disposable, but it's a
> GeForce 8400 (which the driver can't yet... drive) and the monitor is
> a CRT (double-yikes).
>
> You, on the other hand, are, of course, better prepared to do this,
> blindfolded even :)
>
>
> Thanks a bunch!
> A.
>
>

