[conspire] supported graphics cards

Rick Moen rick at linuxmafia.com
Sun Jun 30 09:59:24 PDT 2013


Quoting Edmund Biow (biow at sbcglobal.net):

> I think maybe luck may have had something to do with it, maybe the open
> source is with you, Rick.  Unfortunately I have found that it often
> takes quite a bit more than a year for the open source driver to make
> hardware fully functional.

People were telling me that in 2005-6, when I was in sole charge of
certifying all new computer hardware for Linux and Solaris purposes at
Cadence Design Systems, the largest EDA company.  We got new laptop,
server, and workstation model samples drop-shipped directly to my lab at
Cadence for testing on RHEL3 and 4.  (The Solaris testing was less
broad-based as to vendors; we mostly stuck with Sun-branded gear.)

Any given week, I likely had something new from HP, Lenovo, IBM, Sun, or
Dell to test.  Cadence's dumb engineering rules precluded running newer
kernels or X11 software than what was provided in Cadence-certified
release levels of the operating systems.  Some of those were
ridiculously ancient, e.g., RHEL3 Update 1.  I got really good at
playing with kernel-loading parameters like those relating to ioapic,
acpi, lapic, apm, etc., to coax new hardware into satisfactorily running
old distributions.
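
(To give the flavor, an illustrative sketch rather than a recipe; the
kernel version and root device here are made up, and the exact set
varied per machine.  A GRUB kernel line after that sort of tuning
looked something like:

    kernel /vmlinuz-2.4.21-4.EL ro root=/dev/sda2 noapic nolapic acpi=off apm=off pci=noacpi

Each of those parameters disables or constrains an interrupt or
power-management subsystem that new chipsets most often confused on
old kernels.)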

You know the one thing I just _never_ had to do?  Right -- resort to
proprietary video drivers.  Even though I was never permitted to
retrofit newer kernels or X server software, I was absolutely never
unable to get a satisfactory X setup going.  Oh,
actually I just remembered the one exception.  It was a Dell workstation
with all-Nvidia chipsets.  But that was a telling exception, because it
was basically cheap crud, and Cadence ordinarily never even evaluated
cheap crud.

Mind you, there were units that wouldn't run some of the older
Cadence-certified RHEL releases, such as the aforementioned RHEL3 Update
1, regardless of how creative I got with tricks like 'pci=noacpi' and
such.  However, with that one exception, every single laptop, server,
and workstation ran most of the stock 100% open-source builds.

And the running theme was: no cut-rate crud.

In short, the people telling me that in 2005-6 were wrong.

You're now telling me the same thing that people were telling me in
2005-6.  I'm thinking it's probably still wrong.

Of course, there is always problematic hardware.  Let's see if we can
spot the signs of likely trouble.  Could it be newly introduced
chipsets?  Extreme cut-rate, peculiar design?  Both?

Any side-bets, people?  Last chance.  Going, going....

> For instance I bought a couple of AMD Zacate E350 very-low-powered
> CPU/mobos in April 2011, a few months after they came out.

You know, I am strongly in favour of the newer low-power designs as a
category, but there are things about that one that immediately pop out
to me, and the big one is that the AMD E-350 Zacate was a brand-new
early-2011 chipset with an integrated GPU, aimed squarely at budget
notebooks.

> One dual boots Ubuntu 13.04 with the proprietary FGLRX and Debian
> testing with the open source radeon driver; the other has Debian
> stable. The first few months were really a PITA, especially on
> Debian. Sound didn't work at all in Debian until I upgraded to the
> Liquorix kernel, video was crappy, I couldn't log out without the
> screen going black, and I couldn't get into a TTY; the screen would
> just stay black until I rebooted using the Magic SysRq commands. In
> Ubuntu, early on even the proprietary driver wasn't great; I couldn't
> play 1080p content in 11.10. In 12.04, 1080p would play, but with
> artifacts and choppiness. FGLRX was mostly fine for 1080p in 12.10,
> and now it works fine after upgrading to 13.04.  Now that the
> post-Squeeze release freeze has broken up, the Debian testing install
> is able to play 1080p using the 3.9.1-amd64 kernel, but with
> artifacts and tearing. The radeon driver worked OK using the Liquorix
> kernel on the Debian stable system
> in Squeeze (no HD video, of course). But after I upgraded the system
> to Wheezy (current stable) I got a second monitor and I couldn't get
> it to work with the open source driver. I know all about xrandr
> commands, how to craft an xorg.conf, how to create modelines, but it
> just wouldn't work until I installed the proprietary driver.  At this
> point the system is over 2 years old and using the open source driver
> is still marginal.
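
(For the archives, 'xrandr commands' there means incantations along
these lines; a sketch only, since the output names and layout vary per
card and monitor:

    $ xrandr --output DVI-0 --auto --output HDMI-0 --auto --right-of DVI-0

When the open-source radeon driver is healthy, one line like that is
normally all dual-head setup requires; needing a hand-crafted
xorg.conf and modelines is itself a symptom.)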

You forgot to mention that suspend doesn't work with the proprietary
fglrx drivers.  Even with the secret-sauce help of AMD engineers, the
whole thing's buggy.  Because the hardware is buggy _and_ the
proprietary video drivers and their proprietary kernel shim drivers are
_extra_ buggy.  Thus my point upthread about why the LKML engineers
were so driven to distraction by misfiled kernel bug reports traceable
to crap proprietary video drivers that they invented a whole 'taint'
system for the Linux kernel so they could track where the breakage was
coming from and say 'No, not our problem.  If you insist on running crap
code from AMD|Nvidia, take your bugs to them, not us.'
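
(You can watch that mechanism work, too.  Illustrative output; the
value is a bitmask, and bit 0, flag 'P', is the one meaning a
proprietary-licensed module has been loaded:

    $ cat /proc/sys/kernel/tainted
    1
    $ dmesg | grep -i taint
    nvidia: module license 'NVIDIA' taints kernel.

A zero there means a kernel LKML is still willing to hear bug reports
about.)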

Some hardware just isn't a great choice, and apparently a lot of people
bought these units thinking they'd make great low-power DVR units
running XBMC, and were unpleasantly surprised by the consequences of
their being cheap and peculiar hardware.

FWIW, it's reported that the unsupported beta builds of OpenELEC (_not_
running fglrx), a leading XBMC distribution, have for two years given
excellent results with AMD E-350 Zacate, which in turn suggests what
your best non-fglrx solution should be:  leading-edge X.org software.


> I'm not a gamer, I just want my system to play HD video and gracefully
> do dual monitors. Many desktop environments are expecting more and more
> out of video chipsets, particularly Unity and Gnome 3, both of which are
> periodically threatening to eliminate 2D fallback mode.

Yeah, good thing for me that I really don't care about that shit -- but
note that, if I _did_ care about that shit, I'd not just buy newly
introduced extreme-budget, peculiar hardware and worry about driver
support only after paying my money.

> This brings up another difference between ATI & Nvidia. Nvidia supports
> its older hardware with its proprietary driver for a relatively long
> time.

I notice that people continue to use the proprietary driver out of
inertia long after it's no longer necessary.  One of my several points
to people is to warn them about that syndrome, to remind them that the
'relatively long time' of which you speak runs heavily into the period
when you can do away with that crutch.
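
(Checking whether you're still leaning on the crutch is cheap.  A
sketch, assuming a typical setup; the module names are the usual
suspects, not an exhaustive list:

    $ lsmod | egrep 'nvidia|fglrx|nouveau|radeon'

If the proprietary module is the one loaded, purging its packages and
rebooting generally lets the free driver take over automatically.)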

> I really want to support AMD/ATI because I don't want there to be an
> Intel monopoly for desktop CPUs, but getting the stuff to work properly
> can be a challenge.

Frankly, Intel tends to do so much better a job of all-around
functionality without the need for secret sauce that I'm relatively
apathetic towards
AMD in that regard.  Screw 'em.  Yes, I don't like an Intel monopoly
either, but this other crud is not a reasonable alternative.

My former employer VA Linux Systems has a lot to answer for in its
substantial support for Nvidia in 2000, IIRC assigning Mark Vojkovich
to develop proprietary OpenGL infrastructure for their video cards, and
thus launching the whole current mania for gamer-focussed proprietary
video and whizzing-windows compositing 3D desktop environments.  

Yeah, suddenly it's a perceived problem if Canonical's Unity DE doesn't
run on newly released extreme-budget notebook chipsets without either
cutting-edge software upgrades or proprietary drivers?  Wow, Edmund.
This is me just not caring very much.
