# New video card?

## delta407

I'm seriously considering upgrading my video card from an older (but decent) ATI card (Radeon 7000/Radeon VE, to be precise). As I have fully switched to Gentoo, what has the best/easiest/fastest support under Linux? Sharing of opinions, experiences, and advice is most welcome.

Also, I don't need killer graphics, though it would always be nice. Games like Continuum, which have minimal system requirements, run like crap under VMware. (I get 30 FPS on my dual 1.0 GHz under Gentoo and 45 FPS on a P90 under '95. It's sad, really.  :Very Happy: ) I'm aiming for the $100 price range or so, since I don't really need it, and I figure a hundred bucks ought to get me decent performance.

----------

## rommel

Don't you think a GeForce would be an easier card to use under Linux? And it will certainly outperform an ATI card... I just installed the card you want to buy in a system I built for someone, but it will be running WinXP Pro... The card is good, but I think I would use an nVidia product just for the ease of driver setup. Hopefully ATI will get it together in the near future... sounds like it, anyway.

----------

## jtmace

Nvidia and Nvidia alone..   

No other v/c manufacturers support Linux like Nvidia.. I get screaming performance out of mine..  Loading Nvidia's custom kernel and GLX modules is painless..  

Please get one.
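For anyone who hasn't done it yet, here's a rough sketch of that setup on Gentoo (package names as I remember them from the portage tree, so double-check before you emerge):

```shell
# as root: build the nVidia kernel module and the GL libraries
emerge nvidia-kernel nvidia-glx
modprobe nvidia

# then point X at them in /etc/X11/XF86Config:
#   Section "Module"  ->  Load "glx"      (and drop Load "dri" / Load "GLcore")
#   Section "Device"  ->  Driver "nvidia" (instead of the stock "nv")
```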

----------

## Hamshrew

ATI is making a lot of progress... still, Nvidia is ahead in most respects.  But if you're using SMP, you might want to go with an ATI... I've heard of some driver issues with the nvidia cards.

Then again, I've been using 44 dual Athlons with GeForce3s for months now, with no ill effects...

----------

## pjp

I too am considering replacing my ATI card (Rage of some variety).  I was looking over my startx output and noticed some errors.  In looking for answers, I've discovered ATI support is "pitiful".  However, I've also read about quite a few problems with Nvidia cards.  Are there any glaring problems with Nvidia still?  The Nvidia doc mentions "AGPGART vs. NVAGP", and it suggests that neither of these works in AGP mode.  Is this true, or just with certain MB chipsets (if I understood correctly)?

Also, $100 is about $100 more than I want to spend  :Very Happy: .  Seriously though, I don't do any 3D gaming, but would like to have some ability just in case.  Plus, the prospect of using evas with e17 is interesting.  What are some low-to-mid range, but worthwhile, 3D Nvidia cards?

Thanks for your input.
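From skimming the nVidia doc, it looks like you can pick the AGP backend per-card in XF86Config (the values below are how I read the README, so don't take my word for it):

```
Section "Device"
    Identifier "nvidia-card"
    Driver     "nvidia"
    # "NvAGP": 0 = disable AGP, 1 = use nVidia's NVAGP,
    #          2 = use the kernel's AGPGART, 3 = try AGPGART, then NVAGP
    Option     "NvAGP" "1"
EndSection
```

Once X is up, `cat /proc/driver/nvidia/agp/status` is supposed to show which backend (if any) actually came up.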

----------

## Dolio

To Kanuslupus:

A quick look at Newegg.com (which, from what I've seen, usually has the best prices among online vendors that get good marks for reliability) shows you can get a GeForce2 GTS for $45 (which is pretty old, I guess, although I think it's the highest-end GeForce2), and a GeForce4 MX 440 for about $90 (which is like a really big GeForce2: no pixel shaders or anything fancy like that. As I recall, GeForce3s beat or tie them at a lot of stuff).

It looks like anything with a pixel/vertex shader is more than $100 ($130 for a low-end GF3, $160 for a low-end GF4 TI). I don't know if a GeForce2 will be better than yours, but the GeForce4 MX might be (I've not done much ATI research).

I'm currently using a GeForce3 (back before they came out with the Titanium gimmick), along with an ALi Magic DDR chipset (IWill KA266-R, one of the really old DDR mobos (geeze, really old and it's barely been a year)), and I'm using agpgart without having any problems. I think my average FPS for the evas_test was somewhere around 300 or 400 (Athlon TBird 1.4 GHz + 768 MB PC2100 DDR, since that matters too), and I've had no problems running any 3d games (Q3A, UT, Jedi Knight, etc) at 1280x1024x32.  I'm not sure how extensively pixel shaders are used in all those (I'd guess not terribly), so you might get similar performance from the $90 GF4 MX 440 (with similarities in the other hardware, of course).

Anyhow, that's my experience. Sorry for the long post. Hope it helps.

EDIT: Oh, I forgot one thing. If you go with the GeForce4 Ti 4200 (which is about $70 more than the $100 more than you want to spend), there are two types, and I seem to recall that the 64 MB card gets higher scores than the 128 MB card, because the GPU is clocked higher, so you should probably check that out before you buy either one (and don't just take my word for it, because I could have it backwards). Also, the 64 MB one costs less, I think, so it's tricky.  :Smile: 

----------

## pjp

Thanks... I usually end up spending more than I budget anyway.

----------

## smtanner

I would recommend getting the GeForce2 GTS from Newegg for 45 bucks.  This is still a very good card (I am currently using one).  Just go to Newegg and read the reviews on this card.  Easily the best value you can get.  I have never had any problems with this card on Linux.  I play Tribes 2, Quake 3, and UT without problems.

----------

## pjp

I'll give it a look.  I'm not a big gamer, so it probably won't matter, but how are your frame rates etc.?

I'm considering Neverwinter Nights and possibly Warcraft 3.

----------

## smtanner

I just ran a quick test in quake 3.  At 1024x768 with all detail maximized, I get 91.7 fps average.
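For anyone who wants to compare numbers, this is roughly how you run it from the Quake 3 in-game console (the stock demo name varies by point release, so check what yours ships with):

```
// open the console with ~ and run:
timedemo 1
demo four        // "demo001" / "demo002" on older point releases
// the average FPS is printed to the console when playback finishes
```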

----------

## ColdPack

I was looking through the forums for some video card suggestions because I am sick and tired of crashes in kde.

I have a GeForce 2 MX pci card and no matter what I do (there's a hundred suggestions in these forums on this exact problem) nothing works.

So I was thinking of ditching Nvidia because I've never had these problems with other video cards.

What am I to do?

CP

----------

## phypor

I have one at home... and it gives 90+ FPS in Q3A and 60+ FPS in Urban Terror with max quality (but no AA).

I also have a Hercules Kyro II (64 MB) that does not play well in Linux (although there are up-and-coming drivers)... if you see one of these guys cheap, don't get it unless you really do want to hack to get your 3D working...

BTW... the MX 200 is supposed to be the lowest end of the latest GeForce models, but Linux reports that mine has DDR (which is generally considered much better than SDR on a video card)... basically, get a 64 MB card for sure, and DDR if possible.

On the whole, the GeForce2 is a kickass card for price, Linux compatibility, and performance.

~phypor

----------

## delta407

Oh, BTW, I went out and got a GeForce4 MX440 (64 MB DDR). Though I still have three weeks left on the return policy, so I might upgrade to a Ti 4400.  :Wink: 

----------

## trythil

I've been looking at the Matrox Parhelia to replace this GF2 GTS (oh no, I get 130 FPS instead of 160 FPS in Quake 3, God help me), but I'm not sure what the Linux driver status on that card is.  Matrox has been very good to Linux users in the past, and indeed one of the "supported operating systems" on the Parhelia tech sheet is Linux, but does anybody know the status of OpenGL-accelerated drivers for it?

Because that's looking like a mighty fine card, despite what the framerate freaks say, and might be a good replacement for anybody here.  :Smile: 

----------

## delta407

I can't say about Matrox, but I have to say that nVidia cards (even a GeForce DDR from way back in the day) give both better performance and better quality than ATi cards. The ATi card I had in was only about a year old and wasn't exactly cheap at the time, and things like stencil buffering just flat out suck on it. I'm putting the way older GeForce DDR back in, and either retiring or eBaying the ATi.

Also, the ATi card gives about 120 FPS in Continuum, whereas the much older GeForce DDR never falls below 500. I've learned my lesson.  :Wink: 

----------

## pjp

 :Shocked:  2 Parhelia cards on newegg.com, one for $350, the other $399.  The GF2 GTS (on newegg) is $89.

What does the Parhelia do other than have a (marginally) lower frame rate?

----------

## delta407

 *kanuslupus wrote:*   

> What does the Parahelia do other than have a (marginally) lower frame rate?

 

Lighten your wallet?

----------

## Swishy

I'm currently running an old Matrox G400 16 MB and have found I can still run most games playably, and the image quality is better than comparable GeForce2s, although the framerate is nowhere near it. Due to the ease of installation and image quality, I think I'll hang out for the Parhelia once the budget can afford it.

Although the clock speed is down on the GF4, I think the 256-bit bus and 20 GB/s throughput will compensate. Oh, and 16x antialiasing...  :Wink: 

Oh yeah, it has a 512-bit GPU.

----------

## mglauche

One thing about Matrox: you have to be patient. Yes, their drivers for the G400 rock, but... I bought a G400 right after it came out, and back then the drivers (both win32 and Linux) were in very bad shape. Matrox is not the fastest with drivers, but if you are patient it's OK; they still update the G400 drivers...

Also keep in mind they only opened their G400 docs. The new card may be quite different, so it might take some time until drivers pop up. (Usually Matrox does not write drivers themselves for non-win32 platforms, unlike nvidia, but opens their hardware docs instead.)

----------

## 3x9

Delta, please keep us informed about your conclusions with your new vid card.

I was certain menuconfig mentions that Nvidia was problematic if used with SMP?

Do you not use dual (yuck) Socket 370? (Try dual K7 or even Duron, believe it or not.)

Hope this is not a new subject, but has anyone had success with the Radeon VE 32 DDR? The only way it was partially usable was in non-accelerated framebuffer mode. I bought it before anyone other than Matrox had single-card dual-display capabilities. OK for Win$$, but I finally had to remove it and put in an old S3 ViRGE.

The Radeon VE was untenable, with constant freezes when resizing a window or dropping out of the GUI into terminal mode. XFree86 states support for ATI, but in usage it does not exist for this dual-head display card. No games, nothing to stress it; it just will not co-exist under Linux.

Am considering an (Abit or AOpen) GeForce4 MX 420/440 64 DDR? Any comments about this in Linux?

Thanks, any advice appreciated.

----------

## delta407

Well, my first conclusion is that it is only marginally faster than a GeForce2, and for that reason I will return it in favor of a GeForce4 Ti 4400, which is about double the price but about three times the card. The MX440 works great under Linux on my dual Socket 370, no performance or compatibility problems at all. (And BTW, I'm running dual PIIIs because it was cheap compared to Duron solutions at the time.)

As far as branding on nVidia cards goes, it's the same chipsets using the same drivers; just pick a brand you're happy with and everything should be fine. The GeForce4 MX 440 I bought is happy. (Well, with my hardware, at least.) It's a little underpowered for $140 (the GeForce3 has more features than the GeForce4 MX), which is why I'm going to get a better one, but it works fine.

----------

## Swishy

 *mglauche wrote:*   

> One thing about Matrox: you have to be patient. Yes, their drivers for the G400 rock, but... I bought a G400 right after it came out, and back then the drivers (both win32 and Linux) were in very bad shape. Matrox is not the fastest with drivers, but if you are patient it's OK; they still update the G400 drivers...
> 
> Also keep in mind they only opened their G400 docs. The new card may be quite different, so it might take some time until drivers pop up. (Usually Matrox does not write drivers themselves for non-win32 platforms, unlike nvidia, but opens their hardware docs instead.)

 

Yeah, I see "coming soon" is the release date for the Parhelia Linux drivers on their site.

 :Wink: 

----------

## Xor

Well, my point is that nVidia produces BS drivers... memory-leaking and unstable... but fast, admittedly.

The new ATI 9x00s do outperform any GeForce... and ATI has, as far as I know, better support for the community (better... not perfect... like Matrox  :Smile: ).

To make it clear: I would sacrifice those 5% performance when I have the possibility to get a  :Cool:  ATI.

----------

## g00se

Has anyone tried dual-head with either an ATI or nVidia card? How's Xinerama working, and how about GLX support and video overlays?

I'm currently using a Matrox G450 with merged Xinerama (it's some trick in the Matrox drivers), so I'm able to get 3D acceleration on both screens. Xvideo is working on screen 1 only, though. Oh, and how's DVI support working? I'd love to use it with my flat panel.

While this Matrox has nice features and everything is working great, the 3D speed just isn't there. And the Parhelia has no Linux support yet.

----------

## Hypnos

FWIW, I have an ATI Radeon 7500 32MB on my laptop (1.6GHz P4 w/ 512MB RAM), and I'm playing Quake 3 at 60-90 FPS (heavy to clear action) at 1024x768 highest quality under X with the latest drivers from the DRI project.  Ironically, it actually renders better under Linux/X than in Windows, where the driver seems to have issues.  The rendering is a little rough around the edges, but seems comparable to NVidia under Linux/X.  (I would not claim to be a trained eye, however.)

Having an open source driver is useful also if you have to mess with the kernel a lot, as I do to get ACPI, modem, etc. to work.

Only complaint is that I can't switch out of X to a different virtual console, because when I switch back my display, keyboard and mouse lock hard.
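In case it helps anyone, the DRI bits of an XF86Config for a Radeon boil down to a few lines (the identifier below is mine; the driver name is the stock XFree86 one):

```
Section "Module"
    Load  "dri"
    Load  "glx"
EndSection

Section "Device"
    Identifier "ati-card"
    Driver     "radeon"    # the open-source XFree86/DRI driver
EndSection

Section "DRI"
    Mode 0666              # let non-root users use direct rendering
EndSection
```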

----------

## pjp

 *Xor wrote:*   

> Well, my point is that nVidia produces BS drivers... memory-leaking and unstable... but fast, admittedly.
> 
> The new ATI 9x00s do outperform any GeForce... and ATI has, as far as I know, better support for the community (better... not perfect... like Matrox 
> 
> To make it clear: I would sacrifice those 5% performance when I have the possibility to get a  - ATI

 If I'm not mistaken, support for Rage 128 cards is awful.

----------

## delta407

FYI: I got a PNY GeForce4 Ti 4400, which is a beast of a card. It's about three times heavier than my GeForce DDR, about two inches longer (I had to re-arrange cables to make it fit), and the card is maroon instead of the standard PCB green. The heatsink/fan combo looks really cool, the fullscreen AA works great, and I've had no stability problems with SMP.

Oh, and the box says it requires a 350-watt power supply. (See? Beast!)  :Very Happy: 

----------

## pjp

 *delta407 wrote:*   

> Oh, and the box says it requires a 350-watt power supply. (See? Beast!) 

   :Shocked: 

----------

## pjp

Anyone familiar with the ASUS V8420 Ti4200 Deluxe?  I'm intrigued by the introduction of this review.   *Quote:*   

>  The V8420 is a Ti4200 with a serious identity crisis. It looks like a Ti4400 and it performs like one, but it costs less and comes with a ton of features the Ti4400 doesn't come with...oh yeah, it's a budget card too.

 

later in the article:  *Quote:*   

> Ti4200 Deluxe as it's the sure winner in the sub-$200 category for the enthusiast.

 

----------

## Swishy

 *kanuslupus wrote:*   

>  *delta407 wrote:*   Oh, and the box says it requires a 350-watt power supply. (See? Beast!)   

 

Reckon the lights dim when he fires the box up? Especially with dual processors, lol  :Very Happy: 

----------

## syadnom

nVidia would be the best cards to get because of good driver support, but as of Jan 1, ATI has "promised" to fully support Linux for all of its graphics cards... if they can get the drivers right  :Smile: 

----------

## pjp

 *syadnom wrote:*   

> nvidia would be the best cards to get because of good driver support, but as of jan1, ati has "promised" to fully support linux for all of its graphics cards,  if they can get the drivers right 

 Anyone know what their track record has been since Jan 1st?  I'm assuming this applies to new cards and not older ones.

----------

## syadnom

That is Jan 1, 2003....  :Smile: 

Full support for all ATI chips based on the 8500 or more recent, including the 9000, 9700, and future cards.

They already have functional 9700 drivers; they were showing a realtime rendering demo with Blender on a Radeon 9700 under Linux.

----------

## pjp

:rubs hands together:

Looks like they'll be out for a little while to get some bug fixes before I buy a dual Opteron system.  :Very Happy:   Hopefully ATI support for linux will be even better than nvidia.

----------

## syadnom

I'm hoping so, kanuslupus. I much prefer ATI to nVidia. I have never been a fan of nVidia chips, and I was SOOO disappointed when nVidia bought 3dfx. I miss my Voodoo5, and it's about time for a Voodoo7 (which would be absolutely incredible, and I can say that because it's complete vaporware, so I can pump up the specs just like the REAL 3D chip companies do  :Smile:  )

----------

## jean-michel

 *Quote:*   

> 
> 
> Has anyone tried dualhead with either ATI or NVIDIA card? How's Xinerama working and how about GLX support and video overlays? 
> 
> 

 

First, I am considering an ATI 9700 if I can determine whether or not their drivers (or the newly released XFree86 4.2.1 server) can support DVI output at 1920x1200 resolution (the native resolution of my wicked HDTV/LCD monitor at home).  I have not delved into that deeply yet, but as things stand now I was unable to get my ATI 8500 to do more than 1280x1024 via the DVI port using the XFree86 drivers.  I am currently using an nVidia GF4 Ti4600 with dual DVI out at home, with the current nvidia drivers, which do drive a single monitor very nicely at the aforementioned, obscene resolution.

I use and have deployed nvidia cards at work in the past, and dual head is very easy to get going (no xinerama required if the card is dual-headed, and thus no need to give up glx acceleration).  That having been said, xinerama works just fine with the current nvidia drivers (1.0.3123), indeed it works great even in conjunction with nvidia's dual head support, at least with the quad-head nVidia Quadro4/400NVS I'm using (it appears as two dual headed cards on the pci bus).

Relevant XF86Config details follow (the quad head config uses xinerama to link two screens, each of which is dual headed and drives 2 monitors.  For one dual-headed setup just drop the xinerama config and use the dual headed nvidia stuff within the 'screen' definition)

```
# **********************************************************************
# Graphics device section
# **********************************************************************

Section "Device"
    Identifier  "nvidia-card-1"
    VendorName  "Unknown"
    BoardName   "Unknown"
    Driver      "nvidia"
    BusID       "PCI:3:0:0"
    # Uncomment the following option if you see a big white block
    # instead of the cursor!
    #    Option      "sw_cursor"
    Option      "DPMS"
EndSection

Section "Device"
    Identifier  "nvidia-card-2"
    VendorName  "Unknown"
    BoardName   "Unknown"
    Driver      "nvidia"
    BusID       "PCI:3:4:0"
    # Uncomment the following option if you see a big white block
    # instead of the cursor!
    #    Option      "sw_cursor"
    Option      "DPMS"
EndSection

# **********************************************************************
# Screen sections
# **********************************************************************

Section "Screen"
    Identifier "screen1"
    Device      "nvidia-card-1"
    Monitor     "IBM T750 LCD"
    DefaultColorDepth 24
    Option  "IgnoreEDID"    "1"
    Subsection "Display"
        Depth       8
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Subsection "Display"
        Depth       16
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Subsection "Display"
        Depth       24
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Subsection "Display"
        Depth       32
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Option "TwinView"
    Option "ConnectedMonitor" "dfp,dfp"
    Option "SecondMonitorHorizSync"   "30-64"
    Option "SecondMonitorVertRefresh" "56-75"
    Option "TwinViewOrientation"      "LeftOf"
    Option "MetaModes" "1280x1024, 1280x1024"
EndSection

Section "Screen"
    Identifier "screen2"
    Device      "nvidia-card-2"
    Monitor     "IBM T750 LCD"
    DefaultColorDepth 24
    Option  "IgnoreEDID"    "1"
    Subsection "Display"
        Depth       8
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Subsection "Display"
        Depth       16
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Subsection "Display"
        Depth       24
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Subsection "Display"
        Depth       32
        Modes       "1280x1024" "1152x864" "1024x768" "800x600" "640x480"
        ViewPort    0 0
    EndSubsection
    Option "TwinView"
    Option "ConnectedMonitor" "dfp,dfp"
    Option "SecondMonitorHorizSync"   "30-64"
    Option "SecondMonitorVertRefresh" "56-75"
    Option "TwinViewOrientation"      "RightOf"
    Option "MetaModes" "1280x1024, 1280x1024"
EndSection

Section "ServerLayout"
    Identifier "QuadHead"
    Screen "screen1"
    Screen "screen2" RightOf "screen1"
    Option "Xinerama"
    InputDevice "Mouse1" "CorePointer"
    InputDevice "Keyboard1" "CoreKeyboard"
EndSection
```

I've yet to attempt dual headed support with an ATI card, having been unable to find a dual DVI ati card to my liking that doesn't cost $990, but have read online that a number of them do work (though how well I cannot personally say).

Stability:  All my systems are dual processor (dual PII/450s, dual PIII/500s to PIII/1 GHz, dual Athlons).  The quad-headed card has been rock solid (somewhat surprisingly), while some of the cheaper nVidia cards, especially the older GeForce MX cards, have had their share of stability problems in the past.  Celestia seems to be able to crash just about any nVidia card (untested on the quad, since it is running Gentoo 1.4 and Celestia doesn't compile with gcc 3.2 and/or the current OpenGL stuff).  The Ti4600 has been unstable on Gentoo 1.2 and 1.3 systems running XFS, but so far appears to be pretty stable on a Gentoo 1.4 system on an ext3 filesystem (though, as I said, I haven't been able to compile Celestia in order to do the 'Celestia crash test' yet).

----------

## jean-michel

I forgot to add that I use mplayer, kino, cinelerra, and other software that routinely uses xv video overlays all the time.  All of them work very nicely with all of the nvidia cards I've mentioned, including the quad-headed Quadro4/400NVS.  The only minor bug seems to be xine's use of xv, which somehow locks xv into a black-and-white mode that only a restart of the x server will clear up (even in other software that otherwise has no trouble using full 24-bit color).  My solution was to unmerge xine and stick with mplayer.  :Sad: 

----------

## aanund

Here is a small generic comparison chart for different gfx cards:

```
Drivers  |  3D  |  2D  |  Manufacturer
---------+------+------+-----------------
    1    |   3  |   1  |  Matrox
    2    |   1  |   3  |  Nvidia
    3    |   2  |   2  |  Ati
```

Now, what does this mean? If you want to run desktop applications, you cannot go much wrong with a Matrox card, since Matrox has CRYSTAL 2D. Also, Matrox supports 3 screens on the Parhelia (supposedly nice if you need 3 screens  :Smile: ).

If you want 3D all day, go for Nvidia, who _at_the_moment_ outperforms ATI, at least if you compare performance to price (yes yes, I know about the 9700, but since those are not readily available, they do not count). And Nvidia cards generally stink at 2D, being blurry and unfocused.

If you want both, go ATI. ATI has 'almost' as good 2D as Matrox, and 'almost' as good 3D as Nvidia (especially with the 9700), but getting the cards to perform in Linux is something else altogether.

Just my thoughts.

----------

## west

 *aanund wrote:*   

> And Nvidia cards generally stink at 2D, being blurry and unfocused.

 

So true... my new R8500 blew me away the first time I powered my machine on; I came from a GF2 MX... phew...

I'm definitely done with nVidia after that. GeForce cards almost make my eyes hurt now  :Wink: 

----------

## sheepdog

 *jean-michel wrote:*   

> I have not delved into that deeply yet, but as things stand now I was unable to get my ATI 8500 to do more than 1280x1024 via the DVI port using the xfree drivers.
> 
> 

 

I have a dual-head ATI Radeon 8500 and I would love to know how you got output on the DVI port.  I really want to use this card with dual monitors.  Any clues you can offer would be greatly appreciated.  Thank you.

----------

