# GeForce FX 5900XT slow... need Guru.

## OnoSendai

Hi... my GeForce FX 5900XT is running very slow.

On glxgears I get about

2383 frames in 5.0 seconds = 476.600 FPS.

Benchmarking with ut2004 in 800x600 low quality gives me about 38 FPS.

My computer:

Athlon XP 2600+

256 MB ram

GeForce FX 5900XT

cat /proc/driver/nvidia/agp/status gives me

```
Status: Enabled
Driver: NVIDIA
AGP Rate: 8x
Fast Writes: Enabled
SBA: Enabled
```

dmesg | grep agp gives me

```
Linux agpgart interface v0.100 (c) Dave Jones
```

glxinfo gives me:

```
...
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.3
...
```

The additional power connector is plugged in, and I'm running the latest driver (1.0-6111). Kernel version is 2.6.8.

I don't know what to do... any suggestions?

Thanks,

Henrik

----------

## _Nomad_

Well... I'm afraid there really isn't that much one can do, the nvidia driver being closed source and all... However, I've noticed that this latest release of the nvidia drivers sometimes has some serious performance issues... Try downgrading to 6106-r1 and see if it makes any difference...

Also... I'd recommend using nvidia's own AGP interface... just set Option "NvAGP" "1" in xorg.conf...
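For anyone wondering where that goes: the setting lives in the Device section of xorg.conf. A minimal sketch (the Identifier here is just an example name):

```
Section "Device"
    Identifier  "NVIDIA Card"
    Driver      "nvidia"
    # 0 = no AGP, 1 = NVIDIA's internal AGP, 2 = kernel agpgart,
    # 3 = try agpgart first, fall back to NVIDIA's internal AGP
    Option      "NvAGP" "1"
EndSection
```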

----------

## Satyrinox

Would you care to elaborate? Where do you put that in xorg.conf?

----------

## John5788

Try using my nvidia config, I have the same GPU as you do, a GeForce FX 5900XT:

```
Section "Device"
    Identifier  "NVIDIA GeForce"
    Driver      "nvidia"
    Option      "NvAGP" "3"
    Option      "NoLogo" "1"
    Option      "RenderAccel" "1"
    Option      "CursorShadow" "1"
    Option      "ConnectedMonitor" "CRT"
    VideoRam    131072
EndSection
```

With that and the 6111 nvidia drivers I get 7000+ FPS in glxgears.

----------

## Muso

I use the same card ... here is my driver section..

```
Section "Device"
    Identifier  "GeforceFX"
    Driver      "nvidia"
    Option      "NoLogo"        "True"
    Option      "HWCursor"     "false"
    Option      "DigitalVibrance" "35"
    Option      "NvAgp"            "3"
    Option      "RenderAccel"   "True"
    #VideoRam    131072
    # Insert Clocks lines here if appropriate
EndSection
```

I'm using 1280x1024 and get an average of 9650 FPS with glxgears.

----------

## John5788

Are you also using 16-bit color?

----------

## OnoSendai

OK, here is what I've tried:

First I disconnected the additional power connector of the GeForce card to check whether it's actually doing anything. Performance dropped, so the connector is working...

So I connected the card again...

Then I switched back from xorg to XFree86 and got exactly the same results.

So I installed xorg again.

Downgrading the driver didn't help either...

I'm running a TwinView setup at 24-bit color depth. Here is the relevant part of my config:

```
Section "Device"
    Identifier  "GeForce FX-5900XT"
    Driver      "nvidia"
    VideoRam    131072
    BusID       "PCI:1:0:0"
    Option      "NvAgp" "1"
    Option      "RenderAccel" "on"
    Option      "NoLogo" "true"
    Option      "DPMS"
    Option      "TwinView"
    Option      "SecondMonitorHorizSync"     "31.5-96"
    Option      "SecondMonitorVertRefresh"   "50-160"
    Option      "MetaModes"                  "1280x1024,1280x1024; 1024x768,null; 800x600,null; 640x480,null"
    Option      "ConnectedMonitor"         "crt,crt"
EndSection
```

Now I've configured a single-screen layout and switched the default color depth to 16-bit:

```
Section "Device"
    Identifier  "GeForce FX-5900XT"
    Driver      "nvidia"
    Option      "NoLogo"        "True"
    Option      "HWCursor"     "false"
    Option      "DigitalVibrance" "35"
    Option      "NvAgp"            "3"
    Option      "RenderAccel"   "True"
EndSection
```

And now it gets even more interesting. The glxgears score didn't change, and the ut2004 benchmark got slightly worse (!).

My results are now an average frame rate of 36 FPS.

Switched back to the TwinView layout.

Then I removed agpgart from my kernel.

dmesg | grep agp gives me nothing, but

cat /proc/driver/nvidia/agp/status still shows

```
Status:          Enabled
Driver:          NVIDIA
AGP Rate:        8x
Fast Writes:     Enabled
SBA:             Enabled
```

But the results didn't change.

Is the information provided by /proc/driver/nvidia/agp/status reliable?
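The Driver: line in that file is the part worth checking: it tells you which AGP backend actually claimed the bus (NVIDIA's internal one vs the kernel's AGPGART). A quick way to pull it out, shown here with sample file contents in a variable for illustration; on a real system you would read the file directly:

```shell
# Sample /proc/driver/nvidia/agp/status contents (illustrative)
status='Status:          Enabled
Driver:          NVIDIA
AGP Rate:        8x
Fast Writes:     Enabled
SBA:             Enabled'

# Extract which AGP backend claimed the bus
backend=$(printf '%s\n' "$status" | awk '/^Driver:/ {print $2}')
echo "AGP backend: $backend"
```

If this prints NVIDIA while you expected the kernel's agpgart (NvAGP "2"), the kernel driver never bound to the chipset.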

My guess is that my AGP support is screwed up...

According to http://www.linuxforen.de/forums/showthread.php?t=146195&highlight=benchmark+ut2004 I should get at least 50 FPS out of my box in ut2004...

Maybe I'll get a second hard disk, install something like Fedora or Mandrake (or even W2K... no... just kidding...) on it and do the benchmark there...

----------

## Matrix7

Did you remember to run:

opengl-update nvidia

after emerging xorg?

----------

## OnoSendai

Yep... I did an opengl-update nvidia...

----------

## OnoSendai

OK... I've installed Mandrake on an old hard disk. And although the system feels a lot slower, I got much higher scores in 3D benchmarks.

glxgears now gives me

31214 frames in 5.0 seconds = 6242.800 FPS

And on the ut2004 benchmark I now get a score of 59 FPS.

That's nearly twice as fast (!)...

I used the same nvidia drivers (1.0-6111) but the kernel version was 2.6.3...

The information given by glxinfo is the same on both systems...

```
mandrake # glxinfo > glxinfomandrake
gentoo # glxinfo > glxinfogentoo
diff glxinfogentoo glxinfomandrake
```

I thought maybe it had something to do with my kernel config. So I did a "make oldconfig" under Mandrake, copied the config to my Gentoo system, and recompiled my kernel.

It didn't make any difference... still as slow as ever...

Maybe I should downgrade my kernel to 2.6.3? But I can't imagine that would help...

By the way, what about the USE flags? I'm currently using:

```
USE="3dnow X acpi alsa apm -arts audiofile avi cdr crypt cups dga directfb divx4linux doc dvb dvd dvdr encode -esd fbcon flash foomaticdb gif gphoto2 gpm gtk2 gstreamer icq imagemagick jack java joystick jpeg -kde libwww lirc maildir mmx mozilla mpeg mysql ncurses nls offensive oggvorbis opengl oscar -pcmcia pda pdflib plotutils png ppds -qt quicktime readline samba scanner sdl slp spell sse tiff truetype unicode usb videos wxwindows xinerama xml2 xmms xv xvid zlib x86"
```

I still have no idea why it doesn't work out...

----------

## sindre

Maybe Mandrake uses the kernel's agpgart, while you use the nvidia AGP driver on Gentoo. If you want to use the kernel's agpgart, you need support compiled in for the right chipset.
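If you go that route on a VIA board (the via-agp module shows up in the lsmod output later in this thread), the relevant 2.6-era kernel options would be roughly:

```
# Device Drivers -> Character devices (2.6-era menuconfig)
CONFIG_AGP=y        # /dev/agpgart (AGP support)
CONFIG_AGP_VIA=y    # VIA chipset support (can also be =m to build as modules)
```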

----------

## OnoSendai

Oh, sorry... I forgot to mention: the Mandrake kernel indeed uses the agpgart module as well as the via-agp module for my AGP chipset...

Here's the lsmod list:

```
Module                  Size  Used by
ipv6                  248644  6
sg                     37792  0
ohci_hcd               20484  0
via_rhine              20296  0
via_ircc               24656  0
irda                  131196  1 via_ircc
crc_ccitt               2176  1 irda
8139cp                 19712  0
es1371                 33024  0
ac97_codec             18316  1 es1371
via_agp                 9088  1
agpgart                33512  2 via_agp
evdev                   9472  0
tsdev                   7296  0
usbhid                 42496  0
parport_pc             34304  1
lp                     11692  0
parport                40200  2 parport_pc,lp
snd_ens1371            23972  0
snd_rawmidi            24292  1 snd_ens1371
snd_seq_device          8200  1 snd_rawmidi
snd_pcm                94152  1 snd_ens1371
snd_page_alloc         11464  1 snd_pcm
snd_timer              24324  1 snd_pcm
snd_ac97_codec         67460  1 snd_ens1371
snd                    55268  6 snd_ens1371,snd_rawmidi,snd_seq_device,snd_pcm,snd_timer,snd_ac97_codec
gameport                4608  2 es1371,snd_ens1371
tuner                  18960  0
af_packet              21320  0
tvaudio                22092  0
msp3400                23316  0
bttv                  151116  0
video_buf              21124  1 bttv
i2c_algo_bit            9224  1 bttv
v4l2_common             6272  1 bttv
btcx_risc               4744  1 bttv
i2c_core               23440  5 tuner,tvaudio,msp3400,bttv,i2c_algo_bit
videodev                9728  1 bttv
soundcore               9696  3 es1371,snd,bttv
usbmouse                5632  0
ehci_hcd               28228  0
uhci_hcd               31248  0
usbcore               112804  7 ohci_hcd,usbhid,usbmouse,ehci_hcd,uhci_hcd
nvidia               4818580  12
8139too                24192  0
mii                     5056  3 via_rhine,8139cp,8139too
rtc                    11704  0
ide_tape               34960  0
st                     38300  0
ide_cd                 39904  0
sr_mod                 17828  0
scsi_mod              115596  3 sg,st,sr_mod
cdrom                  37724  2 ide_cd,sr_mod
```

I switched my X config to

```
Option      "NvAgp" "3"
```

So cat /proc/driver/nvidia/agp/status gives

```
Status:          Enabled
Driver:          AGPGART
AGP Rate:        8x
Fast Writes:     Enabled
SBA:             Enabled
```

----------

## Riftwing

Erm... you DID plug the power cord into the card, right? If it isn't plugged in, you'll get pretty terrible performance.

----------

## OnoSendai

Yep... I did... Disconnecting it leads to very poor performance, i.e. glxgears at about 100 FPS and UT2004 at about 7 FPS...

----------

## Epyon

I was about to say I had the same problem, but with a 6800 GT, but then I remembered I was running folding@home. With that running I was maxing out at 250 FPS. When I remembered to shut it down, I was getting 12000+ FPS in glxgears.

----------

## OnoSendai

Maybe it's something in my USE flags...

I guess that's the last idea I've got, since I've already downgraded my kernel to 2.6.3 with no result...

@Epyon:

Would you mind posting your USE flags?

----------

## r3pek

Are you running seti@home or folding@home?

If you are, just stop them when you're using something that really needs all the power of the GPU. I don't know why, but somehow seti@home and folding@home slow down the GPU a LOT (probably because they eat all the CPU time)! I'm talking about a drop of 9600 FPS in glxgears (from ~9700 to ~100 with an nvidia 5950 Ultra).

----------

## OnoSendai

No... the only thing I'm running in the background is a proftpd daemon...

But the daemon is mostly idle:

```
# uptime
 16:01:40 up  2:06,  2 users,  load average: 0.00, 0.00, 0.06
```

Stopping the daemon still gives me the same results in the benchmarks...

So I guess it could be an AGP issue, or there is something wrong with my USE flags...

----------

## u2mike

You could see if anti-aliasing is on; that would slow it down. But with a 5900 you would still get good results.

I would say it has to be something kernel related. Try a different kernel, make sure you enable support for your chipset, otherwise don't screw with the default kernel settings too much, and see what happens.

----------

## RobDin

Maybe this suggestion works for you. I remember reading somewhere that using TwinView degrades the performance of both your screens (because the card has to divide its processing power between two screens). So try turning off TwinView, but leave the rest of your config as it is.

So change this:

```
Section "Device"
    Identifier  "GeForce FX-5900XT"
    Driver      "nvidia"
    VideoRam    131072
    BusID       "PCI:1:0:0"
    Option      "NvAgp" "1"
    Option      "RenderAccel" "on"
    Option      "NoLogo" "true"
    Option      "DPMS"
    Option      "TwinView"
    Option      "SecondMonitorHorizSync"     "31.5-96"
    Option      "SecondMonitorVertRefresh"   "50-160"
    Option      "MetaModes"                  "1280x1024,1280x1024; 1024x768,null; 800x600,null; 640x480,null"
    Option      "ConnectedMonitor"         "crt,crt"
EndSection
```

to something like this:

```
Section "Device"
    Identifier  "GeForce FX-5900XT"
    Driver      "nvidia"
    VideoRam    131072
    BusID       "PCI:1:0:0"
    Option      "NvAgp" "1"
    Option      "RenderAccel" "on"
    Option      "NoLogo" "true"
    Option      "DPMS"
    #Option      "TwinView"
    #Option      "SecondMonitorHorizSync"     "31.5-96"
    #Option      "SecondMonitorVertRefresh"   "50-160"
    #Option      "MetaModes"                  "1280x1024,1280x1024; 1024x768,null; 800x600,null; 640x480,null"
    #Option      "ConnectedMonitor"         "crt,crt"
EndSection
```

Maybe it helps....

Good luck

----------

## OnoSendai

@RobDin: I've already tried that, but I couldn't notice a difference...

@u2mike: I'm a dumbass... After I bought the card, I set antialiasing and stuff in /etc/env.d/99local:

```
# cat /etc/env.d/99local
CVSROOT='/home/cvs'
LANG='de_DE@euro'
__GL_FSAA_MODE='5'
__GL_DEFAULT_LOG_ANISO='3'
__GL_SYNC_TO_VBLANK='1'
LD_LIBRARY_PATH='/opt/sun-jdk-1.5.0_beta2'
JAVA_HOME='/opt/sun-jdk-1.5.0_beta2'
PAGER='/usr/bin/most'
PATH='${PATH}:./'
```

So I disabled vblank syncing, full-scene antialiasing, and anisotropic texture filtering:

```
# vi /etc/env.d/99local
__GL_FSAA_MODE='0'
__GL_DEFAULT_LOG_ANISO='0'
__GL_SYNC_TO_VBLANK='0'
```

And finally glxgears gives me:

```
# glxgears
31336 frames in 5.0 seconds = 6267.200 FPS
```
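As a sanity check, the FPS figure glxgears prints is just the frame count divided by the elapsed time:

```shell
frames=31336
seconds=5
# glxgears reports frames/seconds, here formatted to one decimal place
fps=$(awk -v f="$frames" -v s="$seconds" 'BEGIN { printf "%.1f", f/s }')
echo "$fps FPS"
```

which matches the 6267.200 FPS the tool reports.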

Now I get the same glxgears score on Mandrake and Gentoo...

BUT:

The frame rate in ut2004 didn't change.

Then I disabled AGP temporarily (Option "NvAgp" "0"), ran the benchmark again, and got a result of 36 FPS instead of the usual 38 FPS! Enabled AGP again... 38 FPS... I expected a much bigger performance boost from AGP!?

Is that realistic?

----------

## bienchen

OK, this is just a tip on how I got my card working well:

Just forget about any experimental things: no unmasked Gentoo ebuilds, no "newest" nvidia driver. Just go with the driver that's in the current stable portage tree. Install it, change your XFree86.conf to use nvidia instead of nv, and restart the X server...

I tried the newest nvidia driver with the result of no real 3D experience... now I'm running one of the old ones and it works well.

----------

## hardcore

First off, don't use glxgears as a benchmarking tool; use it to see if you have GLX working. Second, emerge nvidia-settings, go to the antialiasing settings, and choose "override application settings" for both antialiasing and anisotropic filtering. Now try your benchmarks.

----------

## OgRo

I have an XP2.8, an Asus A7V600-X motherboard, and the same video adapter, and I'm getting the same results.

I don't know (yet) if the cable is connected, but I'm facing a weird problem.

I downgraded my nvidia-* packages with this:

```
emerge unmerge nvidia-kernel nvidia-glx
ACCEPT_KEYWORDS="~x86" emerge --ask =media-video/nvidia-kernel-1.0.6106-r1 =media-video/nvidia-glx-1.0.6106-r3
```

Then, when I try to load the nvidia module (even after a reboot) I get:

> FATAL: Error inserting nvidia (/lib/modules/2.6.9-gentoo-r1/video/nvidia.ko): Unknown symbol in module, or unknown parameter (see dmesg)

and my log has:

> Oct 30 01:46:25 angel nvidia: Unknown symbol __VMALLOC_RESERVE

Also I've found that my messages log has lots of lines like:

> Sep  9 23:37:17 angel nvidia: module license 'NVIDIA' taints kernel.
> 
> Sep  9 23:37:17 angel NVRM: loading NVIDIA Linux x86 NVIDIA Kernel Module  1.0-6111  Tue Jul 27 07:55:38 PDT 2004

Note that my clock is set by NTP and is correct.

any tip?
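That "Unknown symbol" error usually means the nvidia module was built against different kernel sources than the kernel you actually booted. A quick consistency check, assuming the standard module layout the nvidia-kernel ebuild uses, before re-emerging it with /usr/src/linux pointed at the running kernel's sources:

```shell
# The running kernel's version string
running=$(uname -r)
echo "Running kernel: $running"

# nvidia-kernel installs into /lib/modules/<version>/video/; if that
# directory is missing for the running kernel, the module was built
# for some other kernel
if [ -d "/lib/modules/$running/video" ]; then
    echo "nvidia module directory matches the running kernel"
else
    echo "no module dir for $running - rebuild nvidia-kernel against it"
fi
```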

UPDATE:

Sorry, my video card is an FX5200 and I just found out it's crap. :-/

But still, glxgears gives me (when maximized):

> 833 frames in 5.0 seconds = 166.600 FPS

Also, what tool should I use for benchmarking?

----------

## hardcore

 *OgRo wrote:*   

> I have a XP2.8, asus A7V600-X MB and same video adaptor and I'm having the same results.
> 
> I don't know (yet) if the cable is conected, but I facing a weird problem.
> 
> I downgraded my nvidia-* with this:
> ...

An actual game like Enemy Territory or Unreal Tournament 2004.

----------

## MaDDeePee

I can CONFIRM that 1.0-6111 slows down my Gentoo box critically (in games like ET); the older driver works perfectly!

Funny, isn't it?

----------

## racoontje

The exact version of the older driver + changelog please!

----------

## srpape

Hi,

I had this same problem with the same card. I recompiled my kernel and re-emerged the nvidia driver, and it started working fine.

Maybe it was something to do with my upgrading to GCC 3.4 while the kernel was still compiled with 3.3? I believe I updated the driver afterwards, so it would have been built with 3.4 with the kernel still on 3.3.
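A mismatch like that is easy to spot: the compiler the running kernel was built with is recorded in /proc/version, so you can compare it against the gcc currently in PATH (the 2>/dev/null guard is only there in case /proc isn't available):

```shell
# Compiler recorded at kernel build time
kernel_build=$(cat /proc/version 2>/dev/null || echo "unavailable")
echo "$kernel_build"

# Compiler that would build modules now
command -v gcc >/dev/null && gcc --version | head -n 1
```

If the gcc versions differ, rebuild the kernel and the nvidia module with the same compiler.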

Good luck.

----------

