[ut2003io] CPU Performance (was: Color depth)

Daniel Borgmann daniel at liebesgedichte.net
Wed Sep 18 17:10:58 EDT 2002


Thank you all. :)
Unfortunately, 16 bit only increased the benchmark result by 0.02 FPS.
Today I also found out that after a lot of tweakage I get only 2 FPS
more than my father. This is kinda sad, because his system is a 1000
MHz Athlon with a GeForce 2 PRO, while mine is a 1000 MHz P3 (clocked
at 1080) with a GeForce 4 Ti4600 that I spent 500 bucks on. :/
Are there any plans to add more performance options for "low" CPUs in
the final game? Things like reduced-polycount models, fewer effects, a
low-quality HUD, etc. It's a little sad to spend 500 bucks for just a
few FPS (my father got my old GPU). I would love to buy and play this
game, but right now the low frame rate makes it a far less enjoyable
experience (for example, I play without sound because that helps a
lot).
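
In case it helps anyone else, most of my tweakage so far has been
turning down the texture detail settings in UT2003.ini. The section
and key names below are from my install of the Linux demo, so treat
them as a sketch rather than gospel:

  [SDLDrv.SDLClient]
  TextureDetailWorld=Lower
  TextureDetailPlayerSkin=Lower
  TextureDetailWeaponSkin=Lower
  TextureDetailTerrain=Lower

As far as I can tell these mostly save texture memory rather than CPU,
which would explain why they barely move my benchmark numbers.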

Daniel
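
PS: For anyone who wants to repeat the 16 bit test: with XFree86 4.x
you can start a one-off server at 16 bpp with

  startx -- -depth 16

or make it permanent by setting DefaultDepth 16 in the Screen section
of XF86Config-4 and restarting X. (File name is from my setup; yours
may live elsewhere.)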


On Tue, 2002-09-17 at 01:24, Ryan C. Gordon wrote:
> > Now I'm wondering, does the setting in the game actually do anything or
> > do I really have to start the X server in 16 bit mode to get a lower
> > color depth?
> 
> It's not hooked up for Linux, since XFree can't switch color depths on 
> the fly. Some X servers can (although I don't know if any X server for 
> Linux can), but I'm not willing to hook it up and field a million bug 
> reports from XFree users. Restart X in 16-bit mode if you want to try.
> 
> For Direct3D on Windows, you can switch color depths from your desktop's 
> default if you are running fullscreen. On Linux, we just use whatever 
> color depth your X server is running at, period.
> 
> > On a related note (but off topic), does anyone know how to enable vsync
> > with XFree and the Nvidia driver? I couldn't find any info about this on
> > the net (searching for vsync and xfree isn't very productive :/).
> 
> It's in the README that comes with Nvidia's GLX implementation.
> 
> export __GL_SYNC_TO_VBLANK=1
> 
> (I think)
> 
> --ryan.
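
Setting that variable per launch should work too, if you don't want it
exported for your whole session:

  __GL_SYNC_TO_VBLANK=1 ./ut2003_demo

(ut2003_demo is just a guess at the launcher's name; use whatever your
install is actually called.)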