[ut2004] ./ut2004-bin: error while loading shared libraries: libstdc++.so.5
spike at spykes.net
Mon Jun 20 19:34:34 EDT 2005
Yeah, I realized ATI's drivers expose 6 TMUs where nVidia's expose
only 4. I find it sucky too, and I even asked Christian Zander to change
this, but he says it's the developers' problem. Yes, it's silly, considering
my card has 16 whole TMUs. nVidia's FAQ says to use
ARB_fragment_program to get at the extra TMUs. I personally think
nVidia's drivers should expose 6 TMUs instead. Perhaps I should bug
Zander some more and forward this thread to him.
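For reference, the workaround nVidia suggests looks roughly like this: an ARB_fragment_program that samples six texture units, which the fixed-function path on their Linux drivers won't allow. This is only a hypothetical sketch (how UT2004 actually combines its stages is a guess; the point is just that texture[4] and texture[5] are reachable), and it assumes the hardware exposes at least six texture image units and six texture coordinate sets:

```
!!ARBfp1.0
# Modulate six textures together -- more stages than the
# 4-unit fixed-function limit permits.
TEMP acc, tex;
TEX acc, fragment.texcoord[0], texture[0], 2D;
TEX tex, fragment.texcoord[1], texture[1], 2D;
MUL acc, acc, tex;
TEX tex, fragment.texcoord[2], texture[2], 2D;
MUL acc, acc, tex;
TEX tex, fragment.texcoord[3], texture[3], 2D;
MUL acc, acc, tex;
TEX tex, fragment.texcoord[4], texture[4], 2D;
MUL acc, acc, tex;
TEX tex, fragment.texcoord[5], texture[5], 2D;
MUL acc, acc, tex;
MOV result.color, acc;
END
```

Of course, as Ryan notes below, wiring this into an existing fixed-function renderer means rewriting the whole texture-stage setup, which is exactly the objection.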
On Mon, 20 Jun 2005 19:24:16 -0400
"Ryan C. Gordon" <icculus at clutteredmind.org> wrote:
> > Do you have any idea about the power core rendering issues with nVidia
> > hardware? If that were fixed in this new binary, then there wouldn't be
> > any differences between the Linux and Windows versions of the game, and
> > the book could be closed on any graphical issues.
> Nvidia's drivers only expose 4 texture units in the fixed-function
> pipeline, which is silly, and prevents us from rendering the power cores
> correctly. You'll note that ATI's drivers on Linux do not have this
> problem. No one can give me a solid answer as to why Nvidia's drivers do
> this, since the hardware has more than 4 TMUs, works fine in the
> Direct3D fixed-function pipeline, and works fine in OpenGL with the
> ARB_fragment_program extension.
> Nvidia's answer to this was "rewrite your renderer to use pixel shaders
> and you can have more than 4 TMUs". That is obviously not going to happen.
> Granted, this is a problem that goes away by default for UnrealEngine3,
> but it's pretty annoying for UT2004.
> Sorry this isn't the best answer, but really, it's out of my hands.