[Gtkradiant] [Bug 846] depth testing

gtkradiant@zerowing.idsoftware.com gtkradiant@zerowing.idsoftware.com
Sun, 27 Jul 2003 18:38:20 -0500


http://zerowing.idsoftware.com/bugzilla/show_bug.cgi?id=846





------- Additional Comments From rfm@collectivecomputing.com  2003-07-27 18:38 -------
If you use ChoosePixelFormat to select your pixel format and ask for a 32-bit
depth buffer, some implementations will still give you 16 even when they could
give you 24. There was a discussion of this on the opengl-gamedev-l list a
while back, here:
http://groups.yahoo.com/group/opengl-gamedev-l/messages/21122?threaded=1
This message is particularly informative:
http://groups.yahoo.com/group/opengl-gamedev-l/message/21123?threaded=1

Nothing to do with the stencil buffer or lack thereof.
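
For illustration, here is a minimal sketch of the ChoosePixelFormat path being
described, assuming an HDC for the target window already exists; the function
name, the return convention and the flag choices are just placeholders, not
Radiant's actual setup code:

#include <windows.h>

/* Ask for 32 depth bits, then check what the driver actually handed back. */
static int setup_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 32;   /* "give me 32" -- some drivers answer 16, not 24 */

    /* ChoosePixelFormat picks its own idea of the "closest" match,
       which is exactly where the 16-vs-24 surprise comes from. */
    format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0 || !SetPixelFormat(hdc, format, &pfd))
        return 0;

    /* See what we actually got. */
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    return pfd.cDepthBits;   /* may be 16 even on a card that can do 24 */
}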

Specifying exactly 24 would get around this particular case, but it would
pointlessly reduce precision for people with workstation-class cards, and some
other card that only offered 32 and 16 might still do the wrong thing. I
suppose you could add a pref for 'requested depth precision' :/

If you care exactly what you get, you should not be using ChoosePixelFormat at
all; instead, enumerate all the pfds yourself and pick the one that fits your
needs. I don't know whether the newer nvidia cards can give you a 32-bit
z-buffer; the GF2-class ones certainly could not.
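
A rough sketch of that enumeration approach, again assuming an HDC is already
available; the function name and the "deepest accelerated depth buffer wins"
rule are only illustrative, the point is simply to inspect every format rather
than trust ChoosePixelFormat's matching:

#include <windows.h>

/* Walk every pixel format the driver exposes and keep the hardware-
   accelerated RGBA one with the deepest depth buffer. */
static int pick_deepest_depth_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i, count, best = 0, best_depth = 0;

    /* With a NULL descriptor pointer, DescribePixelFormat just reports
       how many pixel formats the DC supports. */
    count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);

    for (i = 1; i <= count; i++) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        if (!(pfd.dwFlags & PFD_DRAW_TO_WINDOW) ||
            !(pfd.dwFlags & PFD_SUPPORT_OPENGL) ||
            !(pfd.dwFlags & PFD_DOUBLEBUFFER)   ||
            pfd.iPixelType != PFD_TYPE_RGBA)
            continue;

        /* Skip software-only (GDI generic, unaccelerated) formats. */
        if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
            !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
            continue;

        if (pfd.cDepthBits > best_depth) {
            best_depth = pfd.cDepthBits;
            best = i;   /* pixel format indices are 1-based */
        }
    }
    return best;   /* 0 if nothing suitable was found */
}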