[quake3-bugzilla] [Bug 4244] New: When r_colorbits == 32, GLimp_SetMode() uses wrong bits
bugzilla-daemon at icculus.org
Wed Jul 15 05:01:59 EDT 2009
http://bugzilla.icculus.org/show_bug.cgi?id=4244
Summary: When r_colorbits == 32, GLimp_SetMode() uses wrong bits
Product: ioquake3
Version: SVN HEAD
Platform: All
OS/Version: IRIX
Status: NEW
Severity: normal
Priority: P3
Component: Platform
AssignedTo: zakk at icculus.org
ReportedBy: baggett.patrick at gmail.com
QAContact: quake3-bugzilla at icculus.org
Noticed on an SGI Octane with MXE graphics running IRIX 6.5.28.
When using the in-game menus to adjust graphics settings, the value of
"r_colorbits" is set to either 16 or 32.
However, in code/sdl_glimp.c:GLimp_SetMode(), this snippet appears:
    if (!r_colorbits->value)
        colorbits = 24;
    else
        colorbits = r_colorbits->value;
Shortly after, the for() loop that follows only handles colorbits values of
16 or 24. On IRIX, this results in 32-bit color falling into the "not
24-bit color" case, i.e. 16-bit color. Simply changing the above fragment to:
    if (!r_colorbits->value)
        colorbits = 24;
    else {
        colorbits = r_colorbits->value;
        if (colorbits == 32)
            colorbits = 24;
    }
...fixes the problem.
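For reference, here is a minimal standalone sketch of the behavior described
above. This is simplified and not the exact ioquake3 code; the function name
channel_bits, the clamp32 flag, and the test harness are purely illustrative:

    #include <stdio.h>

    /* Simplified model of the selection the report describes: the fallback
     * branch only distinguishes 24 from "everything else", so a value of 32
     * ends up on the 16-bit (4 bits per channel) path unless it is clamped
     * to 24 first. */
    static int channel_bits(int colorbits, int clamp32)
    {
        if (clamp32 && colorbits == 32)
            colorbits = 24;       /* the proposed fix */

        if (colorbits == 24)
            return 8;             /* 8 bits per channel -> 24-bit color */
        else
            return 4;             /* 4 bits per channel -> 16-bit color */
    }

    int main(void)
    {
        printf("r_colorbits 32, unpatched: %d bits per channel\n",
               channel_bits(32, 0));   /* prints 4 -> 16-bit color */
        printf("r_colorbits 32, patched:   %d bits per channel\n",
               channel_bits(32, 1));   /* prints 8 -> 24-bit color */
        return 0;
    }

As I understand it, the per-channel value is what eventually gets handed to
the GL context setup, which is why the visible symptom of requesting 32-bit
color is a 16-bit framebuffer.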