While it's a nice idea to allow clients to choose a smaller (or bigger,
for 16bpp screens!) depth size, right now DRI2 hands back a depth/stencil
buffer whose cpp matches the drawable's, rather than being based on the
visual. This led to problems in readback, as parts of the driver disagreed
about what format the depth buffer was really in.
Fixes the remainder of bug #19447.
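
A minimal standalone sketch of the resulting config selection (the GL enum
values and the main() wrapper are illustration only, not driver code; the
real code below feeds these arrays to driCreateConfigs):

    #include <stdio.h>

    #define GL_UNSIGNED_SHORT_5_6_5      0x8363
    #define GL_UNSIGNED_INT_8_8_8_8_REV  0x8367

    int main(void)
    {
        unsigned fb_type[] = { GL_UNSIGNED_SHORT_5_6_5,
                               GL_UNSIGNED_INT_8_8_8_8_REV };
        int depth_bits[3], stencil_bits[3], depth_factor;

        for (unsigned color = 0; color < 2; color++) {
            depth_bits[0] = 0;
            stencil_bits[0] = 0;

            /* Depth cpp must match color cpp, since DRI2 GetBuffers
             * sizes the depth/stencil buffer from the drawable.
             */
            if (fb_type[color] == GL_UNSIGNED_SHORT_5_6_5) {
                depth_bits[1] = 16;    /* 2 bytes, matches 16bpp color */
                stencil_bits[1] = 0;
                depth_factor = 2;
            } else {
                depth_bits[1] = 24;    /* 24/8 packs in 4 bytes, matches 32bpp */
                stencil_bits[1] = 0;
                depth_bits[2] = 24;
                stencil_bits[2] = 8;
                depth_factor = 3;
            }

            for (int i = 0; i < depth_factor; i++)
                printf("type 0x%04x: depth %d, stencil %d\n",
                       fb_type[color], depth_bits[i], stencil_bits[i]);
        }
        return 0;
    }
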
fb_format[2] = GL_BGRA;
fb_type[2] = GL_UNSIGNED_INT_8_8_8_8_REV;
+ depth_bits[0] = 0;
+ stencil_bits[0] = 0;
+
for (color = 0; color < ARRAY_SIZE(fb_format); color++) {
__DRIconfig **new_configs;
+ int depth_factor;
+
+ /* With DRI2 right now, GetBuffers always returns a depth/stencil buffer
+ * with the same cpp as the drawable. So we can't support depth cpp !=
+ * color cpp currently.
+ */
+ if (fb_type[color] == GL_UNSIGNED_SHORT_5_6_5) {
+ depth_bits[1] = 16;
+ stencil_bits[1] = 0;
+
+ depth_factor = 2;
+ } else {
+ depth_bits[1] = 24;
+ stencil_bits[1] = 0;
+ depth_bits[2] = 24;
+ stencil_bits[2] = 8;
+ depth_factor = 3;
+ }
new_configs = driCreateConfigs(fb_format[color], fb_type[color],
depth_bits,
stencil_bits,
- ARRAY_SIZE(depth_bits),
+ depth_factor,
back_buffer_modes,
ARRAY_SIZE(back_buffer_modes),
msaa_samples_array,