glx: Do not advertise buffer_age on dri2
authorAdel Gadllah <adel.gadllah@gmail.com>
Sun, 30 Mar 2014 21:36:25 +0000 (23:36 +0200)
committerDave Airlie <airlied@redhat.com>
Wed, 2 Apr 2014 20:28:26 +0000 (21:28 +0100)
Previously, GLX_EXT_buffer_age was always advertised as supported because
both client_glx_support and client_glx_only were set for it. As a result it
did not matter that direct_support is only set when running dri3, and we
ended up advertising the extension unconditionally, even on dri2.

Fix this by not setting the client_glx_only flag for EXT_buffer_age in the
known_glx_extensions table.
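
For context, a simplified sketch of how the usable-extension mask is
combined in src/glx/glxextensions.c. This is condensed from the
direct-rendering-capable path of __glXCalculateUsableExtensions(): the real
code loops over byte arrays of extension bits and also handles the indirect
and direct_only cases, which are omitted here.

static unsigned char
usable_extensions(unsigned char client_glx_support,
                  unsigned char client_glx_only,
                  unsigned char direct_support,
                  unsigned char server_support)
{
   /* First term: a client_glx_only extension is advertised purely on
    * client-side support, ignoring the server and direct rendering.
    * Second term: everything else also needs direct_support (enabled
    * at runtime, e.g. by dri3 via __glXEnableDirectExtension()) and
    * the server to advertise it. */
   return (client_glx_support & client_glx_only)
        | (client_glx_support & direct_support & server_support);
}

With client_glx_only set, EXT_buffer_age was caught by the first term and
always advertised; with it cleared, only the second term applies, so the
extension is advertised just when direct_support has been enabled.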

Signed-off-by: Adel Gadllah <adel.gadllah@gmail.com>
Reviewed-by: Ian Romanick <ian.d.romanick@intel.com>
Signed-off-by: Dave Airlie <airlied@redhat.com>
src/glx/glxextensions.c

index ac1b4a7b02ee249eb0447b5e4531d95e76dff8ad..ce5d66d4f079a5befd3c8c0e3821c958a1998dd2 100644 (file)
@@ -103,7 +103,7 @@ static const struct extension_info known_glx_extensions[] = {
    { GLX(SGIX_visual_select_group),    VER(0,0), Y, Y, N, N },
    { GLX(EXT_texture_from_pixmap),     VER(0,0), Y, N, N, N },
    { GLX(INTEL_swap_event),            VER(0,0), Y, N, N, N },
-   { GLX(EXT_buffer_age),              VER(0,0), Y, N, Y, Y },
+   { GLX(EXT_buffer_age),              VER(0,0), Y, N, N, Y },
    { NULL }
 };