Instead of catching the special case early, handle it by constructing a
fake gl_texture_image that will cause the values required by the OpenGL
4.0 spec to be returned.
Previously, calling

    glGenTextures(1, &t);
    glBindTexture(GL_TEXTURE_2D, t);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, 0xDEADBEEF, &value);

would not generate an error.
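With the dummy image in place, the same sequence should instead fall
through to the normal pname validation and generate GL_INVALID_ENUM,
e.g. (illustrative only, not part of the patch):

    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, 0xDEADBEEF, &value);
    assert(glGetError() == GL_INVALID_ENUM);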
Anuj: Can you verify this does not regress proxy_textures_invalid_size?
Signed-off-by: Ian Romanick <ian.d.romanick@intel.com>
Reviewed-by: Brian Paul <brianp@vmware.com>
Suggested-by: Brian Paul <brianp@vmware.com>
Cc: "10.2" <mesa-stable@lists.freedesktop.org>
Cc: Anuj Phogat <anuj.phogat@gmail.com>
                              GLenum pname, GLint *params)
 {
    const struct gl_texture_image *img = NULL;
+   struct gl_texture_image dummy_image;
    mesa_format texFormat;
 
    img = _mesa_select_tex_image(ctx, texObj, target, level);
    if (!img || img->TexFormat == MESA_FORMAT_NONE) {
       /* In case of undefined texture image return the default values.
        *
        * From OpenGL 4.0 spec, page 398:
        *    "The initial internal format of a texel array is RGBA
        *     instead of 1. TEXTURE_COMPONENTS is deprecated; always
        *     use TEXTURE_INTERNAL_FORMAT."
        */
-      if (pname == GL_TEXTURE_INTERNAL_FORMAT)
-         *params = GL_RGBA;
-      else
-         *params = 0;
-      return;
+      memset(&dummy_image, 0, sizeof(dummy_image));
+      dummy_image.TexFormat = MESA_FORMAT_NONE;
+      dummy_image.InternalFormat = GL_RGBA;
+      dummy_image._BaseFormat = GL_NONE;
+
+      img = &dummy_image;
    }
 
    texFormat = img->TexFormat;
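
For reference, a minimal sketch of how the dummy image is expected to
behave (hypothetical test code, not part of the patch; the function
name is made up, and it assumes a current GL context):

    #include <GL/gl.h>
    #include <assert.h>

    static void
    check_missing_level_defaults(void)
    {
       GLuint t;
       GLint value = -1;

       glGenTextures(1, &t);
       glBindTexture(GL_TEXTURE_2D, t);

       /* No image was ever specified for level 0, so the queries hit
        * the dummy image: GL_RGBA for GL_TEXTURE_INTERNAL_FORMAT per
        * the spec default, zeroes (via the memset) for size queries. */
       glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                GL_TEXTURE_INTERNAL_FORMAT, &value);
       assert(value == GL_RGBA);

       glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &value);
       assert(value == 0);

       glDeleteTextures(1, &t);
    }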