draw: initialize shader inputs
author Roland Scheidegger <sroland@vmware.com>
Tue, 11 Oct 2016 22:00:28 +0000 (00:00 +0200)
committer Roland Scheidegger <sroland@vmware.com>
Wed, 12 Oct 2016 13:05:44 +0000 (15:05 +0200)
This should make the code more robust if a shader tries to use inputs which
aren't defined by the vertex element layout (which usually shouldn't happen).

No piglit change.

Reviewed-by: Brian Paul <brianp@vmware.com>
src/gallium/auxiliary/draw/draw_llvm.c

index 87951fa165da9749f233d2e0dc6c918520a074a0..4270a8fb7a82febc47a742bedf0be3b2905be7e3 100644
@@ -1705,6 +1705,13 @@ draw_llvm_generate(struct draw_llvm *llvm, struct draw_llvm_variant *variant,
       lp_build_printf(gallivm, " --- io %d = %p, loop counter %d\n",
                       io_itr, io, lp_loop.counter);
 #endif
+
+      for (j = draw->pt.nr_vertex_elements; j < PIPE_MAX_SHADER_INPUTS; j++) {
+         for (i = 0; i < TGSI_NUM_CHANNELS; i++) {
+            inputs[j][i] = lp_build_zero(gallivm, vs_type);
+         }
+      }
+
       for (i = 0; i < vector_length; ++i) {
          LLVMValueRef vert_index =
             LLVMBuildAdd(builder,
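
The added loop zero-fills every input slot past the last vertex element before the per-vertex fetch loop runs, so a shader that reads an input with no corresponding vertex element sees zeros rather than whatever happened to be in those slots. The standalone sketch below shows the same idea in plain C for illustration; nr_vertex_elements, inputs, PIPE_MAX_SHADER_INPUTS and TGSI_NUM_CHANNELS mirror the names in the diff, but the scalar float channels and the constant value chosen for PIPE_MAX_SHADER_INPUTS are simplifications, not the gallivm vector types the real code builds with lp_build_zero().

    /* Standalone sketch of the zero-fill guard above. The real code emits
     * LLVM IR vectors via lp_build_zero(gallivm, vs_type); plain floats
     * stand in for those vectors here (a simplification for illustration). */
    #include <stdio.h>

    #define PIPE_MAX_SHADER_INPUTS 32   /* value assumed for illustration */
    #define TGSI_NUM_CHANNELS 4

    static float inputs[PIPE_MAX_SHADER_INPUTS][TGSI_NUM_CHANNELS];

    static void
    init_unused_inputs(unsigned nr_vertex_elements)
    {
       unsigned i, j;
       /* Inputs at or beyond nr_vertex_elements have no vertex element
        * feeding them; give them a defined value (zero) so a shader that
        * reads them anyway does not pick up garbage. */
       for (j = nr_vertex_elements; j < PIPE_MAX_SHADER_INPUTS; j++) {
          for (i = 0; i < TGSI_NUM_CHANNELS; i++) {
             inputs[j][i] = 0.0f;
          }
       }
    }

    int
    main(void)
    {
       init_unused_inputs(2);
       printf("input 5, channel 0 = %f\n", inputs[5][0]); /* prints 0.000000 */
       return 0;
    }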