
GPU Validation layer loads 16 bit variable with a 32 bit load #2234

@Pursche

Description


Hi, I have a Vulkan-based project using multidraw indirect, dynamic array indexing, 16-bit int types, and GPU culling. After updating to Vulkan SDK 1.2.154.1, the GPU-assisted validation layer seems to patch my shader incorrectly.

Here is the HLSL shader in question: https://gist.github.com/Pursche/2efb1fc2a0e74bf20f404e124e68d93a

If I run this with standard validation, everything renders correctly. If I run it with GPU-assisted validation, some of my models look completely black instead, and I get a bunch of these validation errors in the console:

[Error]: Validation layer: Validation Error: [ UNASSIGNED-Descriptor uninitialized ] Object 0: handle = 0x2075c48a3a8, type = VK_OBJECT_TYPE_QUEUE; | MessageID = 0x893513c7 | Descriptor index 89130315 is uninitialized. Command buffer (0x207822a1988). Draw Index 0x1. Pipeline (0x20781907b98). Shader Module (0x20781bab228). Shader Instruction Index = 461. Stage = Fragment. Fragment coord (x,y) = (424.5, 752.5). Shader validation error occurred in file mapObject.ps.hlsl at line 90, column 19.

90: float4 tex0 = _textures[material.textureIDs[0]].Sample(_sampler, input.uv01.xy);

I opened this in RenderDoc: while taking the capture the models are still black and the errors still print, but when replaying the capture everything looks correct, and all my data leading up to the array indexing looks correct as well.

I noticed that the error always mentions an index around 89130315, so I ran it through a decimal-to-binary tool and got this:
00000101010100000000010101001011

Splitting this 32-bit value into two 16-bit values and converting them back to decimal gives these perfectly valid indices:
1360
1355

This made me suspect that the patched shader was interpreting my uint16_t as a 32-bit integer while indexing into the array, so I tried this workaround in the shader, which completely fixes the issue even with GPU-assisted validation enabled:

uint textureID0 = material.textureIDs[0];
uint textureID1 = material.textureIDs[1];
float4 tex0 = _textures[textureID0].Sample(_sampler, input.uv01.xy);
float4 tex1 = _textures[textureID1].Sample(_sampler, input.uv01.zw);

I believe the validation layer patches the shader incorrectly and tries to index into the array with a 32-bit integer instead of the individual 16-bit values.
I have attached the compiled SPIR-V shaders that exhibit this issue.

shaders.zip
