author     Ryan Neph <ryanneph@google.com>  2024-05-15 10:50:21 -0700
committer  Ryan Neph <ryanneph@google.com>  2024-05-15 12:00:08 -0700
commit     328e5f668c2b497225f3b59883fc923be8527272
tree       8eb7f4cd8112a66639c7ae625b135d5e9a06131a
parent     83e5309267653a2c606c32518dd92a5b9d5b3eb8
vrend: fix incorrect limit sent for max samplers
The original support for this driver cap ("max_texture_image_units")
conflated the limit for GLSL texture "samplers" with the limit for
bindable texture "image units".

As a result, the guest driver has ALWAYS incorrectly used the advertised
value to inform Gallium's internal query for
`PIPE_SHADER_CAP_MAX_TEXTURE_SAMPLERS` rather than the semantically
intended behavior of informing a client query for
`GL_MAX_TEXTURE_IMAGE_UNITS`.
This leads to a crash during driver initialization: the compiler's
stack protection triggers on a stack overflow in
virgl_bind_shader_states() on devices whose native driver query for
GL_MAX_TEXTURE_IMAGE_UNITS exceeds Mesa's PIPE_MAX_SAMPLERS (32).
Mali devices are one example (they commonly return 64 or 128).
THE FIX:
Virglrenderer should rename this cap to capture its actual semantic
meaning of "max_texture_samplers", matching the way the driver has
ALWAYS used it. Then virglrenderer will (with this and later versions)
always send PIPE_MAX_SAMPLERS (32) to match that meaning.

Thus this cap becomes truly useless (it effectively already was, but
now the naming and value at least make sense), while the crash is
fixed for all versions of the guest driver.
Later, the ACTUAL cap will be implemented as a new one, as originally
intended, gated by a protocol revision bump.
Fixes: d755dfa28c ("vrend: Sync some constants with mesa and pass max_shader_samplers")
Signed-off-by: Ryan Neph <ryanneph@google.com>
Part-of: <https://gitlab.freedesktop.org/virgl/virglrenderer/-/merge_requests/1368>
 src/virgl_hw.h       |  5 ++++-
 src/vrend_renderer.c | 11 +++++++++--
 2 files changed, 13 insertions(+), 3 deletions(-)
diff --git a/src/virgl_hw.h b/src/virgl_hw.h
index 5006639c..613e8fb2 100644
--- a/src/virgl_hw.h
+++ b/src/virgl_hw.h
@@ -763,7 +763,10 @@ struct virgl_caps_v2 {
    uint32_t max_video_memory;
    char renderer[64];
    float max_anisotropy;
-   uint32_t max_texture_image_units;
+   // NOTE: this informs guest-side PIPE_SHADER_CAP_MAX_TEXTURE_SAMPLERS,
+   // **NOT** GL_MAX_TEXTURE_IMAGE_UNITS!!!
+   // Guest-side driver has always used it as such.
+   uint32_t max_texture_samplers;
    struct virgl_supported_format_mask supported_multisample_formats;
    uint32_t max_const_buffer_size[6]; // PIPE_SHADER_TYPES
    uint32_t num_video_caps;
diff --git a/src/vrend_renderer.c b/src/vrend_renderer.c
index 7d2497a3..798ed821 100644
--- a/src/vrend_renderer.c
+++ b/src/vrend_renderer.c
@@ -12665,8 +12665,15 @@ static void vrend_renderer_fill_caps_v2(int gl_ver, int gles_ver, union virgl_c
       caps->v2.max_anisotropy = MIN2(max_aniso, 16.0);
    }
 
-   glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &max);
-   caps->v2.max_texture_image_units = MIN2(max, PIPE_MAX_SHADER_SAMPLER_VIEWS);
+   /* BUGFIX: original support for this cap was broken. It was _supposed_ to
+    * inform a guest-side client's query for GL_MAX_TEXTURE_IMAGE_UNITS, but
+    * only ever informed gallium's internal query for
+    * PIPE_SHADER_CAP_MAX_TEXTURE_SAMPLERS. This caused crashes on Mali.
+    *
+    * Now we send the "correct" value, to ensure good behavior from old guest
+    * drivers, but it is not "useful" for any other purpose.
+    */
+   caps->v2.max_texture_samplers = PIPE_MAX_SAMPLERS;
 
    if (has_feature(feat_ubo)) {
       glGetIntegerv(GL_MAX_UNIFORM_BLOCK_SIZE, &max);