
Add a texture space shading example #23741

Open

HackerFoo wants to merge 2 commits into bevyengine:main from HackerFoo:texture-space-shading-example

Conversation

@HackerFoo (Contributor)

Objective

This adds an example of how to render textures at runtime, which is called "texture space shading."

This should provide a starting point for anyone who wants to use this technique.

Solution

This was adapted from the custom_post_processing example.

Testing

  • This was tested on MacOS.

Showcase

[Screenshot: the example running, 2026-04-09]

using: `cargo run -p build-templated-pages -- update examples`
@HackerFoo force-pushed the texture-space-shading-example branch from 5bbaadb to f9f4202 on April 9, 2026 at 22:51
@kfc35 added labels on Apr 10, 2026: A-Rendering (Drawing game state to the screen), C-Examples (An addition or correction to our examples), S-Needs-Review (Needs reviewer attention (from anyone!) to move forward)
@github-project-automation (bot) moved this to "Needs SME Triage" in Rendering on Apr 10, 2026
@mate-h (Contributor) left a comment:

Nice example. I think this is a super common use case and should be included.

Vec2::new(row as f32 / 3., col as f32 / 3.)
};
let mut cube_mesh = Mesh::from(Cuboid::default());
// Rewrite UVs so they don't overlap

Does the default cube mesh produce overlapping UVs? It seems like a bug is being patched in the example.

Or is this expected behavior for the cube primitive?

@HackerFoo (Contributor, Author), Apr 10, 2026:

Yeah, it maps the entire UV space to each face. This could be fine depending on what you're doing (say, a Minecraft-style grass block), though even then you'd probably want the top and bottom to differ from the sides. For general use, it's better if no texture space is shared between surfaces.

@HackerFoo (Contributor, Author):

let vertices = &[
// Front
([min.x, min.y, max.z], [0.0, 0.0, 1.0], [0.0, 0.0]),
([max.x, min.y, max.z], [0.0, 0.0, 1.0], [1.0, 0.0]),
([max.x, max.y, max.z], [0.0, 0.0, 1.0], [1.0, 1.0]),
([min.x, max.y, max.z], [0.0, 0.0, 1.0], [0.0, 1.0]),
// Back
([min.x, max.y, min.z], [0.0, 0.0, -1.0], [1.0, 0.0]),
([max.x, max.y, min.z], [0.0, 0.0, -1.0], [0.0, 0.0]),
([max.x, min.y, min.z], [0.0, 0.0, -1.0], [0.0, 1.0]),
([min.x, min.y, min.z], [0.0, 0.0, -1.0], [1.0, 1.0]),
// Right
([max.x, min.y, min.z], [1.0, 0.0, 0.0], [0.0, 0.0]),
([max.x, max.y, min.z], [1.0, 0.0, 0.0], [1.0, 0.0]),
([max.x, max.y, max.z], [1.0, 0.0, 0.0], [1.0, 1.0]),
([max.x, min.y, max.z], [1.0, 0.0, 0.0], [0.0, 1.0]),
// Left
([min.x, min.y, max.z], [-1.0, 0.0, 0.0], [1.0, 0.0]),
([min.x, max.y, max.z], [-1.0, 0.0, 0.0], [0.0, 0.0]),
([min.x, max.y, min.z], [-1.0, 0.0, 0.0], [0.0, 1.0]),
([min.x, min.y, min.z], [-1.0, 0.0, 0.0], [1.0, 1.0]),
// Top
([max.x, max.y, min.z], [0.0, 1.0, 0.0], [1.0, 0.0]),
([min.x, max.y, min.z], [0.0, 1.0, 0.0], [0.0, 0.0]),
([min.x, max.y, max.z], [0.0, 1.0, 0.0], [0.0, 1.0]),
([max.x, max.y, max.z], [0.0, 1.0, 0.0], [1.0, 1.0]),
// Bottom
([max.x, min.y, max.z], [0.0, -1.0, 0.0], [0.0, 0.0]),
([min.x, min.y, max.z], [0.0, -1.0, 0.0], [1.0, 0.0]),
([min.x, min.y, min.z], [0.0, -1.0, 0.0], [1.0, 1.0]),
([max.x, min.y, min.z], [0.0, -1.0, 0.0], [0.0, 1.0]),
];

@eswartz (Contributor) left a comment:

> This adds an example of how to render textures at runtime, which is called "texture space shading."

Is it really? 😕 I think "texture space shading" is a misleading name for this example, based on what the literature (read: top Google searches ;) says.

AFAICT that concept refers to a much more involved full-scene and inter-frame rendering pipeline (see from 2016 http://diglib.eg.org/bitstream/handle/10.2312/egsh20161018/073-076.pdf or from 2025 https://arxiv.org/abs/2502.17712).

This example doesn't seem to exhibit that full pipeline, but is "just" rendering to a texture from a shader and letting that texture be drawn on a mesh cube. (It's using texture coordinates/"space", but that's obvious for a fragment shader.)

I agree the example is useful, but I wouldn't look for it under this name, and it might disappoint people searching for the aforementioned technique. Would it be better to title/name the files as you summarized, i.e. "Render to texture from shader" and/or shader_render_to_texture.rs, or something similar?

@HackerFoo (Contributor, Author):

Here's the best description I can find of texture space shading (TSS):

> Turing GPUs introduce a new shading capability called Texture Space Shading (TSS), where shading values are dynamically computed and stored in a texture as texels in a texture space. Later, pixels are texture mapped, where pixels in screen-space are mapped into texture space, and the corresponding texels are sampled and filtered using a standard texture lookup operation. With this technology we can sample visibility and appearance at completely independent rates, and in separate (decoupled) coordinate systems. Using TSS, a developer can simultaneously improve quality and performance by (re)using shading computations done in a decoupled shading space.

So, with TSS:

  • "Shading values" are computed in texture (UV) space and stored as texels.
  • They are texture mapped to screen space, and sampled and filtered in the usual way.
  • We can sample visibility and appearance separately.
  • The value is in the fact that texture space values can be reused with different views.

This example certainly computes in texture space, although I'm not sure what exactly counts as a "shading value"; it seems like that would be whatever a fragment shader writes to its target. I also use a timer so that the same values are reused across multiple frames, reducing how often they are recomputed. This example is also quite different from rendering a camera to a texture, since this shader is view-independent and there is a target for each mesh.

The general idea is to store data about the surface of a mesh in a way that can be reused, using a texture not as a read-only asset but as temporary memory: mapping from a surface in local 3D space to a 2D texture and back again, possibly in a different world space.

I agree that making the best use of this technique, where the data isn't entirely view-independent, requires more sophisticated methods, which I will be exploring, but those are probably out of scope for Bevy, and definitely out of scope for this example.

I didn't know about texture space shading before I started exploring this technique; Noxim pointed the term out to me. I found a lot of relevant research by searching for "texture space shading", so I think using the name is valuable: it lets others both find this example and find the related research.


4 participants