
Renderer: Add MRT support to .readRenderTargetPixels(). #22403

LuckyTeresa opened this issue Aug 24, 2021 · 8 comments · May be fixed by #31089

@LuckyTeresa

Describe the bug
I used WebGLMultipleRenderTargets to output several textures, then set one of these textures as a WebGLRenderTarget's texture, and passed that WebGLRenderTarget to WebGLRenderer.readRenderTargetPixels to read one pixel's value. However, the Float32Array I passed in only contained (0, 0, 0, 0).

To Reproduce

Steps to reproduce the behavior:
1. Given a WebGLRenderTarget "rt", set it as the render target, e.g. renderer.setRenderTarget( rt );
2. Render something into it;
3. Given another WebGLRenderTarget "rt2", call rt2.setTexture( rt.texture );
4. const read = new Float32Array( 4 ); renderer.readRenderTargetPixels( rt2, posX, posY, 1, 1, read );

Alternatively, you can copy the following code into three.js/examples/webgl_read_float_buffer.html and replace the script part. I have only changed a small part of it.

Code

                       <script type="module">
			import * as THREE from '../build/three.module.js';

			import Stats from './jsm/libs/stats.module.js';

			let container, stats;

			let cameraRTT, sceneRTT, sceneScreen, renderer, zmesh1, zmesh2, rtTexture2;

			let mouseX = 0, mouseY = 0;

			const windowHalfX = window.innerWidth / 2;
			const windowHalfY = window.innerHeight / 2;

			let rtTexture, material, quad, materialScreen;

			let delta = 0.01;
			let valueNode;

			init();
			animate();

			function init() {

				container = document.getElementById( 'container' );

				cameraRTT = new THREE.OrthographicCamera( window.innerWidth / - 2, window.innerWidth / 2, window.innerHeight / 2, window.innerHeight / - 2, - 10000, 10000 );
				cameraRTT.position.z = 100;

				//

				sceneRTT = new THREE.Scene();
				sceneScreen = new THREE.Scene();

				let light = new THREE.DirectionalLight( 0xffffff );
				light.position.set( 0, 0, 1 ).normalize();
				sceneRTT.add( light );

				light = new THREE.DirectionalLight( 0xffaaaa, 1.5 );
				light.position.set( 0, 0, - 1 ).normalize();
				sceneRTT.add( light );

				rtTexture = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBAFormat, type: THREE.FloatType } );
				rtTexture2 = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBAFormat, type: THREE.FloatType } );

				material = new THREE.ShaderMaterial( {

					uniforms: { "time": { value: 0.0 } },
					vertexShader: document.getElementById( 'vertexShader' ).textContent,
					fragmentShader: document.getElementById( 'fragment_shader_pass_1' ).textContent

				} );

				materialScreen = new THREE.ShaderMaterial( {

					uniforms: { "tDiffuse": { value: rtTexture.texture } },
					vertexShader: document.getElementById( 'vertexShader' ).textContent,
					fragmentShader: document.getElementById( 'fragment_shader_screen' ).textContent,

					depthWrite: false

				} );

				const plane = new THREE.PlaneGeometry( window.innerWidth, window.innerHeight );

				quad = new THREE.Mesh( plane, material );
				quad.position.z = - 100;
				sceneRTT.add( quad );

				const geometry = new THREE.TorusGeometry( 100, 25, 15, 30 );

				const mat1 = new THREE.MeshPhongMaterial( { color: 0x555555, specular: 0xffaa00, shininess: 5 } );
				const mat2 = new THREE.MeshPhongMaterial( { color: 0x550000, specular: 0xff2200, shininess: 5 } );

				zmesh1 = new THREE.Mesh( geometry, mat1 );
				zmesh1.position.set( 0, 0, 100 );
				zmesh1.scale.set( 1.5, 1.5, 1.5 );
				sceneRTT.add( zmesh1 );

				zmesh2 = new THREE.Mesh( geometry, mat2 );
				zmesh2.position.set( 0, 150, 100 );
				zmesh2.scale.set( 0.75, 0.75, 0.75 );
				sceneRTT.add( zmesh2 );

				quad = new THREE.Mesh( plane, materialScreen );
				quad.position.z = - 100;
				sceneScreen.add( quad );

				renderer = new THREE.WebGLRenderer();
				renderer.setPixelRatio( window.devicePixelRatio );
				renderer.setSize( window.innerWidth, window.innerHeight );
				renderer.autoClear = false;

				container.appendChild( renderer.domElement );

				stats = new Stats();
				container.appendChild( stats.dom );

				valueNode = document.getElementById( 'values' );

				document.addEventListener( 'mousemove', onDocumentMouseMove );

			}

			function onDocumentMouseMove( event ) {

				mouseX = ( event.clientX - windowHalfX );
				mouseY = ( event.clientY - windowHalfY );
				// console.log(mouseX, mouseY);

			}

			//

			function animate() {

				requestAnimationFrame( animate );

				render();
				stats.update();

			}

			function render() {

				const time = Date.now() * 0.0015;

				if ( zmesh1 && zmesh2 ) {

					zmesh1.rotation.y = - time;
					zmesh2.rotation.y = - time + Math.PI / 2;

				}

				if ( material.uniforms[ "time" ].value > 1 || material.uniforms[ "time" ].value < 0 ) {

					delta *= - 1;

				}

				material.uniforms[ "time" ].value += delta;

				renderer.clear();

				// Render first scene into texture

				// renderer.setRenderTarget( rtTexture );
				renderer.setRenderTarget( rtTexture2 );
				renderer.clear();
				renderer.render( sceneRTT, cameraRTT );

				materialScreen.uniforms.tDiffuse.value = rtTexture2.texture;

				// Render full screen quad with generated texture

				renderer.setRenderTarget( null );
				renderer.render( sceneScreen, cameraRTT );

				const read = new Float32Array( 4 );
				console.log(windowHalfX + mouseX, windowHalfY - mouseY);
				rtTexture.setTexture(rtTexture2.texture);
				renderer.readRenderTargetPixels( rtTexture, windowHalfX + mouseX, windowHalfY - mouseY, 1, 1, read );

				valueNode.innerHTML = 'r:' + read[ 0 ] + '<br/>g:' + read[ 1 ] + '<br/>b:' + read[ 2 ];

			}
                       </script>

Live example
No live example; I can't visit those links.

Expected behavior
I expect to get a valid value in the ArrayBufferView object, rather than zeros.

Platform:

  • Device: [Desktop]
  • OS: [Windows10]
  • Browser: [Chrome]
  • Three.js version: [master, r130.1]
@Mugen87
Collaborator

Mugen87 commented Aug 24, 2021

I'm afraid this workflow is not supported. Extracting pixels from MRT color buffers requires gl.readBuffer(), which the engine does not use so far.

For now, try to implement a custom function based on: shrekshao/MoveWebGL1EngineToWebGL2#4 (comment)
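A custom read-back for a WebGL2 MRT framebuffer could look roughly like the sketch below. This is not three.js API; the helper name and parameters are hypothetical, and it assumes you already have the raw WebGL2 context, the target framebuffer, and a FloatType RGBA attachment:

```javascript
// Hypothetical helper: read pixels from one color attachment of an MRT
// framebuffer via gl.readBuffer() (WebGL2 only). Not part of three.js.
function readMRTAttachment( gl, framebuffer, attachmentIndex, x, y, width, height, buffer ) {

	gl.bindFramebuffer( gl.FRAMEBUFFER, framebuffer );

	// Select which color attachment subsequent readPixels() calls read from.
	gl.readBuffer( gl.COLOR_ATTACHMENT0 + attachmentIndex );

	// Format/type must match the attachment; RGBA/FLOAT is assumed here.
	gl.readPixels( x, y, width, height, gl.RGBA, gl.FLOAT, buffer );

	// Restore the default read buffer and unbind.
	gl.readBuffer( gl.COLOR_ATTACHMENT0 );
	gl.bindFramebuffer( gl.FRAMEBUFFER, null );

	return buffer;

}
```

The framebuffer of a three.js render target lives on the renderer's internal properties (private API, subject to change), so in practice this is best treated as a stopgap until the engine supports it directly.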

@LuckyTeresa
Author

Yeah, the approach mentioned in the link solved my problem. Thanks a lot!

Repository owner deleted a comment from LuckyTeresa Aug 25, 2021
@Mugen87 Mugen87 changed the title renderer.readRenderTargetPixels from a WebGLRenderTarget which texture is setted by setTexture method get (0,0,0,0) WebGLRenderer: Add MRT support to .readRenderTargetPixels(). Apr 7, 2022
@Mugen87
Collaborator

Mugen87 commented Apr 22, 2024

This issue should also honor readRenderTargetPixelsAsync() of WebGPURenderer, see #28180.
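On the WebGPURenderer side, a call could be sketched as below. This assumes the attachment-index argument discussed in #28180; the wrapper name is hypothetical, so check the current signature of readRenderTargetPixelsAsync() before relying on it:

```javascript
// Sketch only: read one pixel from a given MRT attachment with
// WebGPURenderer. The textureIndex argument assumes the extension
// discussed in #28180.
async function readMRTPixelAsync( renderer, renderTarget, x, y, textureIndex = 0 ) {

	// Resolves with a typed array matching the attachment's texture type.
	const buffer = await renderer.readRenderTargetPixelsAsync( renderTarget, x, y, 1, 1, textureIndex );

	return buffer;

}
```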

@Mugen87 Mugen87 changed the title WebGLRenderer: Add MRT support to .readRenderTargetPixels(). Renderer: Add MRT support to .readRenderTargetPixels(). Apr 22, 2024
@Spiri0
Contributor

Spiri0 commented Apr 23, 2024

I see that an attempt is also being made here to read back float32 values and, as far as I have seen in the code, this is not supported. When using Float32Array, only zeros are returned; Uint8Array must be used for readRenderTargetPixels.

However, if reading back a float32 value is the deliberate goal here, then I recommend renaming the issue.

@Mugen87 have you merged the small extension I made for readRenderTargetPixelsAsync in #28180 into r164, or should I? I can do it if you wish.
I have tested my extension to readRenderTargetPixelsAsync extensively in my WebGPU app over the last few hours. This little thing made a big difference.

@Mugen87
Collaborator

Mugen87 commented Apr 23, 2024

@Spiri0 Feel free to make a PR!

@Spiri0
Contributor

Spiri0 commented Apr 25, 2024

If this is just about reading out multiple render targets, then that is done now.
The goal of @LuckyTeresa was to read the color of the pixel under the mouse. When using readRenderTargetPixels you have to use Uint8Array, because the RGB values are only in the range [0, 255]. That's exactly where the problem with using Float32Array lies: with Float32Array you get a type mismatch and therefore [0, 0, 0, 0]. I had exactly the same experience 😁
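The typed-array choice can be made explicit. A small sketch (the helper name is hypothetical) that picks the read-back buffer from the render target texture's type, so it always matches what readPixels() writes:

```javascript
// Sketch: choose the read-back buffer from the render target texture's
// type. The numeric values mirror THREE.UnsignedByteType, THREE.FloatType
// and THREE.HalfFloatType; in an app, compare against the THREE constants
// directly instead of these local copies.
const UnsignedByteType = 1009, FloatType = 1015, HalfFloatType = 1016;

function createReadBuffer( textureType, width, height ) {

	const length = width * height * 4; // RGBA components per pixel

	switch ( textureType ) {

		case FloatType: return new Float32Array( length );
		case HalfFloatType: return new Uint16Array( length ); // raw half-float bits
		case UnsignedByteType: return new Uint8Array( length );
		default: throw new Error( 'createReadBuffer: unsupported texture type ' + textureType );

	}

}
```

Usage would then be `const read = createReadBuffer( rt.texture.type, 1, 1 );` before calling readRenderTargetPixels, so a FloatType target gets a Float32Array and an UnsignedByteType target gets a Uint8Array.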

@Mugen87
Collaborator

Mugen87 commented Apr 25, 2024

We still need to add support in WebGLRenderer.

@Spiri0 Since you are already into the topic, are you interested in giving it a shot? 😇

@Spiri0
Contributor

Spiri0 commented Apr 25, 2024

I can take a look at that. There is an analogous example in WebGL about multiple render targets. readRenderTargetPixels in WebGL works significantly differently from readRenderTargetPixelsAsync, but I'm trying. I'll take a look at it today.
