I have added a SceneCaptureComponent2D to my VR pawn, write it to a render target, and then export that to .hdr. I can see the Unreal Engine content, but the passthrough is black. I need to take a screenshot of what the player is seeing, including passthrough, and convert it to base64. How can I do this in Blueprint?
I changed the settings as per this and packaged the project for Quest 3, but when I try to run it, it crashes and doesn’t launch. If I keep the render target and scene capture settings at their defaults, it works, but the PNG is not exported properly.
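For reference, the engine-side leg of this (render target → PNG bytes → base64 string) can be done in C++ roughly as in the untested sketch below; the function name and its parameter are placeholders, and the passthrough area of the capture will still come out black for the reasons explained in the replies.

```cpp
// Rough, untested sketch: read a SceneCaptureComponent2D's render target on the
// game thread, compress the pixels to PNG, and base64-encode the result.
// RenderTargetToBase64Png and its parameter are placeholder names.
#include "CoreMinimal.h"
#include "Engine/TextureRenderTarget2D.h"
#include "TextureResource.h"
#include "IImageWrapper.h"
#include "IImageWrapperModule.h"
#include "Misc/Base64.h"
#include "Modules/ModuleManager.h"

FString RenderTargetToBase64Png(UTextureRenderTarget2D* RenderTarget)
{
    if (!RenderTarget)
    {
        return FString();
    }

    // Read the render target back to the CPU (this stalls the game thread).
    FTextureRenderTargetResource* Resource = RenderTarget->GameThread_GetRenderTargetResource();
    TArray<FColor> Pixels;
    if (!Resource || !Resource->ReadPixels(Pixels))
    {
        return FString();
    }

    // Compress the raw BGRA pixels to PNG via the ImageWrapper module.
    IImageWrapperModule& ImageWrapperModule =
        FModuleManager::LoadModuleChecked<IImageWrapperModule>("ImageWrapper");
    TSharedPtr<IImageWrapper> PngWrapper = ImageWrapperModule.CreateImageWrapper(EImageFormat::PNG);
    PngWrapper->SetRaw(Pixels.GetData(), Pixels.Num() * sizeof(FColor),
                       RenderTarget->SizeX, RenderTarget->SizeY, ERGBFormat::BGRA, 8);
    TArray64<uint8> PngBytes = PngWrapper->GetCompressed();

    // Base64-encode the PNG bytes.
    return FBase64::Encode(PngBytes.GetData(), static_cast<uint32>(PngBytes.Num()));
}
```

To call something like this from Blueprint, it could be wrapped in a UFUNCTION(BlueprintCallable) inside a Blueprint function library; the ImageWrapper module also has to be added to the project’s Build.cs dependencies.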
This is a common question.
Meta and Apple Vision Pro don’t allow users to access the raw camera buffer, so you can’t access the passthrough view programmatically. I wrote the details in here.
In your case, Quest handles the passthrough view at a low level, and because software doesn’t have access to the camera buffer, you can’t see that view in your screenshots.
I haven’t tried it yet, but since Quest Link supports passthrough, you could try taking an actual screenshot from Windows.
I don’t want to access the camera. Can I take a screenshot of what the player is seeing in the head-mounted display in any way? My app uses passthrough and everything is working fine.
You indirectly want to access the camera buffer.
You have a SceneCaptureComponent2D to capture what the user sees in your VR scene. If it were just a VR scene, that would be fine, but for RGB passthrough the engine would have to access the raw front-facing camera buffer, and there is no access to it.
So you will see your digital objects in front of a black screen.
You can see passthrough because the Quest device handles it, not the engine. So the engine sees the passthrough as an empty area.
No. As I said before, here and in my answers to other related questions, this is not an Unreal Engine (or any game engine) function/problem. Meta doesn’t allow us to access the passthrough buffers from the RGBD cameras programmatically.
You can record that view with the Meta Quest app on your Android/iOS phone to share on YouTube or Twitch, but that’s all.
You can’t access the camera passthrough buffer with screenshots; it’s an internal system of the headset. You have to use the Camera2 API, but Meta hasn’t released a UE5 plugin for that yet.
You have to write it yourself with C++ and Java JNI. There are also some experimental plugins on GitHub.
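For illustration, the C++/Java bridge part usually follows the standard Unreal JNI pattern, something like the untested sketch below; AndroidThunkJava_StartCamera2 is a hypothetical method you would have to add to the GameActivity yourself via UPL, and the actual Camera2 work would live on the Java side.

```cpp
// Minimal, untested sketch of the C++/Java JNI bridge pattern. It assumes a
// custom method AndroidThunkJava_StartCamera2() has been added to the
// GameActivity via UPL; that method name is hypothetical, not engine or Meta API.
#if PLATFORM_ANDROID
#include "Android/AndroidApplication.h"
#include "Android/AndroidJNI.h"

void StartCamera2Capture()
{
    if (JNIEnv* Env = FAndroidApplication::GetJavaEnv())
    {
        // Look up the custom Java method on the GameActivity (added via UPL).
        static jmethodID StartCameraMethod = FJavaWrapper::FindMethod(
            Env, FJavaWrapper::GameActivityClassID,
            "AndroidThunkJava_StartCamera2", "()V", /*bIsOptional=*/false);

        // Call into Java; the Java side opens the camera with the Camera2 API
        // and pushes frames back to C++ (e.g. via a native callback or buffer).
        FJavaWrapper::CallVoidMethod(Env, FJavaWrapper::GameActivityThis,
                                     StartCameraMethod);
    }
}
#endif
```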
But there will be another problem: the Camera2 API won’t have your UI data. So you have to write logic like this in C++:
Don’t take screenshots. Use a scene capture component and take a capture. It has to match the same eye as the Camera2 API camera.
You will have a render target with the UI on a black background. Get its pixel buffer.
Replace the Camera2 API buffer’s pixels with the UI buffer’s pixels wherever the UI pixel is not black. This gives a kind of copy-paste-with-masking effect.
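As a rough illustration of that masking step (untested; all names are placeholders), assuming both buffers are already FColor arrays at the same resolution:

```cpp
// Hedged sketch of the masking/compositing idea. CameraPixels is assumed to be
// the Camera2 frame already converted to FColor, and UiPixels the
// SceneCaptureComponent2D render target read back with ReadPixels().
#include "CoreMinimal.h"

void CompositeUiOverCamera(TArray<FColor>& CameraPixels,
                           const TArray<FColor>& UiPixels)
{
    check(CameraPixels.Num() == UiPixels.Num());

    for (int32 Index = 0; Index < CameraPixels.Num(); ++Index)
    {
        const FColor& Ui = UiPixels[Index];

        // Treat pure black as "no UI here"; anything else overwrites the camera
        // pixel. A real implementation would key on alpha instead, so that
        // genuinely dark UI pixels are not dropped.
        const bool bHasUi = (Ui.R != 0 || Ui.G != 0 || Ui.B != 0);
        if (bHasUi)
        {
            CameraPixels[Index] = Ui;
        }
    }
}
```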
If you don’t know C++ and this is not a runtime gameplay feature (I mean, you only want it for promoting your project), I suggest using Meta’s “Camera” app.
It is already installed, and you can open secondary apps as an overlay. Open it and take the screenshot with it.
Check this plugin. It gives you access to the Android Camera2 API, so you will have access to the headset cameras of the Meta Quest 3 (passthrough). It also works on any Android device, such as phones (check the main branch).
2- I can already write my own Camera2 API plugin with FRunnableThread support.
3- This plugin uses the game thread. You get the camera buffer and do any processing, like QR detection and gameplay mechanics, on that same thread? I don’t see how things could get worse from a performance perspective. It is a good example though, and I am aware of it; that’s why I said “there are some experimental plugins”.
4- The question was about something different from what the plugin does. The Camera2 API gives you what the “camera” sees: a raw view without UI elements. The question was about having “both” the camera view and the UI, which is a different story. For that reason you have to replace the camera’s pixels with the UI’s pixels on another thread after getting the UI pixels from the render thread (see the sketch after this list).
5- These operations can’t be done with the current screenshot system.
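To illustrate the threading point, a worker like the untested sketch below could take one camera frame and one UI frame and do the per-pixel merge off the game thread; the class and names are placeholders, and CompositeUiOverCamera refers to the masking sketch earlier in the thread.

```cpp
// Rough, untested sketch of moving the per-pixel merge off the game thread
// with FRunnable. All names are placeholders.
#include "CoreMinimal.h"
#include "HAL/Runnable.h"
#include "HAL/RunnableThread.h"

// Defined in the earlier masking sketch.
void CompositeUiOverCamera(TArray<FColor>& CameraPixels, const TArray<FColor>& UiPixels);

class FFrameCompositeWorker : public FRunnable
{
public:
    FFrameCompositeWorker(TArray<FColor> InCameraPixels, TArray<FColor> InUiPixels)
        : CameraPixels(MoveTemp(InCameraPixels))
        , UiPixels(MoveTemp(InUiPixels))
    {
        // Spawn a dedicated thread that runs Run() once.
        Thread = FRunnableThread::Create(this, TEXT("FrameCompositeWorker"));
    }

    virtual ~FFrameCompositeWorker() override
    {
        if (Thread)
        {
            Thread->WaitForCompletion();
            delete Thread;
        }
    }

    virtual uint32 Run() override
    {
        // Heavy per-pixel work happens here, not on the game thread.
        CompositeUiOverCamera(CameraPixels, UiPixels);

        // The merged frame would then be handed back to the game thread,
        // e.g. with AsyncTask(ENamedThreads::GameThread, ...).
        return 0;
    }

private:
    TArray<FColor> CameraPixels;
    TArray<FColor> UiPixels;
    FRunnableThread* Thread = nullptr;
};
```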