Ex. 1 - For something like a security camera rendering its live view onto a monitor in the Control Room.
Ex. 2 - The player walks around the level, gets bored, sits on the couch and starts a mini game. The player is still looking through the camera of the original human player character, but in front of him is a TV that turned on when it was triggered. Now the player can play a mini game (as if he were playing on a console inside the game), living the life of the game character, until he triggers the character to get up and move away from the couch.
It's in the Blueprint Office example, downloadable from the Marketplace. There are cameras that detect you, and there are monitors you can play with that show what the cameras see.
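For anyone digging into this later: under the hood that example is just the SceneCapture2D + render target pipeline. The camera actor has a SceneCaptureComponent2D that writes its view into a Texture Render Target 2D, and the monitor's screen material samples that texture. Here's a minimal C++ sketch of the capture side - class and property names are mine, not taken from the Office example; the Blueprint version just sets the same properties on a SceneCapture2D actor:

```cpp
// Hypothetical "security camera" actor: a scene capture that renders into a render target.
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "SecurityCamera.generated.h"

UCLASS()
class ASecurityCamera : public AActor
{
    GENERATED_BODY()

public:
    ASecurityCamera()
    {
        // The capture component is the camera's "eye".
        Capture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("Capture"));
        RootComponent = Capture;

        // Re-render every frame so the monitor feed stays live.
        Capture->bCaptureEveryFrame = true;

        // Capture the final tonemapped image, like a normal player view.
        Capture->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Whatever material samples this render target will display the camera's feed.
        Capture->TextureTarget = FeedTarget;
    }

    // Assigned in the editor: the Texture Render Target 2D the camera draws into.
    UPROPERTY(EditAnywhere)
    UTextureRenderTarget2D* FeedTarget = nullptr;

    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* Capture = nullptr;
};
```

In Blueprint this is simply a SceneCapture2D actor with Capture Every Frame ticked and its Texture Target set to a render target asset.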
Sorry to raise Lazarus from the pit, but I was looking into this and I can't make much of the BPs in that example. What portion of the BPs actually captures the scene? Is it the Camera BP or the Monitor BP? Since this is the Blueprint example, I'm assuming it can all be accomplished without code.
So I have some questions.
The console has two materials assigned to it: one for the screen and one for the body. But within the BP there are two material variables for the screen. How are these even associated with that specific material? How are they assigned? And there are three render textures, but I can't tell how they are linked to any camera or monitor. I'm also not seeing where the monitor picks up the texture. It gets the camera from an array, but where in the Camera BP do we actually capture a texture? I only see the movement, alarm and vector drawing portions. I'm trying to pick out pieces, but I'm having a hard time reverse engineering what's going on here.
For all intents and purposes, I just wanted to create a flat plane and render the camera view onto it, just to play around with the render target, but I can't even get that working.
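On the display side, the usual pattern (and most likely what those two screen material variables in the console BP are for) is to create a dynamic material instance for the screen's material slot and push the render target into a texture parameter of that instance. A rough sketch, assuming the screen is material slot 0 on the mesh and the screen material exposes a TextureSampleParameter2D named "ScreenTexture" - both of those are assumptions, not details pulled from the Office example:

```cpp
// Hypothetical monitor actor (names are mine): applies a render target to its screen material.
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Monitor.generated.h"

UCLASS()
class AMonitor : public AActor
{
    GENERATED_BODY()

public:
    AMonitor()
    {
        MonitorMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("MonitorMesh"));
        RootComponent = MonitorMesh;
    }

    void ShowCameraFeed(UTextureRenderTarget2D* FeedTarget)
    {
        if (!MonitorMesh || !FeedTarget)
        {
            return;
        }

        // Swap material slot 0 (the screen) for a dynamic instance we can edit at runtime.
        // This is what the "material variable" in the BP is holding onto.
        UMaterialInstanceDynamic* ScreenMID = MonitorMesh->CreateDynamicMaterialInstance(0);

        // A render target is a UTexture, so it plugs straight into the texture parameter.
        if (ScreenMID)
        {
            ScreenMID->SetTextureParameterValue(TEXT("ScreenTexture"), FeedTarget);
        }
    }

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* MonitorMesh = nullptr;
};
```

The Blueprint equivalent is Create Dynamic Material Instance (with the element index of the screen slot) followed by Set Texture Parameter Value; the variable isn't linked to the material asset by name, it just stores the dynamic instance created for that slot. For the flat-plane test, the same thing works: sample the render target in a simple unlit material and apply it to the plane.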
In reference to the first link: that was easy to set up. Just a question on that, though. How do you apply post-process materials to it, like the VCR feed example in the documentation? I don't know if that's how the BP example did it. Simply looking at it, I was going to just apply a PPM, using the sampled texture, onto some surface, but that doesn't render, and I'm left with the default checkered texture. That particular method only seems to work with surface materials. Did I overlook something?
I could be wrong about this (I have not yet tried to do it), but I believe you would apply the post process to the camera itself, so that it projects those effects onto your monitor. In the options for a camera there are settings such as "Grain Jitter", "Grain Intensity" and "Color Grading" under the Scene Color tab.
Again, there is probably a different (if not better) way of doing this that I am unaware of; hopefully someone else can add their thoughts on it.
I had seen those; they just didn't provide in-depth ways to control the material, only some simple post effects. Stuff such as video static, scan lines, emissive, infrared, etc. doesn't seem possible through the camera settings alone. I'll have to look into it some more. This post, for example, has a person wanting to make night vision. I know in my instance I'm not attempting that, but the person who replied appears to be using a PPM for the camera, which for me doesn't work.
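One likely gotcha worth checking: a scene capture only runs post-process materials when its Capture Source is set to Final Color (LDR); with one of the Scene Color modes the post-process chain is skipped, which would make a PPM appear to do nothing. If that's the issue, the blendable (and the grain/color-grading settings mentioned above) can be set on the capture component's own Post Process Settings. A C++ sketch - the function and material names are placeholders:

```cpp
// Sketch: apply a post-process material (e.g. a VCR/static effect) plus some grain
// to a SceneCaptureComponent2D. "VCRPostProcessMaterial" is a placeholder asset.
#include "Components/SceneCaptureComponent2D.h"
#include "Materials/MaterialInterface.h"
#include "Engine/Scene.h"

void SetupCapturePostProcess(USceneCaptureComponent2D* Capture,
                             UMaterialInterface* VCRPostProcessMaterial)
{
    if (!Capture)
    {
        return;
    }

    // Post-process materials are only applied when capturing the final color.
    Capture->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR;

    // Add the post-process material as a fully weighted blendable.
    if (VCRPostProcessMaterial)
    {
        Capture->PostProcessSettings.WeightedBlendables.Array.Add(
            FWeightedBlendable(1.0f, VCRPostProcessMaterial));
    }

    // Built-in settings (grain, color grading, etc.) need their override flag set.
    Capture->PostProcessSettings.bOverride_GrainIntensity = true;
    Capture->PostProcessSettings.GrainIntensity = 0.5f;
    Capture->PostProcessSettings.bOverride_GrainJitter = true;
    Capture->PostProcessSettings.GrainJitter = 0.3f;
}
```

In Blueprint the same settings live in the SceneCapture2D's details panel under Post Process Settings; the blendables array shows up as "Post Process Materials" under Rendering Features.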
What's a performant way of checking how much light is falling on an NPC or the player character in real time, and would that only work with fully dynamic lighting, or with static lighting as well?
Not sure why you're asking about detecting light levels in a thread about render target textures.
A quick Google search brought up the following as the latest UE4-related post about such a thing: https://answers.unrealengine.com/questions/478953/get-light-intensity-at-specific-location-or-detect.html
OK, I hate to rehash an old thread, but it's what has me closest to what I'd like to do.
I have a PyActor blueprint that periodically updates a list of text objects (from a 3D text plugin, so it's geometry and not Unreal's horrible rendered text) that I would like displayed on an object in a level. To picture it in your mind, think of a few real news headlines being pulled from a data service that update periodically in real time and get displayed on a monitor in game.
I don't want those text objects hanging out in the level anywhere - my level is very sparse, and I don't want the player's view to accidentally be able to see them; they should only appear on the designated object (like a monitor or movie screen). I was hoping I could use a blueprint to create a render target; the blueprint itself could be in the game - but not visible, just providing real-time updated render views.
Is that possible? The alternative is putting the scene capture camera and the actual text way out in the middle of nowhere, perhaps masked by something - but that seems really hokey and would be hard to edit/update in the editor (having to jump from the origin to very far away and back).
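It should be doable without the far-away trick, with the caveat that I haven't tried it with that particular text plugin. The idea: give the capture a show-only list containing just the text actors, so nothing else in the level can ever leak into the feed, and keep the text itself out of the player's sight some other way (tucked inside the monitor blueprint's casing, or, on newer engine versions, by ticking "Visible In Scene Capture Only" on the text components). A rough C++ sketch; all the names are placeholders, and the same properties exist on the SceneCapture2D in Blueprint:

```cpp
// Sketch: a capture that renders ONLY the headline text actors into a render target.
// AHeadlineBoard, HeadlineActors and FeedTarget are placeholder names.
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "HeadlineBoard.generated.h"

UCLASS()
class AHeadlineBoard : public AActor
{
    GENERATED_BODY()

public:
    AHeadlineBoard()
    {
        Capture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("Capture"));
        RootComponent = Capture;

        // Only render what is in the ShowOnlyActors list; the rest of the level
        // is ignored by this capture, so the text can sit anywhere convenient.
        Capture->PrimitiveRenderMode = ESceneCapturePrimitiveRenderMode::PRM_UseShowOnlyList;
        Capture->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR;
        Capture->bCaptureEveryFrame = true;   // headlines update in real time
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Capture->TextureTarget = FeedTarget;       // sampled by the monitor/screen material
        Capture->ShowOnlyActors = HeadlineActors;  // the 3D text actors to render
    }

    // The 3D text actors spawned/updated by the PyActor blueprint.
    UPROPERTY(EditAnywhere)
    TArray<AActor*> HeadlineActors;

    // Render target the monitor or movie-screen material samples.
    UPROPERTY(EditAnywhere)
    UTextureRenderTarget2D* FeedTarget = nullptr;

    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* Capture = nullptr;
};
```

The render target it writes into is then applied to the monitor exactly as in the earlier posts, and because the capture ignores everything outside ShowOnlyActors, the text can live right next to the monitor instead of out at the edge of the map.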