-What could be the problem? Is the hardware not sufficient for capturing?
No, this problem is the Camera Cuts track. Don't worry: deactivate the track, don't delete it.
-Is there maybe a sample scene available, so we can check whether our settings are messed up?
There are demo maps in Camera_360/Maps/Demo_Scenes.
@ElizzaRF Thanks a lot for the support and the quick reply!
Before I posted my first comment, I used Get Target All Transform and selected the camera, but it didn't seem to recognize or update the connection between the two objects. (By the way, I'm using version 4.25.1.)
But anyway, in the meantime I managed to get a capture working by adding the "Camera_Point_360" object itself to the Sequencer, as in the video link you provided, and exported a nice panoramic sequence. So this method seems to work fine!
The tutorial you linked, attaching "Camera_Point_360" to a CineCameraActor in the Sequencer and deactivating the track, also works. Regarding this method: is it possible to animate camera exposure? I can't add a Camera Component to "Camera_Point_360", and it does not seem to pick up the animated exposure from the CineCameraActor.
Adding PostProcessVolumes to the level does not seem to have any effect on the rendered output either.
Also, a question: is it possible to narrow the field of view for the Dome Camera or VR180_Stereo outputs?
We are making a VR animation for Oculus Go, which seems to have an upper limit on video resolution of 3840x3840 or 5760x2880. That does not seem to be enough for VR180, so I'm thinking of projecting the rendered video onto a smaller segment of a sphere (for example 135° by 135°), so we can make better use of Oculus Go's maximum video resolution.
Of course I can capture the full 180° at a higher resolution and crop the image sequence afterwards in video editing software to get a narrower field of view; I'm just wondering if there is a more efficient workflow for this.
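For a rough sense of what the crop would buy, here is a back-of-the-envelope estimate of angular resolution (plain Python; the 3840 px horizontal budget per eye is only an assumption for illustration):

```python
# Same horizontal pixel budget spread over a narrower field of view.
def pixels_per_degree(width_px: float, fov_deg: float) -> float:
    return width_px / fov_deg

for fov in (180.0, 135.0):
    print(f"{fov:>5.0f} deg FOV -> {pixels_per_degree(3840, fov):.1f} px/deg")

# Output:
#   180 deg FOV -> 21.3 px/deg
#   135 deg FOV -> 28.4 px/deg
```

So a 135° projection of the same video resolution gives roughly a third more pixels per degree than a full 180° projection.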
Thanks for the great plugin, and thanks in advance for the support!
5. Don't forget: enable Infinite Extent (Unbound) so the post process applies to the entire level, and everything works. Alternatively, place the Camera Rec 360 actor inside the post process volume's zone.
It is better to do this through the post process, so that all cameras used by Camera 360 have the same exposure. And it is better to use the Manual exposure method.
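As a minimal editor-scripting sketch of the same setup (assuming Unreal's Python plugin is enabled; the property names follow the engine's PostProcessVolume/FPostProcessSettings in 4.25, not anything specific to Camera 360):

```python
import unreal

# Spawn a PostProcessVolume and make it unbound so it affects the whole level.
ppv = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.PostProcessVolume, unreal.Vector(0, 0, 0))
ppv.set_editor_property("unbound", True)

# Force the Manual exposure method so every camera used by Camera 360
# captures with the same fixed exposure.
settings = ppv.get_editor_property("settings")
settings.override_auto_exposure_method = True
settings.auto_exposure_method = unreal.AutoExposureMethod.AEM_MANUAL
settings.override_auto_exposure_bias = True
settings.auto_exposure_bias = 1.0  # exposure compensation, adjust to taste
ppv.set_editor_property("settings", settings)
```

The same values can of course be set by hand in the volume's Details panel.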
I need to think about it. Thank you for the idea, but unfortunately there is currently no option to reduce the field of view.
I am constantly working on improvements and am always happy to hear new ideas.
Thank you for your kind words and for purchasing Camera 360. I'm always ready to help.
First, thanks for this plugin! I'm really excited about it.
I used the ArchViz project from the Learn tab, but there is quite a bit of noise, especially on the ceiling and walls, and not so much in other places. I don't have a top-of-the-line computer, but it is good: an i9 processor and an RTX 2080 Max-Q (MSI laptop).
My best guess is that I can't seem to set the texture quality above 2048 without it crashing. I've tried changing the AA, and I've tried using the high-quality settings as well, but those don't seem to make a difference.
Good afternoon, michalex19!
Thank you for your purchase. Email me at Lenina62-ivan@mail.ru if you can't apply these settings.
You have a great computer, but since ray tracing is very demanding, you need to choose the settings very carefully.
**You have several ways to create a rendering.**
Movie Render Pipeline (in Unreal Engine 4.25) (YouTube tutorial)
Steps:
1. Add the actors from Content/Camera_360/Blueprints to the scene: Camera Rec 360 and Camera Point 360.
8. Accept and Start Rendering.
This solution gives the cleanest image with full ray tracing, using the Movie Render Pipeline.
Example YouTube tutorial.
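If you prefer to script that first step, a sketch like the following drops the two actors into the level from the editor's Python console (the blueprint asset names and paths under Content/Camera_360/Blueprints are assumptions; check the actual names in the Content Browser):

```python
import unreal

# Assumed blueprint asset paths; adjust to the names shipped with the plugin.
paths = [
    "/Game/Camera_360/Blueprints/Camera_Rec_360",
    "/Game/Camera_360/Blueprints/Camera_Point_360",
]

for path in paths:
    bp_class = unreal.EditorAssetLibrary.load_blueprint_class(path)
    if bp_class is None:
        unreal.log_warning("Blueprint not found: " + path)
        continue
    actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        bp_class, unreal.Vector(0, 0, 0))
    unreal.log("Spawned " + actor.get_name())
```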
Tell me, did the solution help you? If not, write to me personally and I will contact you online: Lenina62-ivan@mail.ru
I lost my project with the settings. I will soon publish an accurate video lesson on how to avoid noise. Please wait for the video tutorial.
Hi, I just wanted to be sure whether Camera 360 is guaranteed to be "frame locked", meaning that the gameplay fps is locked to the rendering fps (24 fps, for instance) and there can be no frame desync because of gameplay lag, so that two different renders have exactly the same frames and timing. Can you guarantee that? Or do you know that it is not frame locked?
Movie Render Pipeline: https://youtu.be/a_vp7b5Blyg
Here, all frames correspond to the frame rate specified in the system. If you set 24 fps, it will produce 24 frames per second; if 30 fps, it will produce 30 frames per second, and so on. Freeze frames are not used, and there are no missing frames.
HighResShot system: https://youtu.be/J7uEiOhkH8o and Custom Rendering: https://youtu.be/eUlgjTN-4Qg
The timing may differ by a few frames, since the game is paused, but not by much. If the window is not active, frames may be skipped; this is why the window must always stay active, and you cannot use other applications.
In all of these systems there is no desynchronization. That's right: even if your scene is very heavy and runs at only 5 frames per second, the render will still have the specified frame rate, such as 30.
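To illustrate why the output stays frame locked even when the scene renders slowly, here is a conceptual sketch of a fixed-timestep capture loop (plain Python, not the plugin's actual code): game time advances by exactly 1/fps per captured frame, no matter how long each frame takes to render.

```python
import time

OUTPUT_FPS = 30
FIXED_DELTA = 1.0 / OUTPUT_FPS   # game-time step per captured frame

def render_frame(game_time: float) -> None:
    # Stand-in for a heavy render; wall-clock time spent here is irrelevant.
    time.sleep(0.2)              # pretend the scene only manages ~5 fps

game_time = 0.0
for frame in range(3 * OUTPUT_FPS):   # 3 seconds of output
    render_frame(game_time)
    game_time += FIXED_DELTA          # advance by exactly 1/30 s, never skip

# The sequence always contains 30 evenly spaced frames per output second,
# regardless of how slowly each frame was produced in real time.
```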
NEWS 07/24/2020: DOF (Depth of Field) for UE 4.23, 4.24 and 4.25.
I'm running depth-of-field tests. I'm still checking everything.
The good news is that DOF will be supported. Video: Tutorial 17. Camera 360 and DOF - YouTube
But there are some notes:
The color scheme will be slightly different from the standard one.
You need to select Mode -> example: 360 Mono (Use 6 cameras).
Custom Render -> True
FullPostProcess -> True
Change the values for DOF.
Note: this does not work in 360 Mono (Use 1 camera). DOF supports ray tracing, but for recording you need to use the Sequencer or the Movie Render Pipeline. I need to change the logic of creating screenshots for ray tracing + custom render and DOF.
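As a sketch, those flags could also be set from the editor's Python console; the property names (CustomRender, FullPostProcess) are taken from the list above and their exact spellings on the blueprint are assumptions, so verify them in the actor's Details panel first.

```python
import unreal

# Find the Camera 360 capture actor in the open level (name match is assumed).
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if "Camera_Rec_360" in actor.get_name():
        # Hypothetical property names, based on the settings listed above.
        actor.set_editor_property("CustomRender", True)
        actor.set_editor_property("FullPostProcess", True)
        break
```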
Hi,
I followed your "Tutorial 5. Raytracing. Camera 360 for Unreal Engine 4" on your Demo_Scene. I modified the walls a bit with a glossy material, but in the generated 360 image I can't see the ray tracing effects (reflections, shadows).
Some post-process settings do not work and produce artifacts in the form of seams. Use the following settings:
Vignette = 0 (in Post Process);
Bloom = 0 (in Post Process);
Screen Space Reflection = 0 (in Post Process);
Atmospheric Fog - volumetric off;
Exponential Height Fog - volumetric off;
Directional Light - Light Shafts off;
Exposure (sometimes a seam appears because of uneven exposure distribution; try setting the Exposure Method to Manual and changing the Exposure value).
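As a minimal sketch, the same zeroed values can be applied to an existing PostProcessVolume with the editor's Python console (property names follow the engine's FPostProcessSettings in 4.25; this is generic engine scripting, not part of the plugin):

```python
import unreal

# Apply the seam-avoidance values to the first PostProcessVolume in the level.
volumes = [a for a in unreal.EditorLevelLibrary.get_all_level_actors()
           if isinstance(a, unreal.PostProcessVolume)]
if volumes:
    s = volumes[0].get_editor_property("settings")
    s.override_vignette_intensity = True
    s.vignette_intensity = 0.0
    s.override_bloom_intensity = True
    s.bloom_intensity = 0.0
    s.override_screen_space_reflection_intensity = True
    s.screen_space_reflection_intensity = 0.0
    volumes[0].set_editor_property("settings", s)
```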
Thanks again for your amazing work and your help with my last post on your forum. I did indeed manage to solve our artifact problem with the HighRes system.
We have a new problem, shown in the image below: a very blurry render of our countryside map with version 4.23 of Unreal.
We do not have this problem when we render the same map in 4.25, or when we play in the editor, in a new window, or in VR with 4.23.
We tried:
All your new methods for a quick render (Movie Render, HighRes, Sequencer)
Changing the parameters of each method at each step
Eliminating possible origins of artifacts: post process, light shafts, vignette, etc.
Changing the blueprint parameters and links in the CameraRec 360 blueprint to see if there is a connection
Setting execute commands inside blueprints to force post process, anti-aliasing, etc.
Do you have any clue how to fix it?
Thanks again for your help.
Hi, **HeaMind!**
Thank you for your feedback!
Send me an email at Lenina62-ivan@mail.ru; I can connect to you remotely and check what the problem might be.
I think it's the Depth of Field you have set in the Post Process. You need to set everything to 0.