Most hilarious post in recent memory XD
Have to second that. Blakblt has earned a cookie.
I made all assets in Blender, tomorrow I will import them into UE4 and set up the scene. I will post a download link here.
Man! Will push the commit today, with the tweak options. Also, reflections from unlit stuff are making a comeback; I had them disabled for a while, will re-enable them now.
EDIT: The tweak options are already there, check them out. The reflection thing might take a bit longer, but will probably be tomorrow.
New toy to play with.
Hey people, got some screens for those wanting an AHR vs VXGI comparison.
Well, took the new VXGI build for a test, loaded the sci-fi hallway scene, and while it looked quite good, I was surprised by the performance, or the lack of it really.
Here are some screens of that scene in VXGI and AHR:
AHR
http://i.imgur.com/nH9N358.png
VXGI
http://i.imgur.com/weMfwJ3.png
I do recognize that VXGI looks better here, but keep in mind that this is quite a stressful scene for AHR quality, as the normal maps are mostly flat, and I count on them to break up the noise. I’m thinking of doing something like micro-variations of the normal based on roughness. We are using a microfacets model after all…
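Something along these lines is what I have in mind, just a rough C++ sketch of the idea rather than actual AHR code (the function names and the 0.1 strength factor are made up for illustration):

```cpp
// Hypothetical sketch: jitter the shading normal with a small pseudo-random
// offset whose amplitude scales with roughness, so even flat normal maps
// help break up the ray-traced noise.
#include <cmath>
#include <cstdint>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Cheap per-pixel integer hash, stands in for a noise texture fetch.
static float hash(uint32_t n)
{
    n = (n << 13u) ^ n;
    n = n * (n * n * 15731u + 789221u) + 1376312589u;
    return ((n & 0x7fffffffu) / 2147483647.0f) * 2.0f - 1.0f; // [-1, 1]
}

// Perturb the normal by an amount proportional to roughness.
Vec3 MicroVariateNormal(Vec3 n, float roughness, uint32_t pixelIndex)
{
    const float kScale = 0.1f;      // assumed tweakable strength
    float amp = kScale * roughness; // rougher surface -> larger jitter
    Vec3 jitter = {
        hash(pixelIndex * 3u + 0u) * amp,
        hash(pixelIndex * 3u + 1u) * amp,
        hash(pixelIndex * 3u + 2u) * amp
    };
    return normalize({ n.x + jitter.x, n.y + jitter.y, n.z + jitter.z });
}
```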
Really need to work on improving the bilateral blur, it looks quite bad, too much bleeding. Not sure why I can’t get it to work properly, that’s quite a standard thing…
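For reference, the kind of depth-aware weighting I mean, as a minimal 1D sketch (the radius and sigma values are just placeholders, not what AHR actually uses):

```cpp
// Minimal 1D depth-aware bilateral blur over a scanline of GI values.
// Rejecting samples whose depth differs too much from the center pixel is
// what limits bleeding across geometry edges.
#include <cmath>
#include <vector>

std::vector<float> BilateralBlur1D(const std::vector<float>& gi,
                                   const std::vector<float>& depth,
                                   int radius = 4,
                                   float sigmaSpatial = 2.0f,
                                   float sigmaDepth = 0.05f)
{
    std::vector<float> out(gi.size(), 0.0f);
    for (int i = 0; i < static_cast<int>(gi.size()); ++i)
    {
        float sum = 0.0f, weightSum = 0.0f;
        for (int k = -radius; k <= radius; ++k)
        {
            int j = i + k;
            if (j < 0 || j >= static_cast<int>(gi.size()))
                continue;

            // Spatial falloff.
            float ws = std::exp(-(k * k) / (2.0f * sigmaSpatial * sigmaSpatial));
            // Depth falloff: big depth differences get near-zero weight,
            // which is what stops light bleeding across discontinuities.
            float dd = depth[j] - depth[i];
            float wd = std::exp(-(dd * dd) / (2.0f * sigmaDepth * sigmaDepth));

            float w = ws * wd;
            sum += gi[j] * w;
            weightSum += w;
        }
        out[i] = weightSum > 0.0f ? sum / weightSum : gi[i];
    }
    return out;
}
```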
Anyway, got some things in the pipeline to improve quality and performance. One of them is something I have tried before, and it worked quite well but had some problems I’m yet to solve, and that is progressive refinement. In short, changing the ray kernel every frame, and if the pixel hasn’t moved, reusing the previous results. That has the direct effect of increasing the ray count. With a few frames, you can increase it quite a bit and still have a dynamic scene, but there are some subtleties that I need to address.
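Roughly like this, as a hypothetical sketch of just the accumulation part (the depth-based “has the pixel moved” test and the threshold are simplified placeholders, not the actual AHR code):

```cpp
// Progressive refinement as described: each frame uses a different ray
// kernel, and if a pixel hasn't moved, its new result is averaged into the
// history, which effectively multiplies the ray count over time.
#include <cmath>

struct PixelHistory
{
    float accumulatedGI = 0.0f; // running average of GI for this pixel
    float prevDepth     = 0.0f;
    int   sampleCount   = 0;
};

float AccumulateGI(PixelHistory& history, float newGI, float currentDepth,
                   bool cameraCut, float depthTolerance = 0.01f)
{
    bool pixelMoved = cameraCut ||
                      std::fabs(currentDepth - history.prevDepth) > depthTolerance;

    if (pixelMoved)
    {
        // History is invalid: restart accumulation from this frame's result.
        history.accumulatedGI = newGI;
        history.sampleCount   = 1;
    }
    else
    {
        // Pixel is stable: blend the new sample into the running average.
        history.sampleCount++;
        float alpha = 1.0f / static_cast<float>(history.sampleCount);
        history.accumulatedGI += (newGI - history.accumulatedGI) * alpha;
    }

    history.prevDepth = currentDepth;
    return history.accumulatedGI;
}
```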
The other thing is something I like to call “Weighted Bicubic Push-Pull”. It’s the good ol’ push-pull upsampling, but modified so that it uses bicubic filtering on the push phase and takes into account depth discontinuities. The idea with that is to not compute some of the pixels on the screen, and then use that filter to fill the empty spaces. That idea is something of a long shot compared to the others, but we’ll see.
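To give an idea of the structure, here’s a heavily simplified 1D sketch of plain push-pull hole filling, without the bicubic filter or the depth term, just to show the push and pull passes (all names made up, not the AHR implementation):

```cpp
// Push-pull hole filling: a "push" pass averages only the valid samples into
// progressively coarser levels; a "pull" pass walks back to the finest level,
// filling each hole from the (already filled) coarser level above it.
#include <vector>

struct Level { std::vector<float> value; std::vector<float> weight; };

// valid[i] is 1 for pixels that were actually computed, 0 for holes.
std::vector<float> PushPullFill(std::vector<float> values,
                                std::vector<float> valid)
{
    // Push phase.
    std::vector<Level> levels;
    levels.push_back({ values, valid });
    while (levels.back().value.size() > 1)
    {
        const Level& fine = levels.back();
        Level coarse;
        for (size_t i = 0; i < fine.value.size(); i += 2)
        {
            float v = 0.0f, w = 0.0f;
            for (size_t j = i; j < i + 2 && j < fine.value.size(); ++j)
            {
                v += fine.value[j] * fine.weight[j];
                w += fine.weight[j];
            }
            coarse.value.push_back(w > 0.0f ? v / w : 0.0f);
            coarse.weight.push_back(w > 0.0f ? 1.0f : 0.0f);
        }
        levels.push_back(coarse);
    }

    // Pull phase: fill holes from the coarser levels, coarse to fine.
    for (size_t l = levels.size() - 1; l-- > 0; )
    {
        Level& fine = levels[l];
        const Level& coarse = levels[l + 1];
        for (size_t i = 0; i < fine.value.size(); ++i)
        {
            if (fine.weight[i] <= 0.0f) // hole: take the coarse estimate
                fine.value[i] = coarse.value[i / 2];
        }
    }
    return levels[0].value;
}
```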
You are just beginning to optimize AHR and it is already considerably faster than VXGI :eek:.
just doing my best
I am still working on my scene. I can post screenshots from Blender if you want, it is not the best.
That would be cool, want to see what you’re doing. BTW, Blender rocks \m/
Looking really good! As a tip, AHR produces better results for rough surfaces, and also try to use normal maps, will help break the noise.
Thank you!
You should read The Tomorrow Children research paper now that you are focusing on optimizations, since theirs are already tested and may clash with yours if you try to implement them too late.
http://fumufumu.q-games.com/archives/TheTechnologyOfTomorrowsChildrenFinal.pdf
I get 79 fps in that scene with VXGI on my GTX 980. If your card isn’t a 9xx, that’s why the performance is bad.
Sure, my card is a budget one (GTX 750 Ti), but consider there’s a 3x difference. Not sure how it will hold up, but a while back when a friend tried a Titan Black with my demo scene (not UE) it ran about 10 times faster. 79 is about 8 times faster than my FPS with VXGI, so it’s likely that ratio will hold up. Not saying it will still be 3 times faster, but it will probably be faster. Could you try it and post results?
I took a look at it a while back, but will see it again now, it has some interesting stuff.
EDIT: Seems to be a newer paper. Even more interesting, nice catch
thank you:)
I just replaced the engine folder and I got a bunch of errors. http://imgur.com/GWvpIMd
I just needed to do this: http://imgur.com/5ZWwSWU