Having hardware ray-tracing support on a card is the biggest development in computer graphics in well over a decade (and that’s not even mentioning the Tensor cores, which open up a world of possibilities in other industries as well). Sadly, it’s very similar to when programmable pixel/fragment shaders came out: no one really got it, and most gamers argued that no one would ever need such a thing, yet today you wouldn’t ship a game without using the programmable pipeline. It’s the same thing here: people are very quick to dismiss the technology without even understanding what it brings to the table, not just for games but for a lot of different industries.
From what NVIDIA have said (NVIDIA Reinvents Computer Graphics with Turing Architecture | NVIDIA Newsroom), a single 2080 Ti is 30 times faster at ray tracing than a typical CPU-based ray-tracing server of the kind used by Hollywood. That is not a small number. To put this into context for something like Lightmass, it would mean cutting our lightmap generation times from around an hour down to around 2 minutes. That is a massive productivity gain for developers, and I really can’t wait for this to get added to Unreal, though I’m not sure if Lightmass RTX support has been confirmed yet. I know several commercial 3D renderers are adding support for it, and Ton from Blender has hinted that it will be coming to Blender’s Cycles renderer as well. So a single graphics card is pretty much going to replace massive servers all over the globe, not to mention putting this kind of power in the hands of the average indie developer.
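A quick back-of-the-envelope check on that, for what it’s worth. The 30× figure is NVIDIA’s own marketing number, and the one-hour baseline is just an illustrative Lightmass build time, so treat both as assumptions:

```python
# Rough estimate of lightmap bake time given a claimed ray-tracing speedup.
# The 30x factor is NVIDIA's claim; the 60-minute baseline is illustrative.
cpu_bake_minutes = 60
claimed_speedup = 30

gpu_bake_minutes = cpu_bake_minutes / claimed_speedup
print(f"Estimated GPU bake time: {gpu_bake_minutes:.0f} minutes")  # prints "Estimated GPU bake time: 2 minutes"
```

Even if the real-world factor ends up being a fraction of that, going from "start a bake and get coffee" to "wait a couple of minutes" changes how often artists can iterate on lighting.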
It will also be really cool to see how RTX support is added to engines as people get familiar with the technology and figure out how to use it properly. It’s still early days, but real-time global illumination is actually turning into a real possibility here. Metro Exodus has already demonstrated it on RTX, and I’d love to see it implemented in Unreal; real-time global illumination always felt like one of the big promises of UE4 that just never ended up happening, and now the hardware is finally here to make it possible.
This WOULD all be true if the technology was actually available to the industry as opposed to being held hostage by a company that’s proven time and time again to be more than willing to ruin competitors and screw over customers for a profit. As long as it’s not open and multiplatform, it’s nothing more than a proof of concept.
The hardware might be available for developers and enthusiasts but it’s not even close to being commercially viable if only one vendor holds the keys.
Sorry, but that’s just not true. There is absolutely no reason this could not be implemented in a generic, cross-platform form and make its way to Linux and run on an AMD card.
Anyone can walk into a shop and pick up an RTX card right now; it’s not a theoretical product that’s only available to developers. It is true that right now only one vendor has implemented the technology, but they’re not ‘holding the keys’ or preventing other companies from implementing ray tracing. They specifically chose to work with Microsoft to make sure there is a generic API in place so that other vendors can implement ray tracing in a way that can be supported by everyone.
It just seems to me like you’re looking for a reason to complain about NVIDIA. There is nothing they’ve done that makes the tech specific to Windows, and they have gone out of their way to make sure that software built today will just work with cards from other vendors when those vendors implement the technology as well. I’m not saying you have to love the company, but right now you’re trying to push a point that has already been proven false.
As for it not being commercially viable: industry support outside of gaming has been huge. Almost every major commercial ray-tracer has already announced support for the technology, and I can’t see why the industry would not adopt it. NVIDIA currently sells a server with 8 of these GPUs which is able to replace 240 CPU-based rack servers. And it’s not just the cost of the computers; it’s the cost of electricity, the cooling required, the space needed to house those servers, and the people needed to manage them. I don’t see how you can think it’s not commercially viable technology.
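Those two numbers are at least internally consistent, by the way. A sketch of the consolidation maths (both figures are NVIDIA’s claims, so hedge accordingly):

```python
# Consistency check: if one RTX GPU is ~30x a CPU render server at ray tracing,
# an 8-GPU box should replace roughly 8 * 30 CPU-based rack servers.
gpus_per_server = 8
claimed_speedup_per_gpu = 30  # NVIDIA's claimed factor vs. one CPU server

cpu_servers_replaced = gpus_per_server * claimed_speedup_per_gpu
print(cpu_servers_replaced)  # prints 240
```

Which matches the 240-server figure from their own server pitch, so they are at least telling the same story in both places.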
I also disagree that NVIDIA holds technology hostage, as you put it. Historically this has not been true. If we look at something like OpenGL, for example: NVIDIA would initially implement a vendor-specific extension before there was agreement on how a technology should work generically; Microsoft would usually come along, implement their own variant, and force vendors to conform to that API; and then, after discussions with everyone in the OpenGL ARB, a generic extension would be created that could be used by everyone. NVIDIA have never prevented a generic extension from being created based on one of their vendor-specific extensions, and the only reason they created vendor-specific extensions in the first place was because it was easy to do and didn’t require agreement from a dozen other companies.
If anything, NVIDIA have been more open this time by not just implementing this in isolation as an extension; instead, they are already working with people like Microsoft to make sure there is a generic API in place from day 1. That means any DXR-based game released on Windows today will just work and support ray tracing when AMD, for example, implement the technology as well. As for other APIs, similar to OpenGL, a generic implementation will be in place once everyone agrees on how it should be implemented.
Can you be more specific? Also, feel free to drop your comments and suggestions in the Documentation Feedback section of the forum. We regularly monitor those posts and respond.
I’m sure it will come eventually, but Microsoft will be focusing on DirectX, which is Windows and Xbox only.
Being able to pick one up != widely adopted. Only a very small % of the enthusiast market will have access to RTX right now. It will be decades before you can claim that every gaming rig on the market has RTX support, the way every gaming rig today supports programmable shaders.
As if I need to look for a reason. Their partner program is all the reason I will ever need.
NVIDIA has GameWorks, a closed-source technology available only on PC for the most part, and even on PC it has shown bias toward their own hardware (looking at you, Witcher 3 HairWorks fiasco). Where is their equivalent of GPUOpen?
In the end, it’s important to note that we are still in the very early days of hardware ray tracing. I don’t believe it will take decades, though; maybe three years or so.
Yeah, well, I can’t really argue there; GameWorks is a complete ■■■■. But RTX is something completely different and unrelated. NVIDIA have, from what I’ve seen, been quite open to other companies implementing hardware ray tracing. And while DXR is Windows-only, I believe having a generic implementation for using ray tracing in games from day 1 is a major step in the right direction; that isn’t something we’ve ever had in the past. As for an open API, it’s only a matter of time until there is some sort of equivalent to DXR.
Of course, there’s a lot more to those RT cores than DXR and mixing ray tracing with rasterisation. So I agree that right now it’s hard to say if there will be an open equivalent to NVIDIA OptiX (which allows you to use the RT cores for general purpose ray-tracing for things like Hollywood productions and maybe Unreal Lightmass). Ultimately though, it probably wouldn’t matter as there are only a few renderers available, and they can add vendor-specific support as required in a worst case scenario.
Anyway, let’s just give it time; they have a promising start going, so let’s see how it evolves.
When Epic presented the Star Wars demo half a year ago, many people were saying things like: “What’s the point of it? You need incredibly expensive hardware to run such a demo. It will take years before this demo can be rendered in real time at home.”
It’s already possible for about $1000. Of course “a very small % of the enthusiast market will have access to RTX now”; it just got released. My friend, an enthusiast, got his RTX card yesterday, although he ordered it weeks ago. Nothing wrong or odd with that; this is the natural lifecycle of technology.