Nvidia GameWorks in UE4

So I’ve just read this article and noticed the note at the end about UE4 implementing GameWorks. Is this something we should worry about after reading this article?

Quick excerpt from the article

UE4 Relevance

Read the rest here

Just a dev in the early stages of a build wondering what to make of this.

… and this is why I don’t buy nVidia hardware. Sure seems like some Apple-level scumbaggery going on…

I know Epic has a “custom” software solution for PhysX in UE4 so that it runs on the CPU instead of the GPU; that way everybody gets the benefit of it, not only NVIDIA users. I’m almost sure they’ll provide the same kind of solution for any other GameWorks tech they decide to include, and I hope they do include some of the amazing tech (APEX, GI, FaceWorks, FlameWorks, etc.).

I’ve been an ATi fanboi for more than 10 years, and rest assured that the only thing crippling ATi cards’ performance by 40% or more is their crappy drivers.

That’s not something Epic added; if you don’t have an Nvidia GPU, PhysX defaults to running on the CPU in all games.

I don’t see any concerns.
Consoles run on AMD hardware. Most AAA games are made with consoles in mind. UE4 (whatever you may think of it) is still an engine for AAA-grade game development, which means it will be widely used on consoles, and consoles will remain the lowest common denominator for fancy graphical/physics features, even if more advanced options become available right after you install the tools.

Besides that, if you don’t want to use GameWorks, no one is forcing you. But it’s better to have a pre-integrated and maintained option than to integrate it yourself and then maintain it on your own.

I for one would gladly see ShadowWorks integrated into the engine.

I just wanted to clear up a few things!

UE4 does not have Gameworks “built into its core”. We do use PhysX at the core of UE4, but that is a cross-platform, CPU-focused, rigid-body and collision engine, and we work closely with NVIDIA to ensure it runs well on all platforms we port to. Currently in UE4 NO PhysX or APEX feature uses the GPU.
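
For anyone curious what “CPU-focused” looks like in practice, here is a minimal sketch against the plain PhysX 3.x SDK (not UE4 source; UE4 wraps all of this behind its own physics interface). The scene is driven entirely by a CPU dispatcher, and nothing touches the GPU unless you explicitly opt in:

```cpp
// Minimal PhysX 3.x scene setup, simulated entirely on the CPU.
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // CPU worker threads
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    // No GPU dispatcher is set, so rigid bodies and collision stay on the CPU.

    PxScene* scene = physics->createScene(sceneDesc);

    scene->simulate(1.0f / 60.0f); // step the simulation on the CPU workers
    scene->fetchResults(true);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```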

I’m very excited about some of the simulations NVIDIA have shown on a GPU, and I’d love to make them standard features in the engine. We have a great relationship with NVIDIA going back many years, and we are talking to them about a way to bring that tech to UE4 in a cross-platform way. We have no plans to implement them until we come up with a solution to that problem though.

James, seriously brudda, check out Erwin Coumans’ experiments with the Bullet 3 physics engine’s GPU code (it’s a work in progress, but OpenCL-driven GPU collision detection and rigid bodies are working as I speak). Official repo: GitHub - bulletphysics/bullet3: Bullet Physics SDK: real-time collision detection and multi-physics simulation for VR, games, visual effects, robotics, machine learning etc. Erwin’s fork: GitHub - erwincoumans/bullet3: Bullet Physics SDK: real-time collision detection and multi-physics simulation for VR, games, visual effects, robotics, machine learning etc.
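
The reason Bullet’s route is interesting is that OpenCL is vendor-neutral. As a minimal sketch (assuming an OpenCL SDK with the usual CL/cl.h header), this lists whichever GPUs are present, NVIDIA, AMD, or Intel alike:

```cpp
// List every GPU that exposes an OpenCL implementation, regardless of vendor.
#include <CL/cl.h>
#include <cstdio>

int main()
{
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(8, platforms, &numPlatforms);

    for (cl_uint p = 0; p < numPlatforms; ++p)
    {
        cl_device_id devices[8];
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &numDevices) != CL_SUCCESS)
            continue; // this platform has no GPU devices

        for (cl_uint d = 0; d < numDevices; ++d)
        {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            std::printf("OpenCL GPU: %s\n", name); // any vendor shows up here
        }
    }
    return 0;
}
```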

I said it before and I’ll say it again: you’re absolutely right not to include GPU-accelerated PhysX in UE4. It would be a massive disadvantage for any AMD/Intel GPU owners and would just create a barrier in developers’ and end users’ minds; it would look like Epic siding with Nvidia on performance at the expense of the other 70% of the market, which would be suicide.

One way Nvidia could solve it is to stop trying to push software that only works with its proprietary hardware, which only makes devs and end users of anything other than Nvidia kit ****** off. Maybe they should do what Intel and AMD do and share the code for developments that work on all GPUs, and stick to what they should be doing anyway: MAKING good hardware that people want to use, not hardware they have to use because the manufacturer is trying to lock devs into expensive licensing and end users into a “if you’re not Nvidia, get stuffed” situation. Ever wondered why the Linux community hates Nvidia with a passion?

This is fantastic to hear! Thanks for taking the time to respond James!

I have tested both AMD and Nvidia cards and seen no real difference with the PhysX used in UE4; all the effects look similar to me. The only difference I’ve noticed is that Nvidia cards perform better than AMD cards, that’s all. I’m running an ATi card, btw xD

I don’t see any problem in adding GPU acceleration for PhysX. The problem would be if you had to have an Nvidia GPU to use PhysX at all. I think that’s the point of whether or not they add GameWorks support to UE4: if it’s added, then it has to work on both AMD and Nvidia graphics cards.
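
As a sketch of what that pattern could look like (RunGpuSolver and RunCpuSolver are hypothetical placeholders, not UE4 or PhysX API; the CUDA runtime is used only as the probe): detect a CUDA-capable device at startup and take the CPU path everywhere else, so the same feature still runs on both vendors’ cards.

```cpp
// Hedged sketch of the fallback pattern discussed above.
#include <cuda_runtime.h>
#include <cstdio>

void RunGpuSolver() { std::puts("physics on the GPU"); } // placeholder
void RunCpuSolver() { std::puts("physics on the CPU"); } // placeholder

int main()
{
    int deviceCount = 0;
    cudaError_t err = cudaGetDeviceCount(&deviceCount);

    // On a machine without an NVIDIA GPU/driver this call fails, which is
    // exactly the signal to take the vendor-neutral CPU path instead.
    if (err == cudaSuccess && deviceCount > 0)
        RunGpuSolver();
    else
        RunCpuSolver();
    return 0;
}
```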

Thanks for the clarification, James! Does it mean that some specific Nvidia techs (I’m thinking this way because they appear in very few games out there) like HBAO or advanced soft (contact-hardening) shadows won’t make it into UE4.x builds?
Correct me if I’m wrong, but speaking of UE3, I thought these ‘features’ were licensed to certain devs (like WB), which is why all the advanced Nvidia tech appeared only in those games and not in the free UDK.

Well, maybe you know something I don’t, but your argument is just a way of saying people are lucky PhysX runs on the CPU at all. PhysX was slated for years because its so-called CPU support never worked properly and performance was rubbish. Did Nvidia ever really put effort into solving that? Like… did they.

I still, for the love of… don’t understand what Nvidia’s plan is. They keep pushing more and more code that only works on Nvidia hardware, which annoys everyone except the Nvidia sycophants, in a market that only has two players (three if you call Intel a player, but Intel owns Nvidia anyway, so you get the point).

If Nvidia’s tactics ever did come to fruition and AMD got pushed out of the GPU market, it would cost Nvidia/Intel even more in lawsuits and fines from competition regulators for market fixing and running a monopoly. Why do you think Intel GAVE AMD the CPU market share they have now? They could take 99.9% of the market any time they wanted (it’s on record that the only reason AMD has any CPU market share is because Intel let them have that tiny fraction to avoid monopoly/competition law). Then we have AMD, who can’t build drivers to save their lives, let down their customer base, and don’t really care because they’re too busy trying to stay alive; that doesn’t work in the consumer’s interest either.

To be honest, I’m ****** off with the whole GPU segment. For the love of god, if there was ever a need for a new tech company in this market, it’s now! I didn’t buy the Raspberry Pi for its performance but for its ethos and fun. There’s got to be some tech firm in the UK or Silicon Valley that says “let’s take this screwed-up market on!” It’s now 5.23 in the morning after a good night of beers and grub. Rant over.

For now!

Love to all Indie Devs!!!

No, the argument is that PhysX runs on the CPU, and if you have an Nvidia GPU then you can run it even better. There are no bugs or anything as far as CPU support goes; it just doesn’t perform as well as it does on the GPU.

And you’re confused about what Nvidia’s plan is? What’s hard to get about them wanting to create incentives for people to buy their graphics cards?
There’s nothing illegal about AMD going out of business because of it; if Nvidia makes a superior product and people choose to buy it, then that’s how the world works.
Remember back when 64-bit and dual-core processors first came out? Intel was doing horribly, and AMD had the best processors available. What happened after that? When it was time to develop new processors, Intel created a much better product, and they’ve been ahead in that regard for a while.

AMD clearly does care; in the last year or so their graphics cards have improved greatly and are competing much better with Nvidia’s, and they’ve got processors in the next-gen game consoles now.

Also, Intel does not own Nvidia. They’ve been direct competitors ever since Intel released its Sandy Bridge processors, which eliminated the need for dedicated GPUs in many machines, and they compete against each other in the mobile market as well.

Like James said, all particles and physics run on the CPU by default. By default, only rendering happens on the GPU.

GameWorks is an amazing thing by Nvidia; if you get a license and it’s fully implemented, it will be really amazing.

Here’s how I see it… Nvidia is actually offering something great to the gaming industry by providing excellent “next-gen” graphical (and some gameplay) features that work out of the box… which is amazing for many reasons, especially for an indie developer that couldn’t possibly develop that tech within a reasonable time frame.

Now I know that it would be awesome if Nvidia just released their gameworks source code to everyone and let AMD optimize/use it as they like but you have to understand that it is THEIR technology… I think it’s fair that they want to make money out of it by making it more exclusive and optimized for their hardware… I mean you can’t just expect them to gift their technology just like that… we can’t expect every company to be as open and awesome as Epic you know! (although I have the feeling this will eventually happen with Nvidia somehow).

Also, if a game developer wants to favor Nvidia by using their technology, even if that means AMD users get potentially lower performance… it’s entirely their decision! Besides, I don’t see AMD trying to offer anything like that… with the exception of Mantle, but that’s just a performance boost and comes with its own set of disadvantages (almost no engine supports it currently, etc.).

I don’t know about you, but I would actually rather have a game that has graphical advancements but runs a bit slower on AMD cards than have no graphical advancements at all.

And lastly… from all the benchmarks I have seen (take Watch Dogs, for example), AMD isn’t even that far behind Nvidia in performance in the first place… so yeah… I don’t even know what to say :confused: they are exaggerating a little bit.

I am sorry, but there is a reason that AMD has the market share it has in the enthusiast/professional GPU space and the CPU space. AMD’s market share in the GPU space is growing now because it has a product lineup worth buying.

You really need to research the past to understand the present, especially ATI before they were acquired by AMD vs. NVIDIA. ATI had superior cards (in some ways) in the enthusiast market, albeit with some pretty shoddy driver implementations, and NVIDIA was the go-to company in the professional space (they still are).

When you bring up AMD vs. NVIDIA or AMD vs. Intel, you simply cannot put Nvidia in the big, bad, evil, closed-off category without also placing blame squarely at the feet of AMD. Do yourself a favor and research AMD’s Hammer CPU architecture and how AMD rested on its laurels for three years, which led to them losing the war in the CPU space.

AMD needed to buy ATI in 2006 not just to branch off into the GPU space, but to survive as a company.

Skip over GameWorks as a package and just discuss PhysX. Why should NVIDIA be forced to cross-license the technology they rightfully acquired by purchasing Ageia so that it can run on AMD cards? They are already generous enough to let it run CPU-bound.

You know, I find it very interesting that AMD took the approach it did with TressFX, and speaking as an armchair quarterback, I believe opening it up as they did was a bad business decision. They should have kept it to themselves and continued to grow it from a real-time hair simulator into an all-around PhysX alternative. They have done some really cool things with 2.0, but where is it in the wild?

Hey, you know what, where is Mantle, by the way? Has anyone besides select development houses seen and worked with it? No, because it’s behind a beta/NDA wall. So how long does AMD think they have before the DirectX 12 API goes from vaporware to reality? That’s not even a question anymore, though, because we know that AMD is working closely with Microsoft on Mantle->DirectX 12. Since DX12 benefits the entire GPU space and not just AMD, how exactly is this a good business decision for AMD? Both current-generation consoles are built around an AMD APU architecture, and you can bet Sony is closely watching what is going on with the MS/AMD relationship.

It may seem off-topic in regard to your post, but I think too many people doubt Microsoft’s willingness to see the Xbox One through to the bitter end, and AMD has zero choice but to work extremely closely with Microsoft; consoles aside, AMD doesn’t survive in the desktop space without them. So, if Mantle->DX12 is almost a straight shot, as AMD would have us believe, and NVIDIA reaps the benefits of DX12 just by being there, without being between a rock and a hard place in the console space, who stands to gain the most, exactly?

So, when I read publications that speak of NVIDIA’s proprietary technology being a direct threat to a competitor, and how that’s “not fair!”, I am amazed by the lack of discussion of the fact that AMD is in the position it is because of a string of bad business decisions.

Personally, as someone who does a lot of scientific computing, I think Nvidia develops all these oddball GPGPU APIs/SDKs/etc. simply because they have to in order to stay competitive.

AMD vs. Nvidia GPGPU performance isn’t even a question; the AMD architecture is nearly an order of magnitude better on raw performance. But everyone only cares about things like GameWorks, CUDA, etc. Nvidia knows there is a massive shortage of GPU and parallel-systems designers/programmers, so their competitive edge is making the tools so easy to use that people outside that space can get their feet wet.

OpenCL (or Mantle?) + AMD is a PITA, but it certainly is where the performance is, again speaking of GPGPU. If Nvidia did not have excellent tooling, they wouldn’t even be competitive in this space.
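
To make the “easy tooling” point concrete, this is roughly the entire CUDA C++ program needed to add two vectors on the GPU (a sketch, nothing to do with UE4; it assumes CUDA 6+ for cudaMallocManaged):

```cpp
// Complete CUDA C++ program: add two vectors on the GPU.
#include <cstdio>

__global__ void vectorAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x; // one thread per element
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes); // unified memory: no manual host/device copies
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    std::printf("c[0] = %f\n", c[0]); // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The equivalent OpenCL program needs explicit platform, device, context, queue, program, and kernel setup before any math runs, and ends up several times longer, which is exactly the barrier to entry being described.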

Sorry for bringing this thread back up, but it’s quite important, as the performance difference between the two hardware vendors can be quite substantial (~35%) even with GPUs that are otherwise comparable performance-wise. With Unreal Tournament in development, it’d be good if there weren’t some huge artificial performance difference due to what are basically proprietary steps taken by Nvidia, i.e., if things were more vendor-neutral.

Of course, if UE4 supported Mantle the problem would be solved.

Have you read the rest of the thread?