How do you feel about NVIDIA Gameworks?

Glad to hear that.

Yeah, UE4 only uses the CPU side of PhysX, even though the entire physics system runs on it. It’s impossible to avoid if you’re using UE4, and honestly that’s not necessarily a bad thing. It’s really good tech, and it has some pretty big advantages over the competing solutions.

As for the rest of GameWorks, yeah, they’re performance-heavy effects, but things like VXGI are, I think, where we’re eventually headed. Even CryEngine just added some pretty incredible voxel lighting in 3.8, and comparing the two on my computer, they both get about the same performance in some scenes. It’s weird, though: whatever Crytek is doing for theirs has a really light performance hit in the forest demo level, which you would think would be computationally harder, yet the Baron Haussman scene that Synce made has really bad performance for the size of the scene.

In any case though, yes I do believe the effects have their place, performance just needs to be addressed before they become standard. If they manage to make Gameworks get the kind of performance boost that we saw from PhysX 2 to PhysX 3, I can easily see them becoming commonplace, regardless of video card vendor.

As a gamer I never had issues with GameWorks. The performance hits always felt acceptable to me. PCs are more powerful than consoles, but they still ain’t supercomputers.

The likelihood is very high, though, that the people who said this are mostly AMD fans who like cheap stuff and don’t understand that R&D and implementation cost tons of money and time.

Simple thought: if Nvidia is so “evil”, then why doesn’t the white knight AMD come up with their own GPU-accelerated implementations? Because AMD can’t. They don’t have the money and the smarts! That’s the whole reason AMD stuff is a bit cheaper than Nvidia: Nvidia doesn’t just charge for the GPU hardware, it’s also for good drivers and APIs. I would be willing to shell out an extra 200 gold coins for G-Sync, so I could run games in windowed fullscreen and ditch exclusive fullscreen once and for all, instead of buying that half-a$$ed thing called FreeSync that may never support windowed fullscreen.

Bottom line, and this is the part AMD fangirls don’t like to hear: AMD ain’t so great, and Nvidia ain’t so overpriced either. Nvidia isn’t Razer (I’m a Razer user :D)! Even if all of GameWorks went cross-platform and supported AMD GPUs as well, most credit would go to Nvidia, since the best AMD can do is make their APIs open-source so some other !d!ot implements stuff for them.

Actually, since I just mentioned that Nvidia users pay a bit more than AMD users, I feel inclined to say that there shouldn’t be AMD hardware support for GameWorks too soon, because AMD fangirls didn’t contribute to it. If I didn’t know that a GPU/API monopoly has a dark side, I would also love to see AMD annihilated by Nvidia. And with the upcoming GameWorks features like Flex, FlameWorks, and VXGI, Nvidia could put extreme pressure on AMD with their Pascal GPUs.

I do like to quote people! :smiley:

Many gamers don’t have a big game budget and choose AMD cards, so I would make a game that can run on their hardware without using GameWorks.
It’s like Android and Apple phones: you can make games that run on Apple only, or make games for both systems.

I fear, depending on the game, that it may not be that simple if one used Flex, VXGI, and FlameWorks in the same project. The Batman Arkham series had quite a simple implementation of GameWorks, and that’s why it could just be turned off. It was only very simple decoration.

That’s why you need to make sure you have an alternative if those features need to be turned off. Like the hair in The Witcher 3.

Though I would imagine it’s kind of a pain to do the work to integrate features like that only to have to do it in another completely different way for systems that can’t handle the feature.

But Flex used in physics-based games wouldn’t work, it seems. Flex can do things normal PhysX can’t: it unifies all physics features under one hood so that they can work together. So a physics-based puzzle game, or another game type that makes heavier use of Flex, couldn’t get an alternative implementation at all. To my knowledge, there isn’t even another API that advanced.

How dependent is GameWorks on Nvidia hardware in the first place?

Does anybody use GameWorks for UE4 from the private GitHub? The most interesting modules are FlameWorks, VXGI, and Flex. It seems they are still in beta, though; I’m talking about the non-UE4 version on Nvidia’s website.

If your game depends on that feature, then that’s something you have to take into consideration; it’s probably a bad choice to make your game depend on a high-end feature.

As far as I know, right now only VXGI, Flex, and HBAO are available for UE4. I’d like to see Flameworks though.

These are not just high-end features, they’re Nvidia-only features. AMD players won’t be able to play your game, and you will shrink your game’s market because of your hardware choice instead of using a portable solution. The choice is entirely on the developer’s side.

I like it.
It fits perfectly within my target group.
Mac only with NVIDIA GFX; the small remainder can be ignored!

Watch this; this is why NVIDIA sucks.

They know AMD cards are very bad at tessellation, so they put in a lot of useless polygons, like in Crysis 3, which makes AMD cards look very bad in any benchmark that has NVIDIA technology inside.
If you guys don’t know, there is an AMD hair physics technology called TressFX, which is open-source and compatible with both cards.
And there’s also NVIDIA G-Sync: you have to buy a G-Sync chip to make it work with NVIDIA cards, while AMD just asks TV makers to add FreeSync to their firmware to make their TVs compatible.

You call this **** video proof that Nvidia is conspiring with Crytek against AMD? How about considering that maybe it’s just Crytek that doesn’t like AMD? Or that Crytek was just lazy?

And this chip can do better than FreeSync. It supports variable refresh rates even at very low FPS, and it can run in borderless fullscreen, too.

We pay for Nvidia to deliver, and Nvidia delivers. If Nvidia is a ****, then AMD is a pu$$. Nvidia never asked AMD to do anything open source whatsoever. AMD decided to do so because they know their tech is worth less, yet they wanted to make it more popular, so they made it open source. That’s why you pay less — not because AMD is your white knight, Robin Hood, or your best friend ever.

True, tessellation was first used in AMD 3D cards.
The best thing would be unified physics available on both AMD and Nvidia 3D cards.

Yeah, as a game dev I would say the same. And PhysX runs on AMD computers, too, since it actually doesn’t even use the GPU there. So PhysX can run on the CPU as well, and I guess Flex could do the same to some extent. But hell, I tried Flex, and some features even killed my GTX 970!

Nvidia is currently promoting their Pascal chips for self-driving cars and AI processing. It ain’t just a coincidence that GPUs are useful for more than graphics: this heavily parallel processing can do more than pixels. What I’m saying is that the current solution of making an API like PhysX work on multiple platforms by offloading to the CPU won’t work for future, more advanced APIs! There seems to be no way around hardware acceleration in the future. And it’s not simple to make an API like FlameWorks, VXGI, or Flex support both CUDA and OpenCL either.

The problem is this: there is no alternative to Flex right now. I don’t know much about Havok, except that Micro$oft bought them from Intel. Then again, Flex is still in beta and not really released yet. Maybe by then Havok will have come up with a unified particle-based physics engine, too? But that would just be the physics, meaning the capabilities of FlameWorks and VXGI would still be GPU-locked. And only huge triple-A studios could afford to develop their own versions of such APIs.