I try to update it regularly, and anybody who wants to be added can send me a tweet and/or follow. (If I see “vfx artist” or “tech artist” in your Twitter bio, I’ll generally add you to it.)
First off, feel free to search for a YouTube GDC series called “Killer Portfolio or Portfolio Killer”. It’s a panel of pros (I have done almost ten of them myself, go watch me ramble), and it covers all kinds of art portfolios, including vfx.
The basic idea for a portfolio though is:
Take as many things as you think you need, halve them, then delete two. You don’t need tons of work in a junior portfolio to make a good impression, and your portfolio is only as strong as your weakest effect. I have seen juniors get hired with four amazing fx in their portfolio, none of them weak.
Show your fx in context. FX on a middle-grey background is incredibly common, and unfortunately it does your work a disservice. Directors need to see fx work in context so they can see it functioning on dark, light, and midtone backgrounds, from the correct camera angle, and so on. This proves you can communicate gameplay intent in all conditions. It’s also good to state the metrics you were designing against in your portfolio and how you met them. For example: this is a grenade explosion with a damage radius of 384 units and knockdown from 384 to 512. It needs to work in all contexts and even reacts differently to surface types if detonated in the air, on water, or on snow. Here is the damage radius and how I showed it; here is the knockdown radius. That kind of thing. Invent a gameplay problem, then solve it with fx.
Performance is your superpower for getting hired. If you can show that you have made extreme considerations for the holy trinity of beauty/timing, gameplay relevance, and performance, you get the job. Show overdraw views, show material optimizations, use tricks in Niagara to share emitters or renderers, and so on.
Did I mention, have fewer things in there than you think and make them all equally awesome?
The 3D liquid templates include a “Secondary” emitter which adds representations for spray, foam, and bubbles. You can dig into those emitters to see how it’s currently implemented.
The underlying process involves using a ‘Grid3D Collection Reader’ to make the simulation data available from the primary simulation to your particle emitter.
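Conceptually it’s just a shared grid that the secondary emitter samples back out of. As a rough illustration (plain C++ with made-up names, not actual Niagara API), the reader side boils down to trilinear sampling of a grid the primary sim filled in:

```cpp
// Conceptual sketch only: the primary sim fills a dense 3D grid each frame,
// and a secondary particle emitter reads it back with trilinear sampling.
// All names here are illustrative, not actual Niagara API.
#include <algorithm>
#include <vector>

struct Float3 { float X = 0, Y = 0, Z = 0; };

struct VelocityGrid {
    int NX = 0, NY = 0, NZ = 0;     // cell counts
    std::vector<Float3> Cells;      // NX*NY*NZ values written by the primary sim

    const Float3& At(int X, int Y, int Z) const {
        return Cells[(Z * NY + Y) * NX + X];
    }

    // Sample at a unit-space position (each axis in 0..1), trilinear filtered.
    Float3 SampleUnit(Float3 P) const {
        const float FX = std::clamp(P.X, 0.f, 1.f) * (NX - 1);
        const float FY = std::clamp(P.Y, 0.f, 1.f) * (NY - 1);
        const float FZ = std::clamp(P.Z, 0.f, 1.f) * (NZ - 1);
        const int X0 = (int)FX, Y0 = (int)FY, Z0 = (int)FZ;
        const int X1 = std::min(X0 + 1, NX - 1);
        const int Y1 = std::min(Y0 + 1, NY - 1);
        const int Z1 = std::min(Z0 + 1, NZ - 1);
        const float TX = FX - X0, TY = FY - Y0, TZ = FZ - Z0;

        auto Lerp = [](const Float3& A, const Float3& B, float T) {
            return Float3{A.X + (B.X - A.X) * T,
                          A.Y + (B.Y - A.Y) * T,
                          A.Z + (B.Z - A.Z) * T};
        };
        // Interpolate along X, then Y, then Z across the 8 surrounding cells.
        const Float3 C00 = Lerp(At(X0, Y0, Z0), At(X1, Y0, Z0), TX);
        const Float3 C10 = Lerp(At(X0, Y1, Z0), At(X1, Y1, Z0), TX);
        const Float3 C01 = Lerp(At(X0, Y0, Z1), At(X1, Y0, Z1), TX);
        const Float3 C11 = Lerp(At(X0, Y1, Z1), At(X1, Y1, Z1), TX);
        return Lerp(Lerp(C00, C10, TY), Lerp(C01, C11, TY), TZ);
    }
};
```

In the actual templates that sampling happens inside the Grid3D Collection Reader node, so you never write this by hand; the sketch is just the mental model.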
Global budgeting is a tricky one. We’ve had some internal discussions about using simcaches as visual stand-ins or swapping to round-robining updates via timeslicing, but we haven’t devoted much time to it yet. I’ve often seen perf-focused teams discuss having a hard budget and balancing everything relative to that budget. We haven’t gone there yet, and given the variety and complexity of the fx that go into games, it feels really hard to solve without creating a defect report factory. For animations it makes sense, as you have a relatively well-understood cost per skeleton/animation graph. If people have specific ideas, I’d love feedback. That’s a long-winded way of saying: no, not done and dusted, but we’re not quite confident where to push next.
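To make the timeslicing idea concrete, here’s a minimal sketch of round-robin updates (hypothetical names, nothing to do with real Niagara scheduling): each frame a rotating window of systems gets a full update while everything else gets a cheap hold:

```cpp
// Minimal round-robin timeslicing sketch: fully update at most
// MaxUpdatesPerFrame systems per frame, rotating through the list over
// successive frames. Hypothetical illustration, not how Niagara works.
#include <algorithm>
#include <cstddef>
#include <vector>

struct FXSystem {
    void FullUpdate(float DeltaTime) { /* run the real sim step */ }
    void CheapHold(float DeltaTime)  { /* e.g. advance lifetimes only */ }
};

class RoundRobinScheduler {
public:
    explicit RoundRobinScheduler(size_t MaxUpdatesPerFrame)
        : Budget(MaxUpdatesPerFrame) {}

    void Register(FXSystem* System) { Systems.push_back(System); }

    void Tick(float DeltaTime) {
        if (Systems.empty()) return;
        const size_t Count = std::min(Budget, Systems.size());
        // Fully update the systems inside this frame's window...
        for (size_t i = 0; i < Count; ++i) {
            Systems[(Cursor + i) % Systems.size()]->FullUpdate(DeltaTime);
        }
        // ...and give everything else a cheap hold so it doesn't pop.
        for (size_t i = Count; i < Systems.size(); ++i) {
            Systems[(Cursor + i) % Systems.size()]->CheapHold(DeltaTime);
        }
        Cursor = (Cursor + Count) % Systems.size();
    }

private:
    std::vector<FXSystem*> Systems;
    size_t Budget = 1;
    size_t Cursor = 0;
};
```

A real version would also bank the skipped delta time so a system simulates the full elapsed interval when its turn comes back around; that’s part of why it’s not a trivial drop-in.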
It came up in an internal thread today. We want to get the experimental VM revamp shipped before we tackle it, though. Some of the spawning logic around data channels could really use it. We did some contortions within the DataInterface logic to let you filter the incoming data; ideally that would be totally scriptable with for loops.
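The kind of filtering in question, if you imagine it as plain script rather than DataInterface plumbing, is roughly this (hypothetical shapes and field names, not the actual data channel API):

```cpp
// The spawn filtering that would ideally be plain scriptable logic: walk the
// incoming entries, keep the ones that pass a predicate, spawn from those.
// Hypothetical illustration, not the real Niagara data channel interface.
#include <vector>

struct IncomingEntry {
    float Position[3];
    int   TeamId;
    float Magnitude;
};

std::vector<IncomingEntry> FilterForSpawn(const std::vector<IncomingEntry>& Incoming,
                                          int MyTeamId, float MinMagnitude) {
    std::vector<IncomingEntry> Accepted;
    for (const IncomingEntry& Entry : Incoming) {
        // Only react to events from my team that are big enough to matter.
        if (Entry.TeamId == MyTeamId && Entry.Magnitude >= MinMagnitude) {
            Accepted.push_back(Entry);
        }
    }
    return Accepted; // spawn one particle (or burst) per accepted entry
}
```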
Fire is elusive indeed. The problem is manifold, but: what does fire even mean?
Realistic or not? Huge and rolling in the distance, black inky smoke blotting out the sky, or torches lining a hallway? Is it statically placed, or can I pick up something burning and light something else? Should it propagate? Is it stylized and anime-inspired, like Calcifer from Howl’s Moving Castle, or is it the Balrog? Is it parented to a dynamic skeleton and volumetrically simmed in 3D like a Final Fantasy boss fight, or is this on mobile, where it needs zero overdraw, no memory impact, and to be functionally free?
See what I did there? I completely frustrated you by questioning the entire premise of your question. As Goethe said at the end of Faust, Das Ewig-Weibliche zei…
Ah, nevermind. Good luck on your quest for enlightenment. Maybe try a flipbook.
I know effect types would carry most of this already, but I’d personally like to see a lot of the options available in said effect type as Niagara scripts (mainly because I still suck at making them).
IOW, easily selectable scripts that allow me to tweak value X at distance Y from the camera,
or reduce value X when Y instances of the same emitter (system) are being spawned.
Additional things like:
“Only use this emitter if the in-game VFX quality setting is set to Epic,”
or “change value X depending on which quality level/platform it’s being used on,”
or “disable module X when the quality setting/platform is lower than Y.”
I’d personally like way more control over those things for each setting.
If… that makes sense.
We have that already, in maybe a slightly different form. You can write local variables to the parameter map pretty easily with a Set node and pull them out later with a Get node. If you take a look at the modules Wyeth writes, they look quite linear, with a lot of internal locals to keep things tidy and clean.
I know the Fortnite FX team already does a lot of work like this outside of effect types. Using the renderer/emitter scalability settings in the system’s scalability mode will get you pretty far in the “only do this for VFX quality” range. We don’t have good controls yet for disabling whole modules, but it’s something we could add for sure. Using “Engine.QualityLevel” as an index into an array in a scratch pad might give you some tools, but yeah, ideally it’d be more integrated into the tool than that.
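To illustrate the array-indexing trick, here’s a toy C++ sketch of the idea (the scratch pad version is graph logic driven by Engine.QualityLevel; the level ordering and table values here are made up):

```cpp
// Toy sketch of "quality level as array index": pick a per-quality value from
// a table. Plain C++ for illustration; in Niagara this would live in a
// scratch pad fed by Engine.QualityLevel.
#include <algorithm>
#include <array>

// One entry per quality level: Low, Medium, High, Epic (assumed ordering).
constexpr std::array<int, 4> SpawnRateByQuality = {50, 150, 400, 1000};

int GetSpawnRate(int QualityLevel) {
    // Clamp so an out-of-range level falls back to the nearest valid entry.
    const int Index = std::clamp(QualityLevel, 0, (int)SpawnRateByQuality.size() - 1);
    return SpawnRateByQuality[Index];
}
```

The same clamp-then-index pattern works for any per-quality knob, not just spawn rate.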
Thank you everyone for participating in today’s AUA, and a huge thank you to our team for joining us to chat!
This chat is going to be closed but will remain up so that all the information is available whenever you need it.