Emit particles over UMG button?

So in our game I have a UMG menu that appears with buttons. Whenever the player touches a button I would like to emit some particles for feedback. How can this be achieved? I know UMG is supposed to render on top of all 3D, but I was hoping there was a way to do this if the particle material was set up for UI as well. If this can work, I need to figure out the location of the touched UMG button in 3D space so I can emit the particles there (unless there is a way to attach them to a UMG component).

If this is not possible, what would be a workable alternative?

I can see many other uses for this as well. For instance, it is typical to show a score screen after the player finishes a level, and particles would also be wanted on top of a high score or a new level unlock.

Thank you for your time,

  • Jeff

Well, you could create a new widget and spawn it at the click position. This widget would have a flipbook material. I don't see any other way right now.
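In C++ it would look roughly like this (a sketch; UHitFXWidget and HitFXWidgetClass are placeholder names for a UserWidget subclass that just plays the flipbook material on an Image):

```cpp
#include "Blueprint/UserWidget.h"

// Rough sketch: spawn a small FX widget at a viewport-space position.
// UHitFXWidget is a hypothetical UUserWidget subclass whose only content is an
// Image driven by the flipbook material; HitFXWidgetClass is assumed to be a
// TSubclassOf<UHitFXWidget> property set on the controller.
void AMyPlayerController::SpawnHitFX(const FVector2D& ViewportPosition)
{
    UHitFXWidget* FXWidget = CreateWidget<UHitFXWidget>(this, HitFXWidgetClass);
    if (FXWidget)
    {
        FXWidget->AddToViewport(/*ZOrder=*/100);                  // draw above the menu
        FXWidget->SetAlignmentInViewport(FVector2D(0.5f, 0.5f));  // center on the point
        FXWidget->SetPositionInViewport(ViewportPosition, /*bRemoveDPIScale=*/false);
        // Let the widget remove itself (RemoveFromParent) once the flipbook finishes.
    }
}
```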

The only other way is if your FX artist renders out a sequence, makes it loopable, and you spawn it on screen or composite it into the “button”.

Isn't this just the same?

I guess so, yeah.

I’ve tried doing everything I could with the built in Particle System in order to get it to render on top of UMG elements but I’ve not been able to figure it out thus far. Is this possible?

If it is not possible, I was trying to create a widget that can dynamically create Images and update them in real time. However, I'm running into an issue here: I'm able to create a variable in the widget, but there is no way to set it to the “Image” type. I suppose I could just drag a fixed number of Images into the designer (up to 20) and do it that way, but I find that pretty nasty…

What else can I try at this point besides the “flipbook” method?

I've heard of a method where you actually render a camera to a material, BUT you won't be able to get a transparent effect that way, which makes it pretty pointless.
(The functions you are missing for an Image component would be “Brush from Texture” and “Brush from Material”.)

Long story short? C++.
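For example, something along these lines works for a handful of moving images (a rough sketch; RootCanvas is assumed to be a CanvasPanel bound in the widget, and the class/function names are placeholders):

```cpp
#include "Components/Image.h"
#include "Components/CanvasPanel.h"
#include "Components/CanvasPanelSlot.h"
#include "Materials/MaterialInstanceDynamic.h"

// Inside a UUserWidget subclass: create an Image at runtime, give it a dynamic
// material instance, and drop it onto a canvas panel at a 2D position.
// "RootCanvas" is assumed to be a UCanvasPanel bound via BindWidget.
UImage* UMyParticleWidget::AddParticleImage(UMaterialInterface* ParticleMaterial,
                                            const FVector2D& Position)
{
    UImage* Image = NewObject<UImage>(this);
    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(ParticleMaterial, this);
    Image->SetBrushFromMaterial(MID);          // SetBrushFromTexture also exists

    if (UCanvasPanelSlot* PanelSlot = RootCanvas->AddChildToCanvas(Image))
    {
        PanelSlot->SetPosition(Position);
        PanelSlot->SetSize(FVector2D(64.f, 64.f));
    }
    return Image;  // keep a reference and move it every tick to fake a particle
}
```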

Make a local-space particle and have it spawn on the player's screen (what we call “draw on clip plane” or “draw near”). Offset it to the location you want and do the math for the offset depending on FOV etc. Then make sure it's only visible to that local player, and that its sorting renders it above everything else.
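In C++, the screen-space part might look roughly like this (a sketch with placeholder names and distance; it does not by itself solve the sorting-above-UMG part):

```cpp
#include "Kismet/GameplayStatics.h"
#include "Particles/ParticleSystem.h"

// Sketch: deproject the touched screen position, spawn the emitter a short
// distance along that ray, and attach it to the camera so it stays glued to
// the view. Distance and class names are placeholders.
void AMyPlayerController::SpawnScreenSpaceEmitter(const FVector2D& ScreenPos,
                                                  UParticleSystem* Template)
{
    FVector WorldOrigin, WorldDirection;
    if (DeprojectScreenPositionToWorld(ScreenPos.X, ScreenPos.Y,
                                       WorldOrigin, WorldDirection))
    {
        const float Distance = 15.f;  // just past the near clip plane
        const FVector SpawnLocation = WorldOrigin + WorldDirection * Distance;

        UGameplayStatics::SpawnEmitterAttached(
            Template,
            PlayerCameraManager->GetTransformComponent(),
            NAME_None,
            SpawnLocation,
            FRotator::ZeroRotator,
            EAttachLocation::KeepWorldPosition);
    }
}
```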

If you don't manage, I'll check on my end, see what I can achieve, and let you know whether it's possible with an emitter.

Thank you for your reply. I understand how to make a “local space” particle system. However, I’m not sure how to make it “draw on clip plane” or “draw near”. I’ll look for that. Also, I’m not sure where I can set it to sort above all (even UMG) =)

To give a quick update: I built a new dynamic particle system in UMG that creates Images dynamically at a 2D location and updates them over time. Apparently, when you bind a material to a UMG Image, it doesn't like it if the material is set to the Modulate blend mode; it will work with the default blend mode, though. The last part I'm stuck on with UMG is determining the x/y location the player touched. Sure, there are “click” events for buttons, but you cannot get the absolute x/y position of a button since it can be slotted into many different types of slots =).

I also tried to bind to the Touched global event on the level blueprint, which is great… however, the event is NOT fired if you touch a UMG button =(
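One possible way to get a button's on-screen position regardless of what slot it sits in is to go through its cached Slate geometry, assuming the geometry helpers available in later engine versions (a sketch; names are placeholders):

```cpp
#include "Components/Button.h"
#include "Blueprint/SlateBlueprintLibrary.h"

// Sketch: convert a widget's cached Slate geometry to a viewport-space
// position, independent of what kind of panel slot it lives in.
// The geometry is only valid after the widget has been laid out at least once.
FVector2D GetButtonViewportPosition(UButton* Button)
{
    const FGeometry& Geometry = Button->GetCachedGeometry();

    FVector2D PixelPosition, ViewportPosition;
    USlateBlueprintLibrary::LocalToViewport(
        Button,                    // world context object
        Geometry,
        FVector2D::ZeroVector,     // top-left corner in the button's local space
        PixelPosition,
        ViewportPosition);

    return ViewportPosition;
}
```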

If you have a main CanvasPanel, bind a MouseMove event to it.

From the Geometry & Mouse Event you can retrieve the mouse position in UMG viewport space.
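In C++ that would be the mouse-move override on the widget, something like this (a sketch; LastMousePosition is just a placeholder member):

```cpp
// Sketch: override the widget's mouse-move handler; the geometry plus the
// pointer event give you the cursor position in the widget's local space.
// LastMousePosition is a placeholder member used later to place the FX.
FReply UMyMenuWidget::NativeOnMouseMove(const FGeometry& InGeometry,
                                        const FPointerEvent& InMouseEvent)
{
    const FVector2D ScreenPos = InMouseEvent.GetScreenSpacePosition();
    LastMousePosition = InGeometry.AbsoluteToLocal(ScreenPos);

    return Super::NativeOnMouseMove(InGeometry, InMouseEvent);
}
```

In Blueprint, the equivalent is the OnMouseMove override in the widget's graph.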

There is a Camera category of modules with a Camera Offset module; try using that.

I wish this were true. I can only see MouseEnter and MouseLeave off of a canvas panel…
I know how to get the mouse location off of the HUD, which is great, but the HUD x/y values don't represent a 1:1 match with the canvas panel size…
Any other ideas??

Let me know if you can get any of your particles to render over a UMG control. This doesn’t seem to help.

The way we did it was to render the UMG widget as a 3D component-based widget (new in 4.6) and spawn particles on top of that. That’s the only way I know of right now without getting into the source code.

If they do not render at all, then it means you cannot do it through PFX. But like Jared said, you should be able to spawn particles “over” UMG as emitters on your screen at the desired location, with a lot of tweaking.

Well, what I was talking about was doing it in 3D: the widget actually exists in world space near the camera, and you spawn a (scaled) particle system physically on the component. It doesn't require a lot of tweaking, but it is costly in draw calls.
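Roughly, in C++ terms (a sketch with placeholder names; MenuWidgetComponent would be the UWidgetComponent showing the menu):

```cpp
#include "Components/WidgetComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Particles/ParticleSystemComponent.h"

// Sketch: the menu lives on a UWidgetComponent in world space near the camera;
// a scaled-down particle system is attached to it at the button's offset so
// the particles physically sit "on" the UI. All names are placeholders.
void AMenuActor::SpawnButtonFX(UParticleSystem* Template,
                               const FVector& ButtonOffset)
{
    UParticleSystemComponent* PSC = UGameplayStatics::SpawnEmitterAttached(
        Template,
        MenuWidgetComponent,               // UWidgetComponent* showing the UMG menu
        NAME_None,
        ButtonOffset,                      // offset to the button, in component space
        FRotator::ZeroRotator,
        EAttachLocation::KeepRelativeOffset);

    if (PSC)
    {
        PSC->SetWorldScale3D(FVector(0.05f));  // scale down to match the widget
    }
}
```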

Your system sounds much more performant, but would require some real working knowledge, yes?

You can implement the mouse move in the graph view.

I believe so, yes, due to having to tweak and scale. I have never really tried, but it would have been easier to do as an FX artist.
You could ask or hire someone to make a particle system for you, but you will still need to spawn it and make sure it appears.

Hi, I'm probably way too late for this conversation, but recently I also wanted to render particles directly into my UI. Since I didn't find any good way of doing that, I decided to write my own plugin for Niagara that allows you to render Niagara sprite and ribbon CPU particles directly as a UMG widget. If anyone's interested, you can find it for free here along with some examples. I hope some of you will find it useful.