SMeshWidget - Hardware Instanced Slate Meshes Thread

I’m trying to work out how I can utilize the new 4.11 SMeshWidget to draw Static Meshes as part of my User Interface. All I actually want to draw at first is a very simple plane, but I need to transform the vertices such that the plane is flat with my radar widget, which currently looks like the screenshot below. All of the blips on the radar are currently UImage widgets, which are added to the Radar Widget dynamically during play.

[Screenshot: the current radar widget, with square UImage blips arranged around the player]

Now, if an object is far enough away that it goes off the radar's grid (which is a circular radius around the player), it gets clamped to the edge, at which point I want the square blip to turn into an arrow pointing to its location. The arrow also needs to be distorted so that it's displayed as if on the same plane as the radar grid itself (which sits at roughly a 30-40 degree incline). Since the radar also rotates with the player's rotation, the widgets need to move around the edge of the radar and keep pointing to their respective objects.
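To be clear about the behaviour I'm after, here's a rough sketch in code form (the struct and function names are just for illustration, not actual code from my project):

// Sketch of the intended blip behaviour. Given a contact's offset from the player
// (already rotated into radar space), clamp it to the circular grid and decide
// whether it should be a square blip or an edge arrow pointing at the contact.
struct FRadarBlip
{
	FVector2D Position;   // final position on the radar plane, in radar-space units
	bool bIsEdgeArrow;    // true when the contact sits outside the grid radius
	float AngleRadians;   // direction the edge arrow should point
};

FRadarBlip MakeBlip(const FVector2D& OffsetFromPlayer, const float RadarRadius)
{
	FRadarBlip Blip;
	const float Distance = OffsetFromPlayer.Size();

	Blip.bIsEdgeArrow = Distance > RadarRadius;
	Blip.Position = Blip.bIsEdgeArrow
		? OffsetFromPlayer * (RadarRadius / Distance)   // clamp to the edge of the grid
		: OffsetFromPlayer;
	Blip.AngleRadians = FMath::Atan2(OffsetFromPlayer.Y, OffsetFromPlayer.X); // keep pointing at the contact

	return Blip;
}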

Since there could be hundreds if not thousands of these blips on the radar at once, it seems like a perfect use case for the new SMeshWidget, which appears to be able to render lots of things like this very quickly. However, as powerful as this new feature is, I still haven't really figured out how to use it. For a quick test case, I wrapped SMeshWidget in a UWidget and placed it in my HUD with UMG. I then gave it a mesh to use and a very simple UI material, and this is the result:

[Screenshot: the SMeshWidget test case rendering in the HUD]

If anybody is interested, this is the relatively simple code that makes this possible:

Custom Mesh UWidget



#include "BZGame_BlipWidget.generated.h"

class USlateVectorArtData;
class SMeshWidget;

UCLASS()
class BZGAME_API UBZGame_BlipWidget : public UWidget
{
	GENERATED_BODY()

public:
	UBZGame_BlipWidget(const FObjectInitializer& ObjectInitializer);

	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Mesh")
	USlateVectorArtData* MeshData;

	virtual void SynchronizeProperties() override;
	virtual void ReleaseSlateResources(bool bReleaseChildren) override;

#if WITH_EDITOR
	// Begin UWidget Interface
	virtual const FSlateBrush* GetEditorIcon() override;
	virtual const FText GetPaletteCategory() override;
	virtual void OnCreationFromPalette() override;
	// End UWidget Interface
#endif
	
protected:
	// Native Slate Widget
	TSharedPtr<SMeshWidget> MyMeshWidget;

	// UWidget Interface
	virtual TSharedRef<SWidget> RebuildWidget() override;
};




// Fill out your copyright notice in the Description page of Project Settings.

#include "BZGame.h"
#include "Hud/Widgets/Game/BZGame_BlipWidget.h"
#include "Runtime/UMG/Public/Slate/SMeshWidget.h"
#include "Runtime/UMG/Public/UMGStyle.h"

#define LOCTEXT_NAMESPACE "UMG"

UBZGame_BlipWidget::UBZGame_BlipWidget(const FObjectInitializer& ObjectInitializer)
	: Super(ObjectInitializer)
{
	//SMeshWidget::FArguments SlateDefaults;
}


TSharedRef<SWidget> UBZGame_BlipWidget::RebuildWidget()
{
	MyMeshWidget = SNew(SMeshWidget);
	return MyMeshWidget.ToSharedRef();
}

void UBZGame_BlipWidget::SynchronizeProperties()
{
	Super::SynchronizeProperties();

	if (MyMeshWidget.IsValid() && MeshData != nullptr)
	{
		// AddMesh returns an index into the widget's mesh list; store it if you
		// later need to update per-instance data for this mesh.
		const uint32 NewMeshIndex = MyMeshWidget->AddMesh(*MeshData);
	}
}

void UBZGame_BlipWidget::ReleaseSlateResources(bool bReleaseChildren)
{
	Super::ReleaseSlateResources(bReleaseChildren);
}

#if WITH_EDITOR
const FSlateBrush* UBZGame_BlipWidget::GetEditorIcon()
{
	return FUMGStyle::Get().GetBrush("Widget.ProgressBar");
}

const FText UBZGame_BlipWidget::GetPaletteCategory()
{
	return LOCTEXT("Common", "Common");
}

void UBZGame_BlipWidget::OnCreationFromPalette()
{

}
#endif

#undef LOCTEXT_NAMESPACE


And here it is placed in UMG. Notice that despite the widget's position at the center of the screen, the mesh still draws at the top-left (you can see the edge of it poking out).

[Screenshot: the mesh widget placed centrally in UMG, with the mesh still drawing at the top-left]

Now, even though my custom mesh widget is placed directly central in my custom UMG widget, the mesh always draws at the top-left of the screen, and in fact it seems to be based not on the viewport size but on the size of the entire monitor. According to Nick on Slack, I need to pass information about the mesh position in via its UVs, and I suspect that this is what the new 'Screen Position' input on the material is for.

However, nothing I try seems to work. I've used a bunch of default nodes that I think might be the right ones to mirror what I'm already seeing (such as View Size, Screen Position, a Vector 2D, etc.), but each one makes it disappear, so I'm obviously doing something wrong. I know these are being used in Paragon for the widgets above the minion heads, so I'd really like to see how that was done. I've been trying to figure this out for a few days but it's going way over my head.

Additionally, if I still can’t use this to transform my widgets as if they are in 3D space - I’m open to other suggestions :wink: Eventually, I’d like to draw a static mesh ‘disc’ underneath the existing wire-frame if possible.

Awesome!
That’s exactly what we need for our inventory system.

DennG

Just to quote Nick on this, bear in mind that this is NOT a way to draw 3D meshes in the viewport.

Hey TheJamsh,

I just noticed you are also trying to use the new widget.

I will be giving this a go also, thanks for the info thus far.


Will re-post this here from the other thread, since this is where I want people to put what they figure out :smiley:

I put together the simplest possible example of something practical you could do with it: https://drive.google.com/open?id=0B3jI1Q523JpSYktpMVlvUFVmTWs

There's a lot to sift through to really understand everything it's doing. In particular, it's worth looking at how the Slate pixel shader sends the data to the material, so you understand which UV channels will contain what data.

Again: not easy to understand, hard to use, but you can do cool stuff with it; maybe if others take a look they can help explain everything in this thread :slight_smile:

The gist is that the SMeshWidget takes some 2D mesh data and uses instanced mesh rendering to render all of something in a single draw call. You pack a small amount of data into the instanced mesh buffer, so each instance gets one FVector4 worth of data for each draw on the GPU. You then pack whatever you need in there; anything that doesn't fit, you pass down as an index and extract the extra information from textures you've packed the data into.
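In code terms, the per-frame flow ends up looking roughly like this (a sketch only - 'Blips', 'FBlipInfo' and 'BlipMeshId' are placeholder names, but the Slate calls are the ones the example project uses, typically from an SMeshWidget subclass's OnPaint):

// Sketch: refill the per-instance buffer each paint. One FVector4 per instance,
// and every instance of this mesh is rendered in a single draw call.
TSharedPtr<FSlateInstanceBufferUpdate> Update = BeginPerInstanceBufferUpdateConst(BlipMeshId);
Update->GetData().Empty();

for (const FBlipInfo& Blip : Blips) // hypothetical array describing each blip
{
	FSlateVectorArtInstanceData Instance;
	Instance.SetPosition(Blip.ScreenPosition); // where this instance draws, in window space
	Instance.SetScale(Blip.Scale);             // per-instance uniform scale
	Instance.SetBaseAddress(Blip.PackedData);  // anything else, packed into one float
	Update->GetData().Add(Instance.GetData()); // appends one FVector4
}

FSlateInstanceBufferUpdate::CommitUpdate(Update);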



Okay so here’s an update (finally).

Got back to working on my radar stuff, but decided to work in the example project Nick gave first. I finally have this circular mesh drawing as Slate geometry!

Source code is below (though it's very simple); you can see the shader setup and the mesh in the video. What I want to do next is get better control over the position of the mesh in UMG, ideally using the bounding rectangle to scale and place it.



// Copyright 1998-2016 Epic Games, Inc. All Rights Reserved.

#include "MeshWidgetExample.h"
#include "Slate/SMeshWidget.h"
#include "Slate/SlateVectorArtInstanceData.h"

#include "ParticleWidget.h"

#include "MeshWidgetExampleCharacter.h"

DECLARE_STATS_GROUP(TEXT("MeshWidget"), STATGROUP_MeshWidget, STATCAT_Advanced);
DECLARE_CYCLE_STAT(TEXT("Particle Update"), STAT_ParticleUpdate, STATGROUP_MeshWidget);

class SParticleMeshWidget : public SMeshWidget
{
public:
	SLATE_BEGIN_ARGS(SParticleMeshWidget) { }
	SLATE_END_ARGS()

public:
	void Construct(const FArguments& Args, UParticleWidget& InThis)
	{
		This = &InThis;
	}

	virtual int32 OnPaint(const FPaintArgs& Args, const FGeometry& AllottedGeometry, const FSlateRect& MyClippingRect, FSlateWindowElementList& OutDrawElements, int32 LayerId, const FWidgetStyle& InWidgetStyle, bool bParentEnabled) const override
	{
		SCOPE_CYCLE_COUNTER(STAT_ParticleUpdate);

		const float Scale = AllottedGeometry.Scale;

		// Trail
		if ( This->TrailMeshId != -1 )
		{
			FVector2D TrailOriginWindowSpace = AllottedGeometry.LocalToAbsolute(AllottedGeometry.GetLocalSize() * 0.5f);

			TSharedPtr<FSlateInstanceBufferUpdate> PerInstanceUpdate = BeginPerInstanceBufferUpdateConst(This->TrailMeshId);
			PerInstanceUpdate->GetData().Empty();

			// Draw Radar Mesh
			FSlateVectorArtInstanceData RadarData;
			RadarData.SetPosition(TrailOriginWindowSpace * Scale);
			RadarData.SetScale(Scale);
			RadarData.SetBaseAddress(AMeshWidgetExampleCharacter::RotationAngle); // Set to Pawn Rotation

			// Add Radar to Data
			PerInstanceUpdate->GetData().Add(RadarData.GetData());
			FSlateInstanceBufferUpdate::CommitUpdate(PerInstanceUpdate);
		}

		return SMeshWidget::OnPaint(Args, AllottedGeometry, MyClippingRect, OutDrawElements, LayerId, InWidgetStyle, bParentEnabled);
	}

public:
	UParticleWidget* This;
};

UParticleWidget::UParticleWidget()
	: TrailMeshId(-1)
{
}

void UParticleWidget::SynchronizeProperties()
{
	Super::SynchronizeProperties();

	if ( TrailMeshAsset )
	{
		TrailMeshId = MyMesh->AddMesh(*TrailMeshAsset);
		MyMesh->EnableInstancing(TrailMeshId, 1);
	}
}

void UParticleWidget::ReleaseSlateResources(bool bReleaseChildren)
{
	Super::ReleaseSlateResources(bReleaseChildren);

	MyMesh.Reset();
}

TSharedRef<SWidget> UParticleWidget::RebuildWidget()
{
	MyMesh = SNew(SParticleMeshWidget, *this);
	return MyMesh.ToSharedRef();
}




Alright, so I'm almost done with this now. I'm really happy with how it's turned out; all I want to do at this stage is change the draw order of the items so that the blips render on top of the grid lines. I'm not sure that's going to be possible without splitting the widget up, though, which I'd really rather avoid. Anyway, here's the video - I recommend full-screen 1080p to see it working.

So I'm doing this with only three instanced meshes inside one SMeshWidget, though to be honest I could probably have squeezed everything into two. The base compass is a mesh, drawn in a similar way to how I've done it above, but I've now got it to use the widget's rectangle as the draw location, which prevents any strange issues with scaling or position on screen. The easiest way to do that is to use the clipping rectangle passed into OnPaint.
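Roughly, this is all I mean by that (a sketch - MakeCompassInstance and MeshNativeSize are just my names here, MeshNativeSize being the size the vector art was authored at, not an engine property):

// Build the compass instance from the clipping rect passed into OnPaint, so the
// mesh follows wherever UMG places and sizes the widget.
FSlateVectorArtInstanceData MakeCompassInstance(const FSlateRect& MyClippingRect, const float MeshNativeSize)
{
	FSlateVectorArtInstanceData Data;
	Data.SetPosition(MyClippingRect.GetCenter());                      // centre of the widget in window space
	Data.SetScale(MyClippingRect.GetSize().GetMin() / MeshNativeSize); // fit the art to the smaller rect dimension
	return Data;
}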

The trickiest part was the blips. I already have a pretty robust spatial hash system and an object manager for getting the list of objects, so that part was already there. Every game object with a scanner tells the object manager to 'ping' objects in its range on a timer, and the object manager determines which ones are visible or in range / on the correct team for that object.

The coolest part (IMO) is that I've only had to split the draws into edge arrows and regular blips (i.e. in-range items). Three floats are all I need to determine colour, opacity, and either rotation or 'blip type' (square or circle) - but these all have to be packed into the BaseAddress, so you only get a certain amount of precision (more than enough for me). The material then unpacks this data and uses it to drive the parameters.

Here’s the way the packing is done:



	// Packs three normalized floats (0-1) into a single float, one byte each,
	// so they can be passed to the material via the instance buffer's BaseAddress.
	FORCEINLINE float PackFloats_3(const float X, const float Y, const float Z) const
	{
		const uint8 XInt = (uint8)(X * 255.0f);
		const uint8 YInt = (uint8)(Y * 255.0f);
		const uint8 ZInt = (uint8)(Z * 255.0f);

		// Combine the three bytes into a 24-bit integer, then normalize back to 0-1.
		const uint32 PackedInt = (XInt << 16) | (YInt << 8) | ZInt;
		return (float)(((double)PackedInt) / ((double)(1 << 24)));
	}


And then how it’s unpacked in the material:

Blip and arrow materials: Byte 1 drives a colour LUT to determine the colour, Byte 2 determines opacity, and Byte 3 is the 'data' byte.
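For reference, the unpack is just the inverse of PackFloats_3. In the material it's a handful of Multiply / Floor / Frac nodes, but expressed in C++ it would look something like this (a mirror of the math rather than the actual node graph):

	// Inverse of PackFloats_3: recover three 0-1 values from the packed float.
	FORCEINLINE void UnpackFloats_3(const float Packed, float& OutX, float& OutY, float& OutZ)
	{
		const uint32 PackedInt = (uint32)((double)Packed * (double)(1 << 24)); // undo the divide by 2^24
		OutX = ((PackedInt >> 16) & 0xFF) / 255.0f; // Byte 1 - colour LUT coordinate
		OutY = ((PackedInt >> 8) & 0xFF) / 255.0f;  // Byte 2 - opacity
		OutZ = (PackedInt & 0xFF) / 255.0f;         // Byte 3 - rotation or blip type
	}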


Phew… yeah so that’s what I did this weekend!


Dear @TheJamsh,

This is incredible! Thank you so much for sharing your research!

I love your 3D mesh rendering of the landscape in your video!

Thanks also for taking the time to put up all those pictures!

Great work!

Rama

No probs @Rama :slight_smile: Hope it’s of use to people!

Just a question on the stats segment of your UI: how are you achieving the snipped-off angled corner of the stats boxes? It looks really good with the gradient opacity.

Thanks! They're done with a relatively simple material using a mask texture I created in Photoshop. I control the colour / size through code :slight_smile:


Oh wow dude, that's awesome, thanks for sharing the setups, much appreciated! :slight_smile:

Just one quick question: how would I get “AMeshWidgetExampleCharacter::RotationAngle”? It isn't in the provided source.

This stuff looks very promising.

In that example you posted, there are 5 ParticleWidgets, which generate a bunch of small rotating squares, as seen in that screenshot.

How many drawcalls are generated in this example?

Is it 1 draw call per ParticleWidget? i.e. 5 draw calls in this example.

Or 1 draw call per unique mesh? i.e. 1 draw call in this example, since all 5 ParticleWidgets use the same mesh.

1 Draw Call per ParticleWidget.

Thanks Nick, that makes sense. I was asking since I couldn’t see the number of drawcalls changing when adding/removing ParticleWidgets. Even when removing all particle widgets, the number of drawcalls stayed the same. I used the stat scenerendering command. Is there any other command I can use to find out the drawcalls generated by the ParticleWidget?

stat slate.

Oh I didn’t know about that command. It’s actually quite useful.

Thanks Nick!

I'll try to sum up what I think I understand about SMeshWidget so far (my guesses):

  1. SMeshWidget consists of 2D meshes with materials and parameters, used for creating vector-art images with texturing and transforms?
  2. Each mesh may have one UI material?
  3. SMeshWidget can also transfer custom “user parameters” to the UI material via one InstanceBuffer per mesh, which is basically an array of FVector4 values?
  4. If we want to use “user parameters” we have to extract them from the InstanceBuffer inside the UI material, using its functionality (material functions)?
  5. “User parameters” may be updated and re-packed (overwritten) inside the InstanceBuffer. For instance, we can change position and rotation on every OnPaint and write them into the InstanceBuffer?
  6. We can also pack “user parameters” into textures used by the UI material, if the InstanceBuffer is not enough?
  7. Each SMeshWidget may have multiple meshes (as layers?).
  8. Each mesh may be instanced, which means it can have many copies with different “user parameters” (for example colour, position, rotation, scale…)?

I've only been looking at MeshWidgetExample for an hour, so I'm not entirely sure I've understood it. I'll try to figure out more later. It would be nice if someone could correct (or confirm) my guesses.

I've finally tackled this issue while trying to make my enemy overhead health bars. It turns out the issue @TheJamsh had regarding the position of the widget caused me a great deal of confusion. Slate seems to apply some weird voodoo magic to properly scale and position its widgets, and getting the **** thing above the heads of my characters was an exercise in frustration.

The issue ended up being in the code from @Nick Darnell, as he's getting the mid-point of the widget and then moving things around in his example. I noticed that I could get a correct screen projection when using UMG and canvas panel slots, so I looked at that code. It turns out that it positions elements from the top-left corner. So my final positioning code ended up looking like this (I won't paste the entire widget, as the code is pretty similar to what's been posted here already):


FVector2D TopLeft = AllottedGeometry.LocalToAbsolute(FVector2D::ZeroVector);


...


HPData.SetPosition(TopLeft + (Scale * HPBar.Position));

The HPBar.Position is calculated like so:


//#TODO: Need a way to pass in a reference point (like a socket above the head) and use that instead of the actor location
FVector CurrentLocation = ReferencedActor->GetActorLocation() + FVector(0.f, 0.f, ParentWidget->VerticalHealthbarOffset);

UWidgetLayoutLibrary::ProjectWorldLocationToWidgetPosition(ParentWidget->GetOwningPlayer(), CurrentLocation, Position);

The rest is pretty much identical to what's been posted already. But I suppose I should mention the thing I had to do myself to get it working. I wanted a “depletion lerp” on my health bars, i.e. you hit the enemy, their health drops to the new amount, but the health that was there turns red and slowly depletes down to the current value. Thus, aside from the Scale and Position, I needed to pass in two more values, which I had to pack into one float. This turned out to be pretty simple, and I didn't have to use any byte operations: I simply used the whole part of the float for one value and the fraction for the other (as both are normalized).


//I update the depleted percentage every frame as I need to animate it going down to the current percentage.
float DepletedPercentage = FMath::Lerp(OldPercentage, NewPercentage, Alpha);
PercentagePacked = FMath::TruncToFloat(DepletedPercentage * 1000.f) + NewPercentage; //DepletedPercentage is normalized too, so I multiply it by 1000 to give it some resolution (As I'll only be using that truncated part in the shader).

Lastly, in my shader I unpack those two values again; the UnpackHealth material function just does the reverse of the packing above.
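In C++ terms it boils down to something like this (just a sketch of the shape of it - the material itself uses Floor / Frac / Subtract / Clamp nodes, and this assumes a bar at exactly zero health is never drawn):

// Unpack the health value: fraction = current health, whole part / 1000 = depleted health.
void UnpackHealth(const float Packed, float& OutCurrent, float& OutDepleted)
{
	const float Whole = FMath::FloorToFloat(Packed);
	const float Frac = Packed - Whole;

	OutCurrent = Frac;
	OutDepleted = Whole / 1000.0f;

	// Full-health case: at exactly 1.0 there is no fraction and the whole part is
	// 1 too large, so subtract it back off and clamp.
	if (Frac <= KINDA_SMALL_NUMBER)
	{
		OutCurrent = 1.0f;
		OutDepleted = FMath::Clamp((Whole - 1.0f) / 1000.0f, 0.0f, 1.0f);
	}
}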

The subtraction and clamp at the bottom are there to handle the case when the health is full, i.e. 1.0. In that case there won't be a fraction, and the whole part will be 1 larger than it should be.

Hope this helps someone.

Edit:

Here’s how it ended up looking. Sorry for the poor FPS.
