Training Stream - Extending the Editor - Jan. 20th, 2015

You can, but it’s not an ideal interface right now… Unfortunately the texture editor uses the basic menu extension system and doesn’t pass you the texture being edited, so you’d have to either apply your action to all textures currently being edited or add a dropdown list of textures to your button (check out FAssetEditorManager).

To register your extension, it’d look something like:


		struct Local
		{
			static void AddToolbarCommands(FToolBarBuilder& ToolbarBuilder)
			{
				ToolbarBuilder.AddToolBarButton(FUIAction(/* bind your FExecuteAction delegate here */));
			}
		};

		TSharedRef<FExtender> ToolbarExtender(new FExtender());
		ToolbarExtender->AddToolBarExtension(
			"TextureMipAndExposure",
			EExtensionHook::After,
			CommandList.ToSharedRef(),
			FToolBarExtensionDelegate::CreateStatic(&Local::AddToolbarCommands));
		TextureEditorModule.GetToolBarExtensibilityManager()->AddExtender(ToolbarExtender);

Cheers,

If your type is reflected as a UCLASS() or a USTRUCT() then you should be able to expose it to Blueprints just by using Blueprintable / BlueprintType tags. If you have some external system handle or similar (e.g., a smart pointer to a third party class you can’t make reflected) then you should probably wrap it in an opaque handle struct that can be passed around to Blueprints but only actually operated on in C++ code.
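
For the opaque-handle approach, here’s a minimal sketch (FThirdPartyDeviceHandle, FThirdPartyDevice, and the library function are made-up names for illustration):

	// Blueprints can hold and pass this struct around, but can't see inside it.
	USTRUCT(BlueprintType)
	struct FThirdPartyDeviceHandle
	{
		GENERATED_USTRUCT_BODY()

		// Not a UPROPERTY, so reflection/Blueprints never touch it; only C++ does.
		TSharedPtr<class FThirdPartyDevice> NativeDevice;
	};

	UCLASS()
	class UThirdPartyDeviceLibrary : public UBlueprintFunctionLibrary
	{
		GENERATED_BODY()

	public:
		// Blueprints call this; the real work happens in C++ on the wrapped pointer.
		UFUNCTION(BlueprintCallable, Category = "ThirdParty")
		static bool IsDeviceValid(const FThirdPartyDeviceHandle& Handle);
	};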

In general, you can mix reflected and non-reflected (plain C++) classes freely in your game code, but anything that needs to be accessible to Blueprints, network replication, etc… should be in a reflected class.

Cheers,

As mentioned on the stream, those eyeballs only toggle the visibility flag; they’re not designed to fully enable/disable the actor. This aspect of the Scene Outliner isn’t extensible, but you could probably stick the actors you want to disable into a ‘Disabled’ sublevel and unload that level before hitting Play.

Cheers,

Adding a plugin involves a bit of boilerplate but nothing too tricky. I just received a pull request that adds a New Plugin wizard, so if it checks out OK, that should be a new feature coming in 4.8 :slight_smile:

At a high level, a plugin has the following structure:


Plugins\PluginName\PluginName.uplugin (json plugin manifest/descriptor)
Plugins\PluginName\Module1\Module1.Build.cs
Plugins\PluginName\Module1\Private\Module1PrivatePCH.h
Plugins\PluginName\Module1\Private\Module1Module.cpp
Plugins\PluginName\Module1\Public\Module1Module.h (or IModule1Module.h, or absent entirely if the module exposes no public interface)

You can have more than one module (they’re listed, including loading phase and type of module, in the .uplugin), and you can have as many source files in the module as you want, but you’ll at least have the PCH header and the module CPP.
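
For reference, a minimal .uplugin descriptor looks something like this (the exact fields vary a bit by engine version, so check an existing engine plugin; the values here are just illustrative):

	{
		"FileVersion": 3,
		"FriendlyName": "PluginName",
		"Version": 1,
		"VersionName": "1.0",
		"Description": "Example plugin",
		"Category": "Examples",
		"EnabledByDefault": true,
		"Modules": [
			{
				"Name": "Module1",
				"Type": "Developer",
				"LoadingPhase": "Default"
			}
		]
	}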

RE: Activating a plugin. Plugins specify whether they’re activated by default or not; most of the time you’ll say yes. This can be overridden in a .uproject file in either direction, and for ones that aren’t enabled by default you can also enable them via the Plugin browser in the editor.
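
The .uproject override is just a Plugins section, e.g. (sketch):

	"Plugins": [
		{
			"Name": "PluginName",
			"Enabled": true
		}
	]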

Cheers,

[QUOTE=Jared;208537]
1. Any chance you can show us how to edit geometry through code? A simple tute on how to move a single vertex?
[/QUOTE]

I don’t think there’s a high-level API wrapper for it, but I’d start by stepping into FMeshUtilities::BuildStaticMesh, or by manipulating the source model and calling UStaticMesh::Build (IIRC this is where Simplygon hooks in). If you’re doing 100% procedural stuff, check out UCustomMeshComponent instead.
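
For the procedural route, a rough sketch with UCustomMeshComponent (assuming the CustomMeshComponent module is in your dependencies, and that this runs inside an AActor):

	// Build a single triangle and hand it to a UCustomMeshComponent
	TArray<FCustomMeshTriangle> Triangles;
	FCustomMeshTriangle Tri;
	Tri.Vertex0 = FVector(0.0f, 0.0f, 0.0f);
	Tri.Vertex1 = FVector(100.0f, 0.0f, 0.0f);
	Tri.Vertex2 = FVector(0.0f, 100.0f, 0.0f);
	Triangles.Add(Tri);

	UCustomMeshComponent* CustomMesh = NewObject<UCustomMeshComponent>(this);
	CustomMesh->SetCustomMeshTriangles(Triangles);
	CustomMesh->RegisterComponent();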

Sure, just make the C++ class visible in your Public or Classes folder, and then include it in your project. You will need to depend on the plugin in your game Build.cs as well (this might happen automatically for Game plugins, I don’t remember for sure).
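
The Build.cs side is just another entry in your dependency list, e.g. (the module name here is hypothetical):

	// In YourGame.Build.cs
	PrivateDependencyModuleNames.Add("MyPluginModule");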

Cheers,

You can get the set of selected assets by doing:


	TArray<FAssetData> Objects;

	FContentBrowserModule& ContentBrowserModule = FModuleManager::Get().LoadModuleChecked<FContentBrowserModule>(TEXT("ContentBrowser"));
	ContentBrowserModule.Get().GetSelectedAssets(Objects);

Note: FAssetData works even for unloaded assets; to convert one to a UObject (which might involve a blocking load), call FAssetData::GetAsset().
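
For example (sketch), to resolve the selection to objects:

	for (const FAssetData& AssetData : Objects)
	{
		// May trigger a blocking load if the asset isn't in memory yet
		UObject* Asset = AssetData.GetAsset();
		if (Asset != nullptr)
		{
			// ... do something with the asset ...
		}
	}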

How you display a window depends on where you want to trigger your command from. If it’s something you only want to expose on a few kinds of assets, I’d recommend extending the context menu for the asset (check out how Paper2D does this for texture assets in Engine\Plugins\2D\Paper2D\Source\Paper2DEditor\Private\ContentBrowserExtensions\ContentBrowserExtensions.cpp). From there you can pop up a details panel on a custom object, or make a stand-alone window and populate it with arbitrary UI. In general, I’d recommend sticking with details panels / property editors wherever possible, as it ensures a consistent look and feel and saves you from dealing with searching/filtering, scrolling boilerplate, etc…
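
The registration half of the Paper2D approach looks roughly like this (sketch; OnExtendContentBrowserAssetSelectionMenu is a made-up static function that returns a TSharedRef<FExtender> built from the selected assets):

	// Typically done in your module's StartupModule
	FContentBrowserModule& ContentBrowserModule = FModuleManager::LoadModuleChecked<FContentBrowserModule>("ContentBrowser");
	TArray<FContentBrowserMenuExtender_SelectedAssets>& CBMenuExtenderDelegates = ContentBrowserModule.GetAllAssetViewContextMenuExtenders();
	CBMenuExtenderDelegates.Add(FContentBrowserMenuExtender_SelectedAssets::CreateStatic(&OnExtendContentBrowserAssetSelectionMenu));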

Cheers,

What you’re interested in is more a runtime thing (which can be used in Editor UI too if desired). Check out SRichTextBlock, which lets you define different styles and alternate between them in a text string. There are a bunch of examples in STestSuite.cpp:




	static FText GetAroundTheWorldIn80Days_Rainbow()
	{
		return FText::FromString(
			TEXT("<Rainbow.Text.Red>\"</><Rainbow.Text.Orange>I</> <Rainbow.Text.Yellow>know</> <Rainbow.Text.Green>it;</> <Rainbow.Text.Blue>I</> <Rainbow.Text.Red>don't</> <Rainbow.Text.Orange>blame</> <Rainbow.Text.Yellow>you.</>  <Rainbow.Text.Green>We</> <Rainbow.Text.Blue>start</> <Rainbow.Text.Red>for</> <Rainbow.Text.Orange>Dover</> <Rainbow.Text.Yellow>and</> <Rainbow.Text.Green>Calais</> <Rainbow.Text.Blue>in</> <Rainbow.Text.Red>ten</> <Rainbow.Text.Orange>minutes.</>\"")
			TEXT("\n\n")
			TEXT("<Rainbow.Text.Yellow>A</> <Rainbow.Text.Green>puzzled</> <Rainbow.Text.Blue>grin</> <Rainbow.Text.Red>overspread</> <Rainbow.Text.Orange>Passepartout's</> <Rainbow.Text.Yellow>round</> <Rainbow.Text.Green>face;</> <Rainbow.Text.Blue>clearly</> <Rainbow.Text.Red>he</> <Rainbow.Text.Orange>had</> <Rainbow.Text.Yellow>not</> <Rainbow.Text.Green>comprehended</> <Rainbow.Text.Blue>his</> <Rainbow.Text.Red>master.</>")
			TEXT("\n\n")

...

SNew( SRichTextBlock )
.Text( RichTextHelper::GetAroundTheWorldIn80Days_Rainbow() )
.TextStyle( FTestStyle::Get(), "Rainbow.Text" )
.DecoratorStyleSet( &FTestStyle::Get() )
.WrapTextAt( 600 )


Cheers,

The video and sample project will be uploaded later, but for now here are some of the top take-homes:

  • When in doubt, use the Widget Reflector! This tool will lead you around almost any existing editor UI and give you abundant examples to learn from (about the only thing it doesn’t work well on is multibox controls, since those are all built systematically; however, searching for the text on a widget or tooltip is usually enough to get you close).
  • If you’re wondering how to interact with a system, check the FooModule.h file. There will frequently be a condensed public interface there, such as all the useful stuff in FPropertyEditorModule.
  • The Paper2D editor module is another good place to look for examples, as it is a major subsystem that was built out as a plugin, forcing it to use only the public engine/editor interfaces for extension.
  • Think about undo when you add your feature. It’s far easier to add and test Undo right when you implement your feature rather than 3 months later, when you’ve just lost some work because you couldn’t Undo :slight_smile: It’s pretty easy: just add an FScopedTransaction around your manipulation work and call Modify() on any objects you are going to modify before you modify them (there’s a small sketch after this list).
  • If an interface feels convoluted or hard to work with, please let us know! We want to make it easy and fun to extend the editor, so knowing bits that are hard to work with is very valuable (or even better, submit a pull request to improve it if you can).
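
On that undo point, a minimal sketch (MyActor and NewLocation are placeholders):

	// "Move Things" is the label that shows up in the undo history
	const FScopedTransaction Transaction(NSLOCTEXT("MyEditor", "MoveThings", "Move Things"));

	MyActor->Modify();                        // snapshot into the transaction buffer first
	MyActor->SetActorLocation(NewLocation);   // then make the actual change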

Cheers,

Honestly, extending the editor is complicated and will put a lot of people off.

Most people coming to a new UI expect to be able to add a toolbar button with something like:

GetToolbar(name)->AddButton(where, icon, callback);

and probably the same for a menu item, without needing to know the internal framework in depth.

Awesome! Really great stream btw, I finally got around to watching. Thanks for your time.

Hope I’m not too late to the party. I’ve got a question regarding input values in the editor. I’ve got a plugin I’ve been working on that’s nearly complete which adds support for all the basic integral types within Blueprints. Everything functions perfectly except for specifying default values for the added types, either in the property details window or in “make literal of” nodes, because the editor only handles the default types. I’ve got a workaround at the moment where I make a string literal and parse it to the desired type in custom conversion nodes. How would I go about accepting input for these types properly? Would I need to make a custom details window that features a text box that validates input suited to my plugin’s needs? Searching through the source, I noticed the IDetailCustomization interface but wasn’t sure how to use it.

Also, the editor doesn’t automatically drop a conversion node when connecting to inputs of other types, so I have to insert the custom conversion nodes manually. Is there a way I can get the editor to know which of my custom conversion nodes to drop in automatically? I also noticed the usage of EdGraphSchemas within the source and have been wondering if this is what I’m looking for to fix this issue.

Any advice is greatly appreciated.

I’m actually kind of amazed you got this to work without modifying anything in the engine. There’s no extensibility for scalar types in the BP compiler (e.g., stuff like FKismetCompilerUtilities::CreatePropertyOnScope has a hard-coded tree of mappings from FEdGraphPinType to UProperty, ditto for stuff like EmitTermExpr in the VM backend). How are you handling that?

You can make new entry-method widgets on nodes by creating SGraphPin subclasses and registering them (FEdGraphUtilities::RegisterVisualPinFactory), but you won’t be able to emit literal values for new types to the bytecode without changing the VM in Core (and the VM backend for the compiler); it just doesn’t have opcodes for things like 64-bit literals. For editing them in defaults, I don’t actually know the answer there. It would need to be handled in the property editor module, and I don’t know about exposing new UProperty types there.
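
The pin-factory half looks roughly like this (sketch; SMyInt64GraphPin and the pin category name are placeholders for whatever your plugin uses):

	class FMyPinFactory : public FGraphPanelPinFactory
	{
		virtual TSharedPtr<SGraphPin> CreatePin(UEdGraphPin* Pin) const override
		{
			// Only take over pins of your custom category; return null to fall through
			if (Pin->PinType.PinCategory == TEXT("int64"))
			{
				return SNew(SMyInt64GraphPin, Pin);
			}
			return nullptr;
		}
	};

	// During module startup:
	FEdGraphUtilities::RegisterVisualPinFactory(MakeShareable(new FMyPinFactory()));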

The casting issue is similar, there is a specific list of casts that are suitable for a given source->destination type mapping declared in the schema (UEdGraphSchema_K2::SearchForAutocastFunction). There isn’t a way right now to add new ones, though you could add a registration system to the BlueprintGraph module, call it from that function, and submit that as a PR.

Cheers,

I’m not ready to give away the “recipe to my secret sauce” publicly just yet :stuck_out_tongue: but I could discuss it a bit in private if you’re curious. I just want to have the plugin a little more polished before I release it. I’m very excited to get this out there for people, though. I’ve pretty much put my personal project on hold for the moment to make this a priority because I think it will be a great tool for the community. Currently, all the types are fully usable, fully reflected, and support replication. There are some other goodies in there too, such as exposing some hashing functionality since unsigned ints are usable within it, and I even added some double-precision utilities to complement doubles (basic though, since portability is important and a lot of hardware doesn’t support accelerated double-precision very well anyway). I actually had a thread up giving some details on the progress, but there didn’t seem to be much interest so I pulled it, lol. Figured I’ll just wait till it’s ready to ship and be able to showcase the full extent of what the plugin can do.

Thank you for the suggestions, I’ll see what I can do with it. Definitely gives me some areas to poke around at and see what can be done.

Drop me a pm if you wanna discuss it though, I keep an eye on the forums a fair bit :slight_smile:

Any chance we can get the example project you used?

Any update on the package yet? I’d like to use and write about Extending the Editor for my graduation project at school. :slight_smile:

I spent a few hours digging through the engine code and pausing the video at key frames and finally got something working. I already have a game plugin for doing some custom stuff, so I will make reference to that in a few places. Here’s a high-level overview of what I had to do:

  • Create a new TCommands class
  • Add a TSharedPtr<FUICommandList> member to my plugin module
  • Write a bunch of boilerplate code to set up the delegate so the button calls the function
  • Hook up the extender to add the button
  • And of course, the function that I wanted to call
  • Also, had to add dependencies to the build script for my plugin

Here are some details… I am copying from Visual Studio, so let me know if I made any mistakes:

–Create TCommands class–


class FNCommands : public TCommands<FNCommands>
{
public:
	FNCommands()
		: TCommands<FNCommands>(
			TEXT("FNCommands"),
			NSLOCTEXT("Context","FNCommands", "FN Commands"),
			NAME_None,
			FEditorStyle::GetStyleSetName()
			)
	{
	}

	// TCommands<> interface
	virtual void RegisterCommands() override
	{
		UI_COMMAND(TestCommand, "Test Command", "Test Command", EUserInterfaceActionType::Button, FInputGesture());
	}
	// End of TCommands<> interface

public:
	TSharedPtr<FUICommandInfo> TestCommand;
};

–Add a FUICommandList property to the game plugin–


class FNPlugin : public IModuleInterface
{
	// other stuff
public:
	void TestFunction();

private:
	TSharedPtr<FUICommandList> CommandList;

};

–Boilerplatecode added in StartupModule to hook up the extender–


void FNPlugin::StartupModule()
{
	FNCommands::Register();

	CommandList = MakeShareable(new FUICommandList);
	CommandList->MapAction(
		FNCommands::Get().TestCommand,
		FExecuteAction::CreateRaw(this, &FNPlugin::TestFunction));

	struct Local
	{
		static void AddToolbarCommands(FToolBarBuilder& ToolbarBuilder)
		{
			ToolbarBuilder.AddToolBarButton(FNCommands::Get().TestCommand);
		}
	};

	FLevelEditorModule& LevelEditorModule = FModuleManager::LoadModuleChecked<FLevelEditorModule>("LevelEditor");
	TSharedRef<FExtender> ToolbarExtender(new FExtender());
	ToolbarExtender->AddToolBarExtension(
		"Game",
		EExtensionHook::After,
		CommandList,
		FToolBarExtensionDelegate::CreateStatic(&Local::AddToolbarCommands));
	LevelEditorModule.GetToolBarExtensibilityManager()->AddExtender(ToolbarExtender);
}


–The function you want to call–


void FNPlugin::TestFunction()
{
	UE_LOG( LogTemp, Log, TEXT("Button pressed") );
}

–Add dependencies–


			PublicDependencyModuleNames.AddRange(
				new string[]
				{
					"Core",
					"CoreUObject",
					"Slate",
					"UnrealEd",
					"EditorStyle",
					// ... add other public dependencies that you statically link with  ...
				}
				);


Anyone know if they released the sample project for this yet?

Not yet. There was an issue with releasing a C++ project via the Learn tab that was waiting on the new launcher release to be resolved, I’ll ask around and see where it’s at now.

Cheers,

Hey. I doubt you remember me, but I was working over in QA with Justin for a while. I talked to you a couple of times about Tappy Chicken on the Galaxy S5. Anyway, I’m building an editor for Unreal, and I have the basic parts done, but I want to have a 3D view and a 2D view. 2D would be for looking at a map, and 3D for looking at a globe. I have the 2D mode working (I basically used the sprite layout), but what would be the best way to build the 3D view? I could go the route the BP editor used and make Apps. I also tried to just implement a 3D view from the static mesh editor, but ended up with the whole scene rather than just my object, which I could fix, but I didn’t know if that was the best way to go or not. Which one would be best? I am making a pretty cool new system for the engine; I’ll be emailing some guys about it once I get a little farther along, but I think it will be pretty cool and add a pretty good element to most gameplay types.

Heya,

If it’s for the editor, I’d recommend using an editor viewport client subclass for anything 3d. They are basically a ‘pocket world’, so you can spawn components in there, etc… Without knowing more I can’t say if it makes more sense to use the same viewport for 2d and 3d with a toggle switch (e.g., the Perspective/Front/Side toggle) or to have two separate viewport clients / SViewport widgets.
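
Very roughly, the ‘pocket world’ part looks like this (sketch; GlobeMesh is a placeholder, and the FEditorViewportClient constructor arguments vary a bit between engine versions):

	// A viewport client subclass that renders a preview scene
	class FGlobeViewportClient : public FEditorViewportClient
	{
	public:
		FGlobeViewportClient(FPreviewScene* InPreviewScene)
			: FEditorViewportClient(nullptr, InPreviewScene)
		{
		}
	};

	// When setting up the tab/widget that hosts the 3D view:
	// the preview scene owns its own UWorld; add components to it directly
	// rather than spawning actors in the level.
	FPreviewScene PreviewScene;
	UStaticMeshComponent* GlobeComponent = NewObject<UStaticMeshComponent>();
	GlobeComponent->SetStaticMesh(GlobeMesh);
	PreviewScene.AddComponent(GlobeComponent, FTransform::Identity);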

RE: Apps, do you mean the modes bar? We actually moved towards removing that for the basic BP editor in 4.7 (it still exists for the UMG editor and Persona, which were too complicated to merge). I’d try to find a single screen layout that works if it’s possible, but sometimes things are just too complex (e.g., Persona).

Cheers,