Pure blueprint function changes signature when overridden to non-pure

Subclass override behavior for pure blueprint functions is non-obvious, and possibly not working as intended:

  1. Create a blueprint class. Pick any parent you like; I used Actor since it was under my mouse.
  2. Create a pure function with no inputs and one output (whatever type you like).
  3. Compile it.
  4. Create another blueprint, this time set the first one as the parent.
  5. In the event graph, place a call to the pure function. Note that it does not require execution flow as an input.
  6. Right-click that function under “Overridable Functions” in the “My Blueprint” window and hit implement.
  7. Delete the call to the function from the event graph (or don’t; you can leave it around, but that might break things).
  8. Place a new call to that function in the event graph. Note that there are now two autocomplete options for it, and both yield visually identical blueprint nodes. Note also that the resulting node requires an execution input and a target object on which to call the function.

Is it possible to override a pure function that simply computes some data? Or, if the problem here is on the calling end, is it no longer possible to call a virtual pure function the same way a non-virtual one is called, without execution pins?

In my case I’m using it for a glorified property, so I can work around it, but I can see this being painful for folks who use arity-zero pure functions as a sort of computed property. In any case, I’ve tagged this one as both a bug report and a feature request, since I have a feeling the answer to the “Is this a bug?” question is “Working as intended.”

EDIT: A quick note – I realize this one probably comes down to my ignorance of K2 implementation details, so my apologies if it’s just a wrongheaded approach or question.