so exciting!
No updates on this yet. I promise we're still pulling for it though.
I am still excited about this! Will it ever happen? I am asking because Zak is one of the best people to do it. It's like Feng Zhu in art. Really!
He usually heads straight to the point. He describes practical examples. He has notes (in his notebook), so you can see that he is prepared. And he is very practical.
Please, please, please!!! His videos and preparation gave me more experience than a book or your documentation...
And I've been a developer for 13 years. You should cancel his holidays and order him another crunch time because of this video. What can be more satisfying than serving people? :p:p:p:D
@N_ck-cz Yes! We're going to be doing it this week.
Thanks. Christmas is coming early!
I'll ask my question during the live stream, but just in case:
As a designer who is learning Blueprints for prototyping and trying to improve my programming skills, does Zak have any tips on how he goes about planning his Blueprints before touching the computer? Sort of like sketching or writing pseudo-code?
Excited! I wish there were a way to go meta with the Blueprint system from within a game. How would one go about adding a live-coding feature within a game?
Any comments on meta programming through the complex forest that is the Unreal Engine API would be helpful.
Example: if we wanted to collect metadata from the user's editor session and feed that into a learning database (done), then how could we send that input back into new sessions of the Unreal Engine editor?
We want to recreate that DOTA learning system, but for observing game developers and acting as a sort of Jarvis system. The TensorFlow plugin by Getnamo rocks, but we simply lack the CS background to undertake this, so can anyone help a bunch of scientists MacGyver something like this?
Our idea is to observe collaboration sessions between users, where users each possess editor abilities and can build together in one space. Originally we thought to simply include the VREditor Actor in the Cardinal Menu Project, but that didn't quite work out.
Where would you folks start, as a question of best practices for this sort of work/coding?
You could play with this by making a live-coding function that simply allows text input, then takes that text and uses it as another node's input. You use this all the time when you make interfaces... you just have to take that concept further. It should be possible to build a type of modular interface that takes Blueprint nodes and uses them to create widgets that essentially function as the same thing, then make the changes propagate from the widget to the real Blueprint node.
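As a minimal sketch of that "text input drives another node's input" idea (the library and function names here are invented, not an existing API): a Blueprint-callable helper can turn whatever the player typed into a value any other node can consume, wired up from something like an editable text widget's OnTextCommitted event.

    // Hypothetical helper for feeding typed text into other Blueprint nodes.
    #include "Kismet/BlueprintFunctionLibrary.h"
    #include "LiveTweakLibrary.generated.h"

    UCLASS()
    class ULiveTweakLibrary : public UBlueprintFunctionLibrary
    {
        GENERATED_BODY()

    public:
        // Converts raw typed text into a float another node can consume,
        // falling back to a default when the text is not numeric.
        UFUNCTION(BlueprintCallable, Category = "LiveTweak")
        static float ParseTweakValue(const FString& TypedText, float DefaultValue)
        {
            return TypedText.IsNumeric() ? FCString::Atof(*TypedText) : DefaultValue;
        }
    };

From there the same pattern scales up: the widget just needs to know which property or node input its parsed value should drive.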
Making a shared editor though, that's a doozy... Since Unreal still executes every Blueprint sequentially, just like a line of code, you can run into trouble when you have to worry about replication. You'd be better off brainstorming on exactly WHAT concepts you want to have at play or play with, and going from there. The original Garry's Mod is a perfect example. If you haven't heard of it, I would check it out.
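To give a feel for the replication plumbing a shared-building feature ends up needing (class and function names here are hypothetical, just a sketch of the usual server-authoritative pattern): the owning client asks the server to make the change, and the spawned, replicated actor shows up for everyone.

    // Sketch of a server-authoritative "place a block" request.
    #include "GameFramework/Pawn.h"
    #include "Engine/World.h"
    #include "BuilderPawn.generated.h"

    UCLASS()
    class ABuilderPawn : public APawn
    {
        GENERATED_BODY()

    public:
        // Called by the owning client, runs on the server.
        UFUNCTION(Server, Reliable, WithValidation)
        void ServerPlaceBlock(TSubclassOf<AActor> BlockClass, FTransform Where);
    };

    void ABuilderPawn::ServerPlaceBlock_Implementation(TSubclassOf<AActor> BlockClass, FTransform Where)
    {
        // As long as BlockClass replicates (bReplicates = true), every connected client sees the new block.
        GetWorld()->SpawnActor<AActor>(BlockClass, Where);
    }

    bool ABuilderPawn::ServerPlaceBlock_Validate(TSubclassOf<AActor> BlockClass, FTransform Where)
    {
        return BlockClass != nullptr;
    }

Everything each user does has to funnel through a path like that, which is why deciding WHAT you want shared is the real first step.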
Zak - for real? I thought this stream was just a myth. We'll see if you're real or not, Stream... we'll see!
Ha ha, it seems surreal that an internet event is canceled due to weather.
It is a reminder for all of us (well, at least me) that even on the internet, it all happens at a real location with real people!
Whoever handles the Twitter feed didn't get the memo; it looks like they posted the announcement an hour after you posted the cancellation. Heads will roll, Amanda's gonna kick butt.
Hey Everyone!
Super sorry we ran out of time on today's livestream. I really tried to fit everything in, but it just couldn't happen.
But... GOOD NEWS, EVERYONE!*
After the stream I went back to my desk and re-recorded the WHOLE thing as an impromptu training video. This way you don't have to jump back and forth or wait. You can watch it all right now (or as soon as YouTube is done processing it).
This is on my private YT channel, since it's kind of a non-pro video. It's not studio quality; it's just me with a headset.
Ask questions here and I'll answer as best I can. Maybe those of you who know the answer already can help me out!
Love to everyone! I hope this is helpful!
*Most of you probably hear Prof. Farnsworth when you read that... I hear Prof. Putricide. Make of that what you will.
Zak, you are the best! Thank you very much!
Thanks, I enjoyed part one and am looking forward to more!
Thanks also for pointing out the "trick" of disabling all BPs' ticking by default (because even if you are not using an Event Tick, they still "tick")!
In a small project it's obviously easy to fix this by unticking the "Start with Tick Enabled" option in each new Blueprint class, but could someone please clarify which setting you need to use and which INI file you can set it in? This seems the most sensible way to go, i.e. turning it off for everything unless you explicitly turn it on.
[I made a quick attempt, ahem! I added "bCanBlueprintsTickByDefault=False" to "EngineSettings.ini" but that didn't seem to do anything. I've also tried turning it off in "Project Settings" -> "General Settings" -> "Can Blueprints Tick by Default", but opening a new Blueprint Class still seems to show that Tick is enabled by default.]
With regard to tips and tricks like the above: are these collated anywhere? In my case I'd never have found this out if I hadn't watched this particular video.
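On the INI question: I may be wrong about the exact details, but as far as I know bCanBlueprintsTickByDefault is a config property on the engine class, so it belongs in your project's Config/DefaultEngine.ini under the engine section rather than an "EngineSettings.ini", something like:

    [/Script/Engine.Engine]
    bCanBlueprintsTickByDefault=False

Also, as I understand it, that flag governs whether Blueprint classes actually get registered to tick, not the "Start with Tick Enabled" checkbox you see in Class Defaults, which might be why the checkbox still shows as enabled after you change the setting.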
Regarding various ways of implementing hover and interaction without Tick.
I recently went about this using a combination spring arm/collision sphere parented to the player camera, working with a similar BPI implementation. It has worked out well and kept me out of Tick, instead using the "on hit" and "begin/end overlap" events of my collision component.
I don't have to use any timers, which seems beneficial.
I'm just wondering if there are glaring performance or other problems with my proposed method that jump out to veterans. Perhaps spring arms are more intensive than drawing a trace?
I'm definitely not a veteran and I have no idea if any of the following is correct, but I'd imagine that using a collision sphere isn't much different from using a sphere trace in tick. The collision of your sphere has to be checked somehow, after all. Spring arms use collision and all kinds of fancy extra features as well, so they're probably even more expensive.
Based on what I've been told in the past, line traces are generally the cheapest method of probing the environment by far.
I guess the question seems kind of silly; of course a trace is going to be super low on resources. My thinking was that Epic probably optimized the heck out of components like collision spheres and spring arms, and that using them would piggyback off those components rather than tracing every frame or every 0.2 seconds or doing some math.
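For anyone comparing approaches, here is a rough sketch of the trace-on-a-timer version (class and function names are invented for the example): the probe runs five times a second on a timer instead of on Tick, and a line trace checks what the camera is looking at.

    // Hypothetical character that probes for interactables on a timer instead of Tick.
    #include "GameFramework/Character.h"
    #include "TimerManager.h"
    #include "Engine/World.h"
    #include "InteractProbeCharacter.generated.h"

    UCLASS()
    class AInteractProbeCharacter : public ACharacter
    {
        GENERATED_BODY()

    public:
        AInteractProbeCharacter()
        {
            // No per-frame work needed; the timer drives the probe.
            PrimaryActorTick.bCanEverTick = false;
        }

    protected:
        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            // Probe five times a second instead of every frame.
            GetWorldTimerManager().SetTimer(
                TraceTimerHandle, this, &AInteractProbeCharacter::TraceForInteractable, 0.2f, true);
        }

        void TraceForInteractable()
        {
            const FVector Start = GetPawnViewLocation();
            const FVector End = Start + GetViewRotation().Vector() * MaxDetectionDistance;

            FHitResult Hit;
            FCollisionQueryParams Params(FName(TEXT("InteractProbe")), /*bTraceComplex=*/false, this);
            if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
            {
                AActor* HitActor = Hit.GetActor();
                if (HitActor && HitActor->ActorHasTag(TEXT("Interactable")))
                {
                    // Highlight, swap the crosshair colour, etc.
                }
            }
        }

        FTimerHandle TraceTimerHandle;
        float MaxDetectionDistance = 500.f;
    };

The timer version still does a little work every 0.2 seconds, while the overlap version is purely event-driven, so which wins probably depends on how many interactables are around.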
@franktech I've only done three iterations or so, but my setup, working from something like the 3rd person pawn example, is like...
Capsule
-Mesh
--CameraSpringArm
---PawnCamera
----DetectionSpringArm (arm length = max detection distance)
-----DetectionCollisionSphere (collision channel tuned for my interactable actors)
In that last collision sphere I use the On Component Begin/End Overlap events, check if my other actor has a tag I want (even though it won't collide with non-interactables thanks to collision channels, I double check), check if this actor is already the subject of our focus, and if so I keep/toggle my post-process effect, change crosshair colors, or do whatever magic. If the player is still and the mouse isn't moving, and an object passes by, the "detection spring arm" shortens (its collision is with visible objects), and the "detection collision sphere" gets overlapped if the object is in its collision channel and highlights or doesn't accordingly.
I'm still learning, maybe it's totally goofy; I was just wondering, and this is about best practices, and I'm all about avoiding the Noid... err, I mean Tick. Fun fact: you can disable scene components' individual Ticks too.
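In case a text version helps anyone compare notes, here is a rough C++ equivalent of that hierarchy (class and component names are made up; the Blueprint version does the same thing through the Components panel and Class Defaults):

    // Hypothetical character mirroring the Capsule > CameraSpringArm > PawnCamera > DetectionSpringArm > DetectionSphere setup.
    #include "GameFramework/Character.h"
    #include "GameFramework/SpringArmComponent.h"
    #include "Camera/CameraComponent.h"
    #include "Components/CapsuleComponent.h"
    #include "Components/SphereComponent.h"
    #include "DetectionCharacter.generated.h"

    UCLASS()
    class ADetectionCharacter : public ACharacter
    {
        GENERATED_BODY()

    public:
        ADetectionCharacter()
        {
            PrimaryActorTick.bCanEverTick = false;

            CameraSpringArm = CreateDefaultSubobject<USpringArmComponent>(TEXT("CameraSpringArm"));
            CameraSpringArm->SetupAttachment(GetCapsuleComponent());

            PawnCamera = CreateDefaultSubobject<UCameraComponent>(TEXT("PawnCamera"));
            PawnCamera->SetupAttachment(CameraSpringArm);

            // Arm length doubles as the max detection distance.
            DetectionSpringArm = CreateDefaultSubobject<USpringArmComponent>(TEXT("DetectionSpringArm"));
            DetectionSpringArm->SetupAttachment(PawnCamera);
            DetectionSpringArm->TargetArmLength = 500.f;

            DetectionSphere = CreateDefaultSubobject<USphereComponent>(TEXT("DetectionSphere"));
            DetectionSphere->SetupAttachment(DetectionSpringArm);

            // Per the "fun fact" above: scene components have their own tick that can be switched off too.
            DetectionSphere->PrimaryComponentTick.bCanEverTick = false;
        }

    protected:
        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            DetectionSphere->OnComponentBeginOverlap.AddDynamic(this, &ADetectionCharacter::OnDetectionBegin);
            DetectionSphere->OnComponentEndOverlap.AddDynamic(this, &ADetectionCharacter::OnDetectionEnd);
        }

        UFUNCTION()
        void OnDetectionBegin(UPrimitiveComponent* OverlappedComp, AActor* OtherActor, UPrimitiveComponent* OtherComp,
                              int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult)
        {
            // Double-check the tag even though the collision channel already filters non-interactables.
            if (OtherActor && OtherActor->ActorHasTag(TEXT("Interactable")))
            {
                // Toggle the post-process highlight, change the crosshair colour, etc.
            }
        }

        UFUNCTION()
        void OnDetectionEnd(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                            UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
        {
            // Clear the highlight.
        }

        UPROPERTY() USpringArmComponent* CameraSpringArm;
        UPROPERTY() UCameraComponent* PawnCamera;
        UPROPERTY() USpringArmComponent* DetectionSpringArm;
        UPROPERTY() USphereComponent* DetectionSphere;
    };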