Concurrency and read / write (set) operations

  • How does the set operation work under the hood?
  • How does reading a stored property work?

While the second question sounds trivial, it really isn't. I struggle to wrap my head around the implicit structured concurrency model Verse uses.

Imagine two async tasks referencing the same mutable storage.

Both execute the following operations (sketched in Verse right after this list):

  • read the array from the store into a local copy
  • randomly modify that copy
  • set / write the modified copy back into the store
  • read the value from the store again
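
A rough Verse sketch of what I mean (the device class, the Numbers array and the helper function are made-up names, purely for illustration):

using { /Fortnite.com/Devices }
using { /Verse.org/Random }
using { /UnrealEngine.com/Temporary/Diagnostics }

# Hypothetical sketch of the scenario above - all names are made up.
shared_store_device := class(creative_device):

    # The shared mutable storage both tasks reference.
    var Numbers : []int = array{}

    OnBegin<override>()<suspends> : void =
        # Two tasks, same steps, same storage.
        spawn{ ReadModifyWrite("Task A") }
        spawn{ ReadModifyWrite("Task B") }

    ReadModifyWrite(Label : string)<suspends> : void =
        # 1. read the array from the store into a local copy
        var LocalCopy : []int = Numbers
        # 2. randomly modify that copy
        set LocalCopy += array{GetRandomInt(0, 100)}
        # 3. set / write the modified copy back into the store
        set Numbers = LocalCopy
        # 4. read the value from the store again
        ReadBack := Numbers
        # Is ReadBack guaranteed to equal LocalCopy here, or could the other
        # task have written in between? That is exactly what I am asking.
        Print("{Label}: wrote {LocalCopy.Length}, read back {ReadBack.Length}")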

All of those operations look synchronous, but are they really?
What if both tasks write at the same time? Which one wins is undefined, and that's expected behavior. But what about the follow-up read operation? Naively you'd expect the value read from the store to equal the version we just wrote, but is that really a valid assumption?

Hence the original questions: could set be another potential suspension(-like) point in the code, and must the follow-up read operation therefore expect reentrancy?!

Verse is single-threaded - the concurrency model is exactly the same as JavaScript’s Promises (the same with reading/writing variables)

Do you have a reference to any docs stating that Verse is single-threaded? I keep thinking the same, as I feel I read that somewhere, but I can't find any official documentation on it.

I don’t have any links since I’ve only seen minor references to it across the forums, but think about this:

# Wrapped in a function so it compiles; Time is whatever duration you like.
RaceExample(Time : float)<suspends> : void =
    race:
        block:  # runs first, up until Sleep suspends it
            Print("First")
            Sleep(Time)
        block:  # only starts once the previous block has suspended
            Print("Second")
            Sleep(Time)
        block:
            Print("Third")
            Sleep(Time)

The expressions in the race block run sequentially: the next expression only runs once the previous one has suspended (via Sleep in this example). It's the same with sync, which runs its expressions “at the same time” but in practice starts each one, top to bottom, only after the previous one has suspended or completed.
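
For comparison, a sync version of the same idea (same hypothetical Time parameter, and assuming Sleep and Print are available as usual) interleaves at the same suspension points but only finishes once every block has completed:

SyncExample(Time : float)<suspends> : void =
    sync:
        block:  # runs first, until Sleep suspends it
            Print("First")
            Sleep(Time)
        block:  # starts once the previous block has suspended
            Print("Second")
            Sleep(Time)
    # unlike race, sync only completes after all of its blocks have completed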

spawn is the same concept - it doesn't create a new thread that runs the spawned function in parallel with the code following the spawn. Instead, it runs the spawned function on the main thread until it hits its first suspending expression, and only then lets the scope that called spawn continue (like calling an async JavaScript function without await).
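
A sketch of that ordering (function names are made up, same assumptions about Sleep and Print):

SpawnedWork()<suspends> : void =
    Print("2: spawned code runs right away, still on the main thread")
    Sleep(1.0)  # first suspending expression - control returns to the caller
    Print("4: resumes on a later tick")

Caller() : void =
    Print("1: before spawn")
    spawn{ SpawnedWork() }
    Print("3: caller continues once the spawned function has suspended")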

If Verse weren't single-threaded, expressions like race or spawn would produce very different results because of race conditions, especially when coupled with mutation.

Being in a single-threaded environment is extremely taxing. Right now the simulation runs at 30 Hz, but if that increases to 60 Hz we'll have even less time budget to finish a given computation per update cycle. I don't understand how the roadmap envisions us building AAA games / simulations in a few years with such a constraint. Sometimes you have to perform a computationally heavy, long-running task that can't be broken up by suspension, and in the current environment that implies all other work is blocked as soon as such a computation starts.

Let's assume for a moment that Verse is not single-threaded. That would make the point from my original post clear: the 4th operation, reading the variable back, could in either task return a value different from what that task just assigned, because the other task could have overwritten it in the meantime. In a multi-threaded world, Verse's shared mutable state would become unsafe. Sure, there are transactions that would do the locking for us, but that still wouldn't prevent race conditions.

The problem I have with all this is that I'm looking for official clarification, so I can safely adjust my expectations of Verse's concurrency model.

Yeah, you can hit the maximum run time for the event loop pretty easily if you're looping through a large multidimensional array and calling its extension methods like Find or RemoveElement - that blocks everything else, including Fortnite actions like moving or shooting, if it takes longer than a tick. And yes, the read-back surprises you describe would happen if Verse were multi-threaded, which is why it's pretty clear that it's not.
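
For example, something along these lines (sizes and names are made up) has no suspension point anywhere, so the rest of the game waits until it returns:

# Hypothetical: scanning a big nested array, calling Find on every row.
CountRowsContaining(Grid : [][]int, Target : int) : int =
    var Matches : int = 0
    for (Row : Grid):
        if (Row.Find[Target]):
            set Matches += 1
    Matches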

Tim Sweeney said in the Lambda Days presentation that his future vision for Verse at its massive scale is for it to simulate a single-threaded environment even when interacting with other people’s code, so I think it’s safe to say that it being single-threaded is a firm design decision.

Let's hope it's just an initial limitation and we'll get some kind of explicit but still safe opt-out. Otherwise I can't see how this can succeed, because performance would depend entirely on the computation speed of the simulation backend: if that isn't upgraded consistently, there will be no way to improve a simulation's performance. Furthermore, there doesn't seem to be much we can do on our end to optimize certain use cases, since some of them will only ever work well in a truly parallel environment.

Take a look at any mobile application: you should only perform lightweight or UI work on the main thread / queue and offload everything else to background threads, otherwise you'll likely get hitches and bad UX. Games aren't that much different.

This is mentioned in the March 2023 State of Unreal presentation, which indicates to me that it's only an initial limitation.

Nice find!

It was also mentioned in one of the talks that, for now, Verse uses the Blueprints VM under the hood (which is single-threaded).