This is up and running as a C++ library. Currently transactional variable reads and writes are ~50x slower than ordinary variables. 40% of the overhead is transaction bookkeeping, 40% is concurrent garbage collection bookkeeping, and 20% is platform atomics. Many optimizations to go. https://twitter.com/TimSweeneyEpic/status/1210260682605764611
This amount of overhead may be acceptable, as you'd only use transactional variables for shared globally-visible state (like properties of game objects) and not locals. If performance is dominated by low-level operations like collision detection, this may be negligible.
For comparison, UE3 UnrealScript and pre-il2cpp Unity Mono bytecode interpretation were ~30x slower than native.
One great side-effect of transactions is that you can automatically undo variable writes upon failure, for example so you can write something like "if(move a bunch of actors) ... else ...", and if any operations fail, then all of their effects are undone.
I've had to re-learn my code optimization intuition from the late 90's. C++ compiler optimization is magic now, and Skylake can reliably issue 4-6 instructions per clock. But control flow misprediction has become wildly expensive.
Anyway, there are two competing theories on how we'll unlock higher performance through parallelism. One is the data-oriented design approach, asking programmers to rewrite gameplay code as highly parallel fragments of algorithms that pipe inputs and outputs among stages.
The other is transactions, hoping we can just write gameplay code using "var<int> Health;" instead of "int health;", write code to minimize unnecessary contention for shared state, and have the engine and API magically sort out concurrency for us.
Replying to @TimSweeneyEpic
Wouldn't this lead to problems where order of operations is important? ie +health from pickup and -health from damage. Would the concurrent operations library ensure consistency, or just leave it indeterminate?
Replying to @CVbMG1U1
Transactions ensure that each gameplay object update is atomic, consistent, isolated, and durable. Transactions are just a way to have a lot of threads collaboratively contribute to outcomes that are indistinguishable from a single thread running each update sequentially.
In the simplest possible terms, we can run two updates in parallel but instead of writing to global memory, we track what we're going to write. At the end, if there were any read-write conflicts, we commit one to memory and revert the other.