Perhaps the most valuable result here is how hard it is to get this anywhere near right. CPU vendors MUST acknowledge and fix Spectre v1. https://twitter.com/chandlerc1024/status/977139103849316353
Replying to @RichFelker
One could have “speculation ways” throughout the “entire cache hierarchy” to prevent evictions so that rollbacks have no visible microarchitectural side effects, but that seems like a hard problem. It would need speculative read, commit, and rollback in the cache coherence protocol.
Replying to @corkmork
An easy (proposed, needs a check for completeness) solution is just adding an MSR bit to disable fetch during speculation and letting users who care about security flip it.
Replying to @RichFelker
Yeah, I thought about that. As far as I can tell the first load is fine; it’s the second speculated load, using the value from the first load, that leaks data. They could have a bit in the issue window marking that a register value is the result of a speculated load, as a barrier for the second load.
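(For reference, a minimal C sketch of the two-load pattern being discussed, following the bounds-check-bypass gadget from the Spectre paper; the array names and sizes are illustrative, not from this thread.)

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical globals standing in for attacker-reachable state. */
size_t  array1_size = 16;
uint8_t array1[16];
uint8_t array2[256 * 4096];   /* probe array: one cache line per possible byte value */

/* Classic Spectre v1 gadget: the bounds check is predicted taken, so both
 * loads can execute speculatively even when x is out of bounds. Load 1 reads
 * the secret byte; load 2 uses it as an index, so the cache line it touches
 * encodes the secret value even after the speculation is rolled back. */
void victim(size_t x) {
    if (x < array1_size) {
        uint8_t secret = array1[x];                   /* first (speculative) load */
        volatile uint8_t tmp = array2[secret * 4096]; /* second, data-dependent load */
        (void)tmp;
    }
}
```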
Replying to @corkmork
I think even the first load is potentially dangerous. If a register contains a (non-pointer) value that's private and you trick branch prediction into branching somewhere it's used as an index or pointer, part of its value potentially leaks.
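(A hedged sketch of that scenario: the secret never has to be loaded during speculation at all; if it already sits in a register and mispredicted control flow reaches code that uses it as an index, a single speculative load leaves a secret-dependent cache footprint. The names here are hypothetical.)

```c
#include <stdint.h>

/* Hypothetical attacker-flushable probe array: one cache line per byte value. */
uint8_t probe[256 * 4096];

/* 'secret' is an ordinary in-register value (e.g. a key byte), not the result
 * of any load. If branch prediction steers execution here down the wrong path
 * or with the wrong argument, the single speculative load below still touches
 * a cache line chosen by the secret. */
void uses_value_as_index(uint8_t secret) {
    volatile uint8_t tmp = probe[secret * 4096];  /* one speculative load leaks */
    (void)tmp;
}
```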
Replying to @RichFelker
Possibly, but Spectre variant 1 uses a data-dependent load to leak data, i.e. two loads, with the second load strided to cause a particular cache eviction. Still curious what the performance hit would be to disallow all speculative loads.
My claim (no data, but strong intuition) is that the hit would be minimal on most loads people care about, since most loads (pardon the pun) hit either L1 (no penalty) or at least L2/L3 (a minimal stall compared to going to DRAM).
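(One rough way to put a number on that intuition, sketched here rather than taken from the thread: time a load-heavy loop with and without an lfence before each load, using the fence as a crude stand-in for "this load may not issue until speculation resolves." This overstates the cost, since lfence serializes all instruction issue rather than just loads, but it gives an upper bound, and the array size can be shrunk to target L1, L2/L3, or left large to hit DRAM. Assumes x86 with SSE2 intrinsics available.)

```c
#define _POSIX_C_SOURCE 199309L
#include <emmintrin.h>   /* _mm_lfence */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 22)     /* 4M uint32_t = 16 MiB; shrink to fit L1/L2/L3 */

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void) {
    uint32_t *a = malloc(N * sizeof *a);
    if (!a) return 1;
    for (uint32_t i = 0; i < N; i++) a[i] = i;

    /* Baseline: the CPU may overlap and speculate these loads freely. */
    uint64_t sum = 0;
    double t0 = now_sec();
    for (uint32_t i = 0; i < N; i++) sum += a[i];
    double t1 = now_sec();

    /* Fenced: lfence before each load as a crude "no speculative load" proxy. */
    uint64_t fenced_sum = 0;
    for (uint32_t i = 0; i < N; i++) { _mm_lfence(); fenced_sum += a[i]; }
    double t2 = now_sec();

    printf("baseline %.2f ms, fenced %.2f ms (sums %llu %llu)\n",
           (t1 - t0) * 1e3, (t2 - t1) * 1e3,
           (unsigned long long)sum, (unsigned long long)fenced_sum);
    free(a);
    return 0;
}
```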