Interesting thing I found out while writing the post: quantum computing people call the states with negative entries 'magic states'. Because they contain the magic! I.e., potential for speedup over classical computing.
-
You can learn more from this post by @earltcampbell: https://earltcampbell.com/research/magic-states/ (but not from me, as I still need to read up on it)
-
I drew a dodgy picture of the geometry of magic states in the Bloch sphere. The magic ones are outside the octahedron. The further a state is from it, the more magic it is. pic.twitter.com/vz4nw273wL
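In Bloch coordinates the octahedron is just the set |x| + |y| + |z| ≤ 1 (its corners are the six eigenstates of the Pauli operators), so checking whether a single-qubit state is magic is a one-liner. A quick sketch, with a hypothetical helper name of my own:

```python
import numpy as np

def is_magic(r, tol=1e-9):
    """A qubit state with Bloch vector r lies outside the stabilizer
    octahedron |x| + |y| + |z| <= 1 exactly when it is magic."""
    return float(np.sum(np.abs(r))) > 1 + tol

print(is_magic([0, 0, 1]))                    # False: an octahedron corner
print(is_magic(np.array([1, 1, 1]) / np.sqrt(3)))  # True: a maximally magic direction
print(is_magic([0.3, 0.3, 0.3]))              # False: a mixed state inside
```

How far |x| + |y| + |z| exceeds 1 is a rough proxy for "how magic" the state is, matching the picture above.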
-
Even the most magic states are nowhere near as negative as the -½ of the toy example. Why not? Well, there are some constraints.
-
The interesting one is to do with information. For the toy example, we got definite Y/N answers to all 3 questions. This time the rule is you can only get max half the information. In a weird sense of 'information' I'll get to soon.
-
There's a very influential toy model that also has this 'half the information' property: https://en.wikipedia.org/wiki/Spekkens_toy_model It's a very simple model that reproduces some, but not all, features of quantum theory. It uses just the states on the six corners of the octahedron. So no magic.
-
Now, there's a fascinating paper by van Enk that builds on the Spekkens toy model, by extending the 'half the information' property to more states: https://arxiv.org/abs/0705.2742 To do this, you need a good definition of information.
-
Most obvious choice is the usual one from information theory, linked to the Shannon entropy: https://en.wikipedia.org/wiki/Entropy_(information_theory) Turns out this isn't the right choice to reproduce QM, though. Instead you want a weirder entropy.
-
Shannon is only one of a whole family of entropies, the Rényi entropies: https://en.wikipedia.org/wiki/R%C3%A9nyi_entropy I wrote some rough notes here that may or may not make sense: http://keerlu.github.io/2018/07/16/renyi-entropy.html Anyway, the one we want is called the collision entropy.
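The whole family fits in a few lines of code: H_α(p) = (1/(1−α)) log₂ Σᵢ pᵢ^α, with Shannon recovered as the α → 1 limit and the collision entropy at α = 2. A small sketch (my own function, nothing from the thread):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in bits.
    alpha -> 1 gives Shannon; alpha = 2 is the collision entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability outcomes
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log2(p))          # Shannon limit
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1))   # Shannon: 1.5 bits
print(renyi_entropy(p, 2))   # collision: -log2(0.375) ≈ 1.415 bits
```

The name comes from the fact that Σᵢ pᵢ² is the probability that two independent samples from p 'collide' (come out the same).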
-
Mathematically this one isn't too bad. The constraint turns out to be that the sum of the squares of the four numbers in the boxes can't be more than half. Conceptually... why the hell is it this one?? I don't know.
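The sum-of-squares bound is easy to play with numerically. For a qubit with Bloch vector (x, y, z), one common convention for the discrete Wigner function gives four box values W = ¼(1 ± x ± z ± y) (other conventions just permute the boxes), and the squares of the four values always sum to at most ½, with equality for pure states. A sketch under that assumed convention:

```python
import itertools
import numpy as np

def wigner_values(r):
    """Four discrete Wigner 'box' values for a qubit with Bloch vector r,
    under one common convention for the phase-point operators."""
    x, y, z = r
    return [0.25 * (1 + sa * x + sb * z + sa * sb * y)
            for sa, sb in itertools.product([1, -1], repeat=2)]

# Octahedron corner: all boxes non-negative, squares sum to exactly 1/2
print(wigner_values((0, 0, 1)))           # [0.5, 0.0, 0.5, 0.0]

# Most magic direction: one box goes negative, to (1 - sqrt(3))/4 ≈ -0.183
r = -np.ones(3) / np.sqrt(3)
w = wigner_values(r)
print(min(w))
print(sum(v**2 for v in w))               # still 0.5: it's a pure state

# A box filling with a -1/2 entry is ruled out: the other three boxes
# must sum to 3/2, so their squares alone sum to at least 3/4.
toy = [0.5, 0.5, 0.5, -0.5]               # illustrative guess at the toy example
print(sum(v**2 for v in toy))             # 1.0 > 0.5, so not allowed
```

So the bound is exactly why even the most magic states stop well short of the toy example's −½.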
-
Apparently this has been independently advocated as a measure of information by Brukner and Zeilinger: https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.83.3354 Unfortunately I don't really follow the argument there, so I'm none the wiser.
-
If you were expecting this to end with a satisfying conclusion... um, no, I'm still confused and have lots of questions. Some of which are here! So let me know if you have any helpful insights or references. pic.twitter.com/pmnWixChAG