I completely agree with Scott that this particular "Grassmannization" isn't equivalent to what supersymmetry does in physics. Supersymmetry is a constraint that picks out a subset of theories – ordinary theories with ordinary bosonic and fermionic fields whose content and interactions are arranged so that there is an extra Grassmann-odd symmetry. Because supersymmetric theories are a subset of more general theories, all the general inequalities that hold for the more general theories automatically hold for supersymmetric theories, too. And there are many new inequalities and conditions that hold specifically for supersymmetric theories – but never fewer constraints.
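Just to illustrate what I mean by extra constraints (a standard textbook example from supersymmetric quantum mechanics, not something from their construction): the algebra relating the supercharge to the Hamiltonian immediately forces the energy to be non-negative,

$$ H = \tfrac{1}{2}\{Q, Q^\dagger\} \quad\Rightarrow\quad \langle\psi|H|\psi\rangle = \tfrac{1}{2}\left(\big\|Q|\psi\rangle\big\|^2 + \big\|Q^\dagger|\psi\rangle\big\|^2\right) \ge 0, $$

a condition that a generic non-supersymmetric theory doesn't have to obey.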
In supersymmetric theories, what becomes Grassmann numbers are never the probability amplitudes. Only particular observables are fermionic operators – operator counterparts of Grassmann-number-valued quantities in classical physics. These fermionic operators only have nonzero matrix elements between Grassmann-odd and Grassmann-even states, for the same reason that bosonic operators only have nonzero matrix elements between states of the same grading. One may introduce a grading on the Hilbert space, but the amplitudes are still complex, commuting $c$-numbers.
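To spell the grading argument out (my notation, not theirs): if $(-1)^F$ is the fermion-parity operator, any fermionic operator $\psi$ anticommutes with it, so

$$ (-1)^F\, \psi\, (-1)^F = -\psi \quad\Rightarrow\quad \langle \mathrm{even}|\,\psi\,|\mathrm{even}\rangle = \langle \mathrm{odd}|\,\psi\,|\mathrm{odd}\rangle = 0, $$

while the matrix elements between states of opposite parity that do survive are still ordinary complex numbers, not Grassmann numbers.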
There's a simple reason why probability amplitudes can't be Grassmann numbers. To get physical commuting quantities out of Grassmann numbers, one always has to integrate. That's why Grassmann variables may appear as integration variables in Feynman's path integral; it's also why they have to be set to zero if we're doing classical physics – there aren't any particular nonzero values of Grassmann numbers. Probability amplitudes, on the other hand, don't have to be integrated; their absolute values are simply squared to obtain the probabilities (or probability densities such as differential cross sections).
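The point may be made explicit with the Berezin integration rules for a single Grassmann variable $\theta$ (the standard definitions, not anything specific to their paper):

$$ \int d\theta\; 1 = 0, \qquad \int d\theta\; \theta = 1, $$

so for $f(\theta) = a + b\,\theta$ the integral $\int d\theta\, f(\theta) = b$ is an ordinary commuting number, while $f(\theta)$ itself has no particular nonzero numerical value whose square could be interpreted as a probability.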
So if their construction is consistent at all, it's just a mathematical analogy of superspaces at a different level – the amplitudes themselves are treated as "superfields", even though in genuine quantum physics the amplitudes are always complex numbers. That's why the resulting inequalities can't be considered analogous to Bell-like inequalities and can't be applied to real physics. In particular, once again, Tsirelson's bound can't be violated by theories just because they're supersymmetric (in the conventional sense, like the MSSM or type IIB string theory), because it may be derived for any quantum theory, supersymmetric or not, and supersymmetric theories are just a submanifold of the more general theories for which the inequality holds.
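For reference, the standard derivation of Tsirelson's bound uses nothing beyond bounded operators on a Hilbert space: with dichotomic observables obeying $A_i^2 = B_j^2 = \mathbb{1}$ and $[A_i, B_j] = 0$, the CHSH operator satisfies

$$ \mathcal{C} = A_0 B_0 + A_0 B_1 + A_1 B_0 - A_1 B_1, \qquad \mathcal{C}^2 = 4\cdot\mathbb{1} - [A_0, A_1]\,[B_0, B_1] \quad\Rightarrow\quad \|\mathcal{C}\| \le 2\sqrt{2}, $$

and nothing in this argument cares whether the theory happens to possess an extra Grassmann-odd symmetry.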
I would point out that it wouldn't be the first time that Michael Duff and collaborators have given wrong interpretations to various objects related to quantum computation. Some formulae for the entropy of black holes mathematically resemble formulae for entangled qubits etc., but the interpretation is completely different. In particular, the actual information carried by a black hole is $A/4G$ nats, i.e. black holes roughly parameterize an $\exp(A/4G)$-dimensional space of microstates. That's very different (by one exponentiation) from what is needed for the quantum-information interpretation of these formulae, in which the charges themselves play the role of the number of microstates.
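In formulae (the standard Bekenstein–Hawking counting, in units with $\hbar = c = k_B = 1$):

$$ S_{BH} = \frac{A}{4G}\ \text{nats}, \qquad \dim\mathcal{H}_{\rm micro} \sim e^{S_{BH}} = e^{A/4G}, $$

so the dimension of the space of microstates is the exponential of the quantity that the entropy formulae compute – which is the "one exponentiation" mismatch mentioned above.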
So I think that at least Michael Duff has been sloppy about the interpretation of these objects, which was the source of his misleading comments about the "black hole entropy formulae emulating tasks in quantum computation". There may be mathematical similarities – I am particularly referring to the Cayley hyperdeterminant appearing both in quantum-computing and black-hole-entropy formulae – but the black holes aren't really models of those quantum algorithms, because their actual Hilbert-space dimension is the exponential of what it should be for that interpretation, and they're manipulating pretty much all the qubits at the same moment. The objects in the hyperdeterminant have completely different interpretations on the string-theory and quantum-computing sides; there isn't any physical duality here, either.
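For concreteness, the object in question is Cayley's hyperdeterminant of a $2\times 2\times 2$ array $a_{ijk}$; if I recall correctly, it enters the three-tangle of three qubits as $\tau = 4\,|\mathrm{Det}\,a|$ and the entropy of the extremal STU black hole as $S = \pi\sqrt{|\mathrm{Det}\,a|}$, which is the resemblance being discussed:

$$ \mathrm{Det}\,a = a_{000}^2 a_{111}^2 + a_{001}^2 a_{110}^2 + a_{010}^2 a_{101}^2 + a_{100}^2 a_{011}^2 - 2\left(a_{000}a_{001}a_{110}a_{111} + a_{000}a_{010}a_{101}a_{111} + a_{000}a_{100}a_{011}a_{111} + a_{001}a_{010}a_{101}a_{110} + a_{001}a_{100}a_{011}a_{110} + a_{010}a_{100}a_{011}a_{101}\right) + 4\left(a_{000}a_{011}a_{101}a_{110} + a_{001}a_{010}a_{100}a_{111}\right). $$

The coincidence of the formula by itself says nothing about a physical identification of the two systems.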
This post imported from StackExchange Physics at 2014-07-24 15:47 (UCT), posted by SE-user Luboš Motl