
  Grassmann variables and the Theory of fermions

+ 4 like - 0 dislike
21994 views

I am looking to understand exactly how and why Grassmann variables are introduced into the theory of fermions.

I've read various accounts of the subject in textbooks. While I understand what is written there, I still feel some discomfort when dealing with Grassmann variables.

Most recently I've looked at this article

http://www.int.washington.edu/users/dbkaplan/571_14/Fermion_Path_Integration.pdf

It introduces Grassmann variables as eigenvalues for coherent states of the creation and annihilation operators.

It appears to me that if we introduce the creation and annihilation matrices in the usual sense 

\(b^\dagger = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \)

\(b = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}\)

and then try to solve the eigenvalue equation for coherent states, \(b|\psi\rangle = \psi |\psi\rangle\), for an ordinary (real or complex) number \(\psi\), we run into the trouble that these equations have no nonzero solution. (Please correct me if I'm wrong.)
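A quick numerical check of this claim (a sketch of mine, not from the original post): the \(2\times 2\) annihilation matrix \(b\) is nilpotent, so its only eigenvalue is \(0\), and \(b|\psi\rangle = \psi|\psi\rangle\) with an ordinary nonzero number \(\psi\) indeed has no solution.

```python
import numpy as np

# annihilation operator in the {|0>, |1>} basis, as written in the question
b = np.array([[0.0, 0.0],
              [1.0, 0.0]])

# b is nilpotent (b^2 = 0), so both of its eigenvalues are 0:
eigvals = np.linalg.eigvals(b)
assert np.allclose(b @ b, 0)
assert np.allclose(eigvals, 0.0)

# Direct check: b (a, c)^T = psi (a, c)^T reads (0, a) = (psi*a, psi*c),
# so for psi != 0 we get a = 0 and then c = 0 -- only the trivial solution.
```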

The only way out seems to be to introduce Grassmann variables, as suggested in the reference.

One of the questions I want to understand is: why are Grassmann variables inevitable in physics? I also want to understand how to connect Grassmann variables to experience.

If anyone has any insights into, or references on, Grassmann variables beyond the standard textbook material, I would very much appreciate it.

asked Sep 24, 2016 in Theoretical Physics by Prathyush (705 points) [ revision history ]
edited Sep 25, 2016 by Prathyush
Most voted comments

Grassmann numbers are the classical limit of fermions. For example, in quantum mechanics the canonical anticommutation relation of fermions is $\left\{\phi^{a}(t,x),\phi^{b}(t,y)\right\}=O(\hbar)$. Taking the classical limit $\hbar\rightarrow 0$, we obtain anticommuting Grassmann numbers.

In mathematical language, the quantum fermions can be regarded as the Clifford algebra, while the Grassmann numbers form the Grassmann algebra. The former is the quantization of the latter. 

@XIaoyiJing Thanks, I'll read the references you enclosed; if I have questions I'll come back. I was skeptical of Vladimir's post also. The paper I attached, Relativistic dynamics of spin in strong external fields, clearly implies that the classical approximation to an electron is a spinning particle.

@VladimirKalitvianski I said it is a spinning particle (which is a classical object). It has a magnetic dipole moment and the usual spin algebra, which is intrinsically quantum mechanical.

@Vladimir If you encounter that problem again, you can simply click the "add comment" button once more, and your unsent draft will fill the space.

@dimension10, @dilaton: Thank you for your suggestions. I believe I clicked "Cancel" - my comment is absent even in my submission history.

Most recent comments

I'm a complete newbie to PhysicsOverflow, and I am not sure I did this right: there should be a full answer now visible at the very bottom. Hope it helps.

@Arnold Neumaier My brain refuses to work. I will reconsider your comment a few days later. Thanks a lot.

3 Answers

+ 4 like - 0 dislike

Perhaps this will help. From the Pauli exclusion principle, we know that fermion wave functions have to be "odd" or "anti-symmetric". Thus, for example, if \(\psi(x,y)\) describes two fermions, then one must have \(\psi(x,y)=-\psi(y,x)\) under the exchange of the coordinates of the two particles. Moving to quantum field theory, we want to be able to describe N-fermion states, and we also want to be able to create and destroy them: e.g. destroy a fermion moving in one direction, and create one moving in another direction. Alternatively, maybe destroy an electron-positron pair, and create a pair of photons. This forces the use of the concept of Fock space, and the use of creation/annihilation operators. The entire process of setting this up is called "second quantization".

The issue is that the two operators you wrote down are misleadingly over-simplified: they are not enough to create an N-particle state; they are not enough to create a state where the N particles have distinct momenta, or maybe distinct positions. To do it properly, you have to work with ladder operators that do this. Now when you do so, and you create an N-fermion state, it must be completely anti-symmetric under the interchange of particle indices: exchanging any pair must flip the sign of the wave function: this is the Pauli exclusion principle at work -- this is what it demands.
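One standard way to build such multi-mode ladder operators explicitly is the Jordan-Wigner construction (my own illustration, not part of the answer): string a \(\sigma_z\) in front of the single-mode annihilator for each earlier mode. The resulting operators satisfy the canonical anticommutation relations, and the two-particle states they create flip sign under exchange, exactly as the answer demands.

```python
import numpy as np

I  = np.eye(2)
Z  = np.diag([1.0, -1.0])
sm = np.array([[0.0, 1.0], [0.0, 0.0]])  # single-mode annihilator

def kron_all(mats):
    out = np.eye(1)
    for m in mats:
        out = np.kron(out, m)
    return out

def c(j, n):
    """Annihilation operator for mode j of n: a Z-string, then sm, then identities."""
    return kron_all([Z] * j + [sm] + [I] * (n - j - 1))

n = 3
ops = [c(j, n) for j in range(n)]
for i in range(n):
    for j in range(n):
        # {c_i, c_j^dag} = delta_ij and {c_i, c_j} = 0
        anti = ops[i] @ ops[j].conj().T + ops[j].conj().T @ ops[i]
        assert np.allclose(anti, np.eye(2 ** n) * (i == j))
        assert np.allclose(ops[i] @ ops[j] + ops[j] @ ops[i], 0)

# Antisymmetry of the two-fermion state: c_0^dag c_1^dag |0> = - c_1^dag c_0^dag |0>
vac = np.zeros(2 ** n); vac[0] = 1.0
s01 = ops[0].conj().T @ ops[1].conj().T @ vac
s10 = ops[1].conj().T @ ops[0].conj().T @ vac
assert np.allclose(s01, -s10)
```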

The \(b\) and \(b^\dagger\) operators you wrote are just enough to construct exactly one fermion in one specific momentum-state (or position-state, etc.). That is why they square to zero -- you can't have two fermions in the same state. But there is no clue, in the simplistic way that they are written, that the resulting particle states will be anti-symmetric. (Caution: the ladder operators themselves are not Grassmann variables. The Grassmann variables discussed here are to be used exclusively for the wave functions. What's more, the Grassmann numbers themselves are not spinors! This is a common source of confusion, especially if you wish to get \(\overline{\psi}\psi\) as a non-vanishing quantity; see the end-note.)

The Grassmann variables are a book-keeping device that helps you keep track of the sign during any calculations. Swap two of them, and the sign changes. You don't have to use them, but if you don't, you will probably make more errors. The anti-commuting might seem strange, but anti-commutation is very common in differential geometry: two differential forms anti-commute. The correct way to think of Grassmann numbers (see Wikipedia) is to pretend they are differential forms on some manifold, but then throw away/forget the manifold, and throw away/forget that they are derivatives: all you have left is an abstract algebraic device that anti-commutes, and does nothing more. Equivalently, you can pretend that they are actual fermion wave-functions, and then completely ignore/forget that they have a position, a momentum, a spin, or even that they are wave-functions at all -- and keep only the fact that they anti-commute. That's it. End of story.
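The "pure sign-bookkeeping" view can be made concrete in a few lines of code (a minimal sketch of mine, not from the answer): represent a Grassmann element as a dictionary from sorted generator-index tuples to coefficients, and let multiplication count the transpositions needed to sort the indices.

```python
# Minimal Grassmann algebra: terms map a sorted tuple of generator
# indices to a coefficient; products of repeated generators vanish.
class Grassmann:
    def __init__(self, terms):
        self.terms = {k: v for k, v in terms.items() if v != 0}

    def __add__(self, other):
        out = dict(self.terms)
        for k, v in other.terms.items():
            out[k] = out.get(k, 0) + v
        return Grassmann(out)

    def __mul__(self, other):
        out = {}
        for a, ca in self.terms.items():
            for b, cb in other.terms.items():
                if set(a) & set(b):
                    continue                     # theta_i * theta_i = 0
                merged, sign = _sort_with_sign(a + b)
                out[merged] = out.get(merged, 0) + sign * ca * cb
        return Grassmann(out)

    def __eq__(self, other):
        return self.terms == other.terms

def _sort_with_sign(idx):
    """Sort indices by adjacent transpositions, flipping the sign each swap."""
    idx, sign = list(idx), 1
    for i in range(len(idx)):
        for j in range(len(idx) - 1 - i):
            if idx[j] > idx[j + 1]:
                idx[j], idx[j + 1] = idx[j + 1], idx[j]
                sign = -sign
    return tuple(idx), sign

def theta(i):
    return Grassmann({(i,): 1})

t1, t2 = theta(1), theta(2)
assert (t1 * t1).terms == {}                # squares vanish
assert t1 * t2 == Grassmann({(1, 2): 1})
assert t2 * t1 == Grassmann({(1, 2): -1})   # swapping flips the sign
```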

Well, almost end-of-story. Grassmann variables would be boring and almost useless if that was the end-of-story. From this point on, a vast number of interesting, surprising and outright amazing "accidents" happen, all involving Grassmann numbers in some way or another. Most immediately, in your case, they allow you to write exact expressions for Feynman path integrals involving fermions as a determinant. Now, determinants have all sorts of amazing properties of their own, involving operator algebras, the det=exp-trace-log relation, Poincare duality, K-theory, the Atiyah-Singer index theorem which relates fermions to topological twists of boson fields -- for example, the proton, at low energies, is accurately modeled by the "Skyrmion", which is a soliton in the pion field -- the Skyrmion gets the axial vector current of a proton just about right, it gets the proton radius just about right, etc. Yet we know protons are made of quarks, and so we have to somehow "rotate" those quark fields "into pions", and accomplish that rotation in a consistent fashion. This is just scratching the surface, though. There are too many neat directions one can go in: rotations lead to spin, lead to Lie groups, lead to covering groups; the covering groups of rotations form a Postnikov tower, consisting of the rotation group, the spin group, the string group, the 5-brane group ... now, this is just/only "pure math", no physics involved -- just fiddling with Eilenberg-MacLane spaces. And yet the pure math leads directly and unavoidably to vocabulary like "string" and "brane" which are heavily used in theoretical physics. How can that possibly be? Hmmm.

The point is that Grassmann numbers need to be viewed as a handy-dandy calculational device that neatly captures and balls up the Pauli exclusion principle. The Grassmann numbers are just plain, ordinary elements of the exterior algebra, which is central in mathematics. You will see the exterior algebra over and over again, in both math and physics, so you had better get very comfortable with it. More generally, get comfortable with what an "algebra over a field" is -- it's just a vector space where you can multiply vectors. But algebras are pervasive, and Grassmann is just one example. There are many others important to physics: Clifford algebras, Jordan algebras, Lie algebras, etc. If you say to yourself, "it's just an algebra with weirdo multiplication", you can get through an awful lot of material.

Caution: The algebra generated by the ladder operators is not the Grassmann algebra, it is a Clifford algebra, and not just any Clifford algebra, but a Clifford algebra over the complex numbers, and specifically, a certain subspace of that, the spin algebra. The correct construction of this can be subtle and confusing, so here it is, copied from page 69 of Jürgen Jost's book "Riemannian Geometry and Geometric Analysis": Let \(e_i\) be the ordinary real basis vectors of a real vector space \(V\). Construct a tensor algebra, and from that, by taking the appropriate quotient, a Clifford algebra, so that \(e^2_i=-1\) and \(e_ie_j+e_je_i=0\) for \(i\neq j\). So far, this is just a real algebra (not complex). These wacky-looking algebraic relations may look strange, but they are perfectly well-defined and arise purely as the result of taking a quotient on the tensor algebra. The tensor algebra seems like a beguilingly simple object; it's not, it has subtlety. Likewise, so does quotienting. Anyway, the \(e_i\) really are meant to be the basis vectors of a real vector space, here. Next, suppose that \(V\) is even-dimensional and consider the vector space \(V\otimes\mathbb{C}\), and specifically a subspace \(W\) that is spanned by the vectors \(\eta_j =(e_{2j-1}-ie_{2j})/\sqrt{2}\). Taking the plain-old ordinary scalar product of vectors \(\langle,\rangle\), you can directly verify that \(\langle \eta_i,\eta_j\rangle=0\) and that \(\langle\eta_i,\overline{\eta_j}\rangle=\delta_{ij}\), where the overline denotes complex conjugation. This is exactly the algebra of the ladder operators that you wanted, and it was explicitly constructed as the subspace of a certain Clifford algebra. Technically, one writes \(V\otimes\mathbb{C}=W\oplus \overline{W}\), and the exterior algebra \(\Lambda W\) of \(W\) is the spinor space -- that is, it is \(\Lambda W\) that is the Grassmann algebra that you are working with, in physics.
That is, the wave function of a single fermion (a Weyl spinor) lives in the vector space \(W\), and that of \(n\) of them in \(\Lambda^n W\). The ladder operators are then certain endomorphisms of \(\Lambda W \) and \(\Lambda \overline W\) and, as any endomorphism would, inherit some of the algebraic properties of the space they act on. Dirac fermions are direct sums of one Weyl spinor and one anti-spinor (four numbers are used to represent a Dirac spinor, not two, and zitterbewegung means that two of them behave as an anti-particle). All this can be quite subtle, confusing, and subject to a whole mess of numbing details -- the Spin group, the Pin group, spin structures and spin manifolds, as is readily explored in the Jost book (Wikipedia does not quite have enough detail here). If you have the time, plowing through these is a worthwhile exercise: it will turn you into the kind of student that the professors hunt down and respect.
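The scalar-product relations in the quoted construction are easy to check numerically (a sketch of mine, with the dimension \(V = \mathbb{R}^4\) chosen for illustration): extend the ordinary dot product bilinearly (no conjugation) to \(V\otimes\mathbb{C}\) and verify \(\langle\eta_i,\eta_j\rangle=0\), \(\langle\eta_i,\overline{\eta_j}\rangle=\delta_{ij}\).

```python
import numpy as np

# V = R^4 with orthonormal basis e_1..e_4; eta_j = (e_{2j-1} - i e_{2j})/sqrt(2)
e = np.eye(4)
eta = [(e[2 * j] - 1j * e[2 * j + 1]) / np.sqrt(2) for j in range(2)]

dot = lambda u, v: u @ v  # bilinear extension of the scalar product, NOT Hermitian

for i in range(2):
    for j in range(2):
        assert abs(dot(eta[i], eta[j])) < 1e-12                     # <eta_i, eta_j> = 0
        assert abs(dot(eta[i], np.conj(eta[j])) - (i == j)) < 1e-12 # <eta_i, conj(eta_j)> = delta_ij
```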

answered Sep 27, 2016 by linas (85 points) [ revision history ]
edited Sep 29, 2016 by linas

Ha, I like the notion of Grassmann numbers as differential forms on some manifold. Nice interesting answer :-)

As I said, things get suggestively strange. There's this idea of "spin connections" that, very roughly, are both covariant derivatives and fermion-like things at the same time.  All these things interconnect in so many surprising ways, yet I know I cannot quite make out the "big picture"

There's even more weird stuff with spin: read, for example, about the "primon gas" here: Möbius function, and ask yourself "what does number theory have to do with fermions?" It's really quite... something.

While the answer says something interesting about Grassmann numbers and what they are, it doesn't touch the classical limit, and how they connect to physical experience.

Moreover, your initial motivation is a bit misleading, as the Fermionic creation and annihilation operators don't anticommute but give a Clifford algebra, not a Grassmann algebra (which is the classical limit).

Please improve the presentation.

The "classical limit" of anything quantum remains dicey and subtle and subject to much argumentation (and also to quite significant theoretical investigation). The poster  did ask about "connecting to experience" but I took that to mean "to experience in performing algebraic manipulations" rather than "to experimental physical experience". You addressed some aspects in your post.

Re: confusing anti-commutation of wave functions and ladder operators: wow, yes, good point. I recall being endlessly confused and wrong-footed about this, due to sloppiness in my thinking coupled to the light touch taken in textbooks and class. It's actually kind of hard to untangle these, what all with c-numbers and a-numbers and complex conjugation, etc. There's cognitive overload.

@linas Thanks for a neat and comprehensive answer. When you mentioned \(V \otimes \mathbb{C} = W \oplus \overline{W}\), you mean to break up the complex space into 2 real subspaces, right? Do elements of the \(\Lambda W\) space somehow form a representation of the Clifford algebra? I've never seen this particular description of Grassmannian numbers before; it's quite interesting. I will definitely look into the book suggested.

Hi @Prathyush -- I gave an excessively abbreviated account. Each element of W was formed from a pair of vectors in V with coefficients \((1,-i)/\sqrt 2\) -- which means that you should think of such a pair as a single Weyl spinor. This is the intended interpretation. If V is 2n dimensional, you will get n different Weyl spinors inside of W. So, n=1 is a good example: you get one spinor in W -- let's call it \(\psi\), and one conjugate spinor in W-bar -- let's call it \(\overline\psi\). So far, these are just a pair of complex numbers in a two-dimensional vector space. Nothing magic, yet. Now, form the tensor algebra, form the quotient, and verify that \(\psi^2=0\) -- here's how.

So, by definition: \(\psi=(e_1+ie_2)/\sqrt 2\) and \(\overline\psi=(e_1-ie_2)/\sqrt 2\). Next, \(\psi^2\) is actually just short-hand for \(\psi\otimes\psi\), which is an element of the tensor algebra TW of W. So, expand it out: \(\psi^2=\psi\otimes\psi = e_1\otimes e_1 + ie_1\otimes e_2+ie_2\otimes e_1 - e_2\otimes e_2\), where I dropped the factor of a half cause I'm lazy. Now, by construction of the Clifford algebra, as a quotient of the tensor space, we get that \(e_1\otimes e_1 = e_2\otimes e_2 = -1\) and that \(e_1\otimes e_2 = -e_2\otimes e_1\) -- these two identities were forced to hold by taking the quotient. In other words, the equal sign here means "it's in the same coset" or "same equivalence class"; see the Wikipedia article Quotient space (topology). So you can now see that \(\psi^2\) is in the same coset, or the same equivalence class, as zero. In formulas, \(\psi^2=0\). Likewise you can (now, easily) compute that the anticommutator \(\overline\psi\psi+\psi\overline\psi\) is a scalar (here \(-2\), with the convention \(e_i^2=-1\)) -- bingo, we now have pairs of numbers that, by construction, behave like Grassmann variables, they anti-commute -- and they also behave like spinors. I did not demonstrate that they also behave correctly under rotations, but they do: the defining equation for the Clifford algebra is just a sphere (a circle, in two dimensions) and since spheres are rotationally symmetric ... In the above example, which is 2D, you can rotate by any phase, and get the same results.
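The coset computation can be checked in a concrete matrix representation (the representation is my choice, not the commenter's): \(e_1 = i\sigma_x\), \(e_2 = i\sigma_y\) satisfy \(e_i^2=-1\) and \(e_1e_2+e_2e_1=0\), and in this realization \(\psi=(e_1+ie_2)/\sqrt 2\) indeed squares to zero, while the anticommutator of \(\psi\) and \(\overline\psi\) is a scalar multiple of the identity.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
e1, e2 = 1j * sx, 1j * sy          # Clifford generators with e_i^2 = -1

I2 = np.eye(2)
assert np.allclose(e1 @ e1, -I2) and np.allclose(e2 @ e2, -I2)
assert np.allclose(e1 @ e2 + e2 @ e1, 0)

psi    = (e1 + 1j * e2) / np.sqrt(2)
psibar = (e1 - 1j * e2) / np.sqrt(2)
assert np.allclose(psi @ psi, 0)       # psi^2 = 0, as in the comment
assert np.allclose(psibar @ psibar, 0)

# the anticommutator {psibar, psi} is a scalar multiple of the identity
anti = psibar @ psi + psi @ psibar
assert np.allclose(anti, anti[0, 0] * I2)
```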

Hmm. I should be careful when I say "rotations" -- "rotation of the internal phase/internal angle" would be more accurate. The above constructs an anti-commuting Weyl spinor in zero-dimensional space-time -- at a single point. To get four-dimensional space-time, you have to repeat the above construction once for every point in space-time. Basically, you've built a vector space (of anti-commuting spinors) over space-time. To use this vector space for anything practical, you will probably want to limit yourself to smooth, differentiable vector (spinor) fields. You'll want to impose the Poincare algebra as well, so that the spinors transform "nicely" as you move from one point in space-time to another. You also want to define space-time rotations, and make sure that you've attached the spinor fields correctly, so that they transform correctly when you rotate spacetime: you want the internal rotations (spin) to match the space-time rotations (angular momentum). All of these are just consistency conditions you force onto the construction. You also probably want to construct two such spaces and take their direct sum, to get a Dirac spinor (which, by the way, by construction, now anti-commutes).

Anyway -- it's exactly the sphere symmetry of the Clifford algebra -- the preservation of lengths and angles under "internal" rotations -- that allows it to be used to construct representations of rotations, and of spinors. As long as you glue them onto manifolds correctly, they'll behave as you wish them to, and they'll be Grassmannian as well. It's not really any more or less natural than anything else in physics -- arguably, it's more natural than "plain" spinors, because the Pauli exclusion principle is now baked in, instead of added as a frosting on top.

+ 3 like - 0 dislike

Grassmann variables  are needed in path integral formulations since one wants to treat Fermions and Bosons on the same footing.

They are needed for the discussion of supersymmetry, since the latter is based on this treatment on the same footing. They are also useful as technical supersymmetric tools in the treatment of non-supersymmetric quantum systems, e.g., Duistermaat-Heckman theory.

They are unrelated to experience, since observables must be even operators.

Note that there are two ways of performing a classical limit:

(i) If one takes the limit $\hbar\to 0$ in the formal, unrenormalized quantum field equations (which don't make sense, except for the free field) one gets a Grassmann variable description, as @XIaoyiJing had remarked. This limit is useful since it is the basis for the Lagrangian approach to Fermions in quantum field theory. In this sense, classical Fermions are described by Grassmann variables. 

However, the Grassmann variables lack an observable interpretation. The reason is that nonzero odd variables square to zero, hence are never self-adjoint. But in order to talk about the probability density for observing an operator, one needs its spectral resolution, which doesn't exist for odd variables. Therefore only even variables can be observable in any meaningful sense. The Grassmann variables are therefore just a convenient tool to represent the semiclassical observables of interest, primarily the expectation and correlation functions of currents.

(ii) If one takes the limit $\hbar\to 0$ on the level of observables, no Grassmann variables ever appear. This is the true classical description. In this observable-based view, there are no classical Fermions with a local description of spinor fields defined on space-time (or space, in the nonrelativistic case at fixed time). However, there is still a classical notion of the observables corresponding to Fermions, given by a microlocal, kinetic description of Wigner matrix fields defined on an 8-dimensional (or 6-dimensional, in the nonrelativistic case at fixed time) phase space. Thus the statement ''There are no classical fermions'' is wrong without the qualification made above.

The classical limit can be nicely studied in deformation quantization, which is the converse procedure: one deforms the commutative product such that the commutator becomes $i\hbar$ times the Poisson bracket plus higher order terms. Thus to find out what the classical limit of a field theory is in observable terms, one can retrieve the correct phase space and Poisson bracket by looking at the commutator of two observables divided by $i\hbar$ and then taking the limit $\hbar\to0$. This produces a Poisson bracket on the observable algebra. In a general field theory, the observable algebra consists of all linear combinations of products of a Bosonic operator or a constant with an even number of Fermionic operators, and the classical limit indeed produces on this algebra a Poisson bracket with the correct classical rules, the Leibniz and Jacobi identities. In particular, the classical limit of the product $\bar\psi(x)\psi(y)$ is not zero but a bilocal field, which after a Wigner transform becomes a $4\times 4$ Wigner matrix field on an 8-dimensional phase space. Suitable references for the latter are (not for the operator version but for the 1-particle version resp. operator expectations):

Gérard, Patrick, et al. "Homogenization limits and Wigner transforms." Communications on Pure and Applied Mathematics 50.4 (1997): 323-379.

Calzetta, Esteban, and Bei-Lok Hu. "Nonequilibrium quantum fields: Closed-time-path effective action, Wigner function, and Boltzmann equation." Physical Review D 37.10 (1988): 2878.
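The commutator-divided-by-$i\hbar$ recipe described above can be illustrated symbolically on an ordinary one-dimensional phase space (a toy sketch of mine, not from the answer), using the Moyal star product truncated at low order in $\hbar$: the star commutator divided by $i\hbar$ reduces to the Poisson bracket as $\hbar\to 0$.

```python
import sympy as sp

q, p, hbar = sp.symbols('q p hbar', real=True)

def star(f, g, order=2):
    """Moyal star product f * g, truncated at the given order in hbar."""
    result = 0
    for n in range(order + 1):
        term = 0
        for k in range(n + 1):
            # expansion of (d_q^L d_p^R - d_p^L d_q^R)^n acting on (f, g)
            term += (sp.binomial(n, k) * (-1) ** k
                     * sp.diff(f, q, n - k, p, k)
                     * sp.diff(g, q, k, p, n - k))
        result += (sp.I * hbar / 2) ** n / sp.factorial(n) * term
    return sp.expand(result)

f, g = q ** 2 * p, q * p ** 2
moyal_bracket = sp.expand((star(f, g) - star(g, f)) / (sp.I * hbar))
poisson = sp.expand(sp.diff(f, q) * sp.diff(g, p) - sp.diff(f, p) * sp.diff(g, q))

# commutator / (i*hbar) -> Poisson bracket in the limit hbar -> 0
assert sp.simplify(sp.limit(moyal_bracket, hbar, 0) - poisson) == 0
```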

To summarize: The Grassmann variable description serves to cut down the dimension of the field arguments by a factor of two, at the expense of introducing unobservable anticommuting fields that have no intuitive interpretation. Thus Grassmann variables are often very useful but never inevitable.

Note that even on the level of coherent states, one can use in place of Fermionic coherent states with Grassmann variable eigenvalues also ordinary group coherent states based on the orthogonal groups. How to do this is discussed in some detail in the survey article by Zhang, Feng, and Gilmore (Reviews of Modern Physics 62 (1990), 867).

Note also that in the description of a single Fermionic particle by a Dirac equation, no Grassmann variables appear. They arise only in the second-quantized description of identical Fermions. Thus Grassmann variables are a specific field-theoretic phenomenon.

answered Sep 24, 2016 by Arnold Neumaier (15,787 points) [ revision history ]
edited Sep 27, 2016 by Arnold Neumaier

+1 to Arnold Neumaier.

I remember the QED Lagrangian symmetrized for electrons and positrons: before second quantization it vanishes if $\psi$ is a simple number (or field), but if $\psi$ is a Grassmann variable, then the Lagrangian does not vanish and we can proceed to quantize the Grassmann "fields" to get operators in the end.

Careful about saying what it is that is "not needed" -- Grassmann variables can also be understood as a purely algebraic accounting trick, simplifying assorted formulas and calculations. For this, they do not need to be measurable or physical -- they merely need to be handy during pencil-work. The contentious issue is whether supersymmetry exists at "low energies" (accelerator energies). But one can use Grassmann variables just fine, without having to assume supersymmetry at all. They're just a handy-dandy device for working with the exterior algebra in certain diff-eq-type settings.

@ArnoldNeumaier What does this statement mean? "There are no classical fermions" I've heard this statement many times before. Please look at the expanded comment to the question.

@ArnoldNeumaier Thanks for the reply. I'll look into this carefully. One more question: what are you referring to when you mentioned the "microlocal, kinetic description of Wigner matrix fields defined on an 8-dimensional phase space"? I have never heard of this.

@Prathyush: 

Gérard, Patrick, et al. "Homogenization limits and Wigner transforms." Communications on Pure and Applied Mathematics 50.4 (1997): 323-379.

Calzetta, Esteban, and Bei-Lok Hu. "Nonequilibrium quantum fields: Closed-time-path effective action, Wigner function, and Boltzmann equation." Physical Review D 37.10 (1988): 2878.

@Prathyush I found the papers that I read. Hopefully they will be useful for your studies. For the classical limits of fermions, take the spin operators $\hbar\sigma_i/2$ built from the Pauli matrices as an example: as $\hbar\rightarrow 0$, the three matrices go to zero identically. As a result, there are no classical observables corresponding to spin operators.

1. "Spin Geometry" by José Figueroa-O’Farrill  

2. "Classical Spin and Grassmann Algebra" and "Particle Spin Dynamics as the Grassmann Variant of Classical Mechanics" by F. A. Berezin and M. S. Marinov

+ 3 like - 0 dislike

Slightly off-topic, but before one talks of coherent states, fermionic or otherwise, there is a very fun paper to consider: "What is Spin?" by Hans Ohanian (1985), reviewing work of Belinfante (1939): https://www.physics.mcmaster.ca/phys3mm3/notes/whatisspin.pdf -- in which he computes the Poynting vector with appropriate care, divides by the field intensity, and gets \(\hbar/2\). It's easy to understand (if you've worked with the Dirac eqn and Poynting vectors before), direct, feet-on-the-ground, and reaches a remarkable result: spin is a classical phenomenon, even for the Dirac eqn. Definitely worth a class lecture, if you teach Optics or E&M or intermediate QM.

An update here, for massive spin-1 and massless spin-2 and a remarkable 2014 update for evanescent waves.

What is Ohanian really saying?

What he is saying is that spin is angular momentum. This feels like a tautology, so it's worth clarifying the difference. Spin is described by representations of the Lorentz group. That is, a “classical” field can be understood as a (square-integrable) function from Minkowski space to a given, fixed representation of the Lorentz group. The representation defines the “spin” of the field. One might say that a classical field is a section of an associated fiber bundle over the base space of Minkowski space, with the principal bundle having fibers that are the Lorentz group. Thus, the associated bundle has fibers that transform under the Lorentz group, i.e. transform as representations of the Lorentz group. The label on the representation is the spin.

However, the definition of angular momentum requires the Poincare group, because momentum is defined as the generator of translational symmetry, and angular momentum is defined as the generator of rotational symmetry of the base space. That is, the Poincare group acts on both the fiber and on the base space. To have things actually be consistent, one wants to be able to identify angular momentum with spin, so that eigenstates of the angular momentum operator (which is defined only in the Poincare algebra) correspond to eigenstates of the spin operator in the Lorentz algebra. 

There are two ways to demonstrate this. One way is to consider a single point in space-time, fix it, and then consider the tangent space to that point: as a tangent space, it contains derivatives of the field (derivatives in the space-time directions), and Lie derivatives in the “vertical” space, along the fiber, i.e. valued in the Lorentz algebra. For a fixed spin representation, this is exactly the same thing as having the space-time derivatives be labelled by the Lorentz-algebra spin labels -- this is why fields are always algebra-valued. At this point, one notes that the momentum and angular momentum operators of the Poincare algebra act on that tangent space. Turning the crank, one discovers that spin can be interpreted as angular momentum. This is how most textbooks demonstrate the equality.

Another way to do this is to take a specific section of the fiber bundle, a specific field configuration, and ask what happens there. That is, instead of restricting to the algebra at a fixed point in space-time, one wants to ask about the structure of the function space, the structure of the Banach space of all functions defined on the base space. To answer this, one needs to do what Ohanian does. So, instead of working in the tangent space of a single point in space-time, he chooses to work in the space of all (square-integrable) functions. Yes, one expects to get the same answer, either way, but it is not entirely a foregone conclusion that this would be the case: now that the distinction being made above is explicit, one could certainly create insane configurations where the spin is not equal to the angular momentum.

answered Sep 27, 2016 by linas (85 points) [ revision history ]
edited Sep 30, 2016 by linas
Most voted comments

Vladimir, instead of thinking about the Dirac eqn, think about the Maxwell equations. These describe vector quantities: the electric and magnetic fields. There is no possibility of them accidentally describing scalars or spinors or some other indeterminate field: they very explicitly describe vectors. As it happens, they are vectors transforming under the adjoint representation of the Lorentz group, and more generally the Poincare group. The history of discovery also does not matter; so, we now have come to a more-or-less full understanding of the representations of the Lorentz group: these include complex-conjugate spinor representations as well as vector representations, as well as many others. It is perfectly fine to ask what happens when one has a field that takes values in a representation, and what the equations of motion for that field might be, if you took them to be invariant under the Poincare group.  The simplest case is to take the simplest invariant, which is the Casimir operator of order two, which corresponds to the Laplacian: this is why equations of motion are usually squares of derivatives.  Of  course, you can take the "square root"; for a vector, this would be the Maxwell equations; for a spinor, this would be the Weyl equations. The Dirac spinor is a direct sum of two Weyl spinors, \(2\oplus \overline{2}\), and is described by the Dirac equation.  All this is very fine, but nowhere in the above, was there any mention of angular momentum. So where does angular momentum come from? Ohanian answers that question!

One can also relate spinors and vectors to angular momentum the way that most textbooks do: that is, by hand-waving. It is an OK way of doing things, but it requires the student to use their imagination when thinking about the "spin" of a "particle".  In fact, the majority of students, and many professors get this imaginative aspect dead-wrong, and this is evident in how they talk about the "spin" of "particles". We can minimize these errors of understanding by being more rigorous about what we mean by the "angular momentum of a field". That rigor is exactly what Ohanian is providing. But if, instead you cling to the (misleading) hand-waving  arguments about spin, you will suffer two errors: mis-understanding what it actually is, and then thinking that what Ohanian does is somehow tautological. Again: he provides the full, formal, correct proof of what the "angular momentum of a field" is, and any other argument is just hand-waving.

Re-reading your earlier comment: I now see what you were thinking of when you wrote "you cannot decide how many components the classical field has (is it a scalar field, spinor field, vector field, etc.?)". This is exactly the property of a Casimir operator: it is the thing that distinguishes the number of components. The fact that one uses the quadratic Casimir operator to build the Laplacian is exactly why all reps can solve the Klein-Gordon equation, which "doesn't care" how many components there are. But a vector is not the same thing as three scalars; they transform differently. Yukawa bosons, spinors and Maxwell vectors are all classical fields; they have spin 0, 1/2 and 1 respectively.

To see that spin is the same thing as angular momentum, you have to ask how the fields transform under the Poincare group. You can do this abstractly, as algebras, as is done in textbooks; but then you get a proof for algebras, and not for actual field configurations, i.e. not for actual wave functions. It is not surprising that the results agree: we want the Hilbert space of square-integrable functions to possess the same algebraic properties as the space on which the functions live, i.e. we want the fibers of the fiber bundle to have the Lorentz-group symmetry. But in addition to the symmetry group of the fibers, we also want the Poincare group, acting on the base space, because it is the Poincare group, not the Lorentz group, that defines what angular momentum is.

To recap these thoughts: the Lorentz group defines spin, the Poincare group defines angular momentum. What Ohanian demonstrates is that the definition of spin is consistent with the definition of angular momentum. Which is good news, but is not necessarily a tautology. Huh. Thanks for forcing me to explain this; I learned something.
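To make the "Casimir distinguishes the number of components" point concrete: for the spin-\(j\) representation of su(2) (the rotation subalgebra of the Lorentz algebra), the quadratic Casimir \(J^2 = J_x^2+J_y^2+J_z^2\) equals \(j(j+1)\) times the identity on the \((2j+1)\)-component space. Here is a numpy sketch (my own illustration; the helper `spin_matrices` is made up for this example):

```python
import numpy as np

def spin_matrices(j):
    """Spin-j representation of su(2), built from ladder operators (dimension 2j+1)."""
    dim = int(round(2 * j)) + 1
    m = np.array([j - k for k in range(dim)])   # Jz eigenvalues j, j-1, ..., -j
    Jz = np.diag(m).astype(complex)
    # Matrix element <m+1| J+ |m> = sqrt(j(j+1) - m(m+1))
    Jp = np.zeros((dim, dim), dtype=complex)
    for k in range(1, dim):
        Jp[k - 1, k] = np.sqrt(j * (j + 1) - m[k] * (m[k] + 1))
    Jm = Jp.conj().T
    Jx = (Jp + Jm) / 2
    Jy = (Jp - Jm) / (2 * 1j)
    return Jx, Jy, Jz

# The quadratic Casimir is j(j+1) * I: it labels the rep (scalar, spinor, vector, ...)
for j in (0, 0.5, 1):
    Jx, Jy, Jz = spin_matrices(j)
    casimir = Jx @ Jx + Jy @ Jy + Jz @ Jz
    dim = int(round(2 * j)) + 1
    assert np.allclose(casimir, j * (j + 1) * np.eye(dim))
    print(f"j={j}: {dim} components, J^2 = {j * (j + 1)} * I")
```

The Casimir value \(j(j+1)\) is the same number that appears in the mass-shell/Klein-Gordon construction, which is why every rep, regardless of its number of components, can satisfy the Klein-Gordon equation.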

Vladimir: you cannot call something "momentum" until you define what momentum is. To define momentum, you need spacetime derivatives: tangent vectors that lie in the spacetime directions. You need a (pseudo-)Riemannian manifold, such as Minkowski space, to define position, momentum and angular momentum.

The group SL(2,C) is a Lie group. It is defined completely independently of Minkowski space: it is simply the group of invertible 2x2 complex matrices of unit determinant. That's all.  It just floats out there in platonic-math-land.  It has no spatial directions, it has no time directions. Yet it is the group SL(2,C) that defines what "spin" is.

Somehow, you have to connect the notion of spin, as defined by SL(2,C), to the notion of angular momentum, which can only be defined for spacetime (or for (pseudo-)Riemannian manifolds in general).  Angular momentum doesn't "exist" when there is no spacetime for it to exist inside of.  On the other hand, "spin", (and representations of Lie algebras in general) is perfectly well-defined without having any notion of spacetime whatsoever.  
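The standard bridge is the double cover \(SL(2,\mathbb{C})\to SO^+(3,1)\): write a 4-vector as the Hermitian matrix \(X = x^\mu\sigma_\mu\), whose determinant is the Minkowski interval; then \(X \mapsto AXA^\dagger\) for \(A\in SL(2,\mathbb{C})\) preserves that interval, so the abstract group does act on spacetime after all. A quick numerical check (my own sketch, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Pauli matrices sigma^0..sigma^3
sig = [np.eye(2, dtype=complex),
       np.array([[0, 1], [1, 0]], dtype=complex),
       np.array([[0, -1j], [1j, 0]]),
       np.array([[1, 0], [0, -1]], dtype=complex)]

def minkowski_matrix(x):
    """Map a 4-vector x = (t, x, y, z) to the Hermitian matrix X = x^mu sigma_mu."""
    return sum(x[mu] * sig[mu] for mu in range(4))

# A random element of SL(2,C): any invertible 2x2 complex matrix, rescaled to det = 1
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
A = M / np.sqrt(np.linalg.det(M))
assert np.isclose(np.linalg.det(A), 1)

x = rng.normal(size=4)
X = minkowski_matrix(x)
Xp = A @ X @ A.conj().T   # the SL(2,C) action on Minkowski space

# det X = t^2 - x^2 - y^2 - z^2 is preserved, so A acts as a Lorentz transformation
interval = lambda Y: np.real(np.linalg.det(Y))
assert np.isclose(interval(X), interval(Xp))
print("Minkowski interval preserved:", interval(X))
```

Since \(A\) and \(-A\) give the same map \(X\mapsto AXA^\dagger\), the cover is 2-to-1, which is exactly why half-integer spin representations exist for the Lorentz group but not for \(SO^+(3,1)\) itself.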

I do not think that Grassmann variables can always be treated as auxiliary variables. And mathematics is the language of nature.

@Vladimir Kalitvianski No one wants to argue here. I do not know much about mathematics. I was simply quoting a sentence from a professor working at Harvard.

@XlaoyiJing: Then forgive me for accusing you of downvoting. I thought that your comments objecting to mine went along with the downvote; sorry.


@XlaoyiJing: Too bad then. Too bad. And you hurried to downvote. Apparently you are impressed with a Harvard mathematical axiom: "mathematics is the language of nature".

@Vladimir Kalitvianski Vladimir, I didn't do that. I voted up one of your replies.  

user contributions licensed under cc by-sa 3.0 with attribution required
