  Why is the exterior algebra so ubiquitous?

+ 12 like - 0 dislike
8699 views

The exterior algebra of a vector space V seems to appear all over the place, such as in

  • the definition of the cross product and determinant,
  • the description of the Grassmannian as a variety,
  • the description of irreducible representations of GL(V),
  • the definition of differential forms in differential geometry,
  • the description of fermions in supersymmetry.

What unifying principle lies behind these appearances of the exterior algebra? (I should mention that what I'm really interested in here is the geometric meaning of the Gessel-Viennot lemma and, by association, of the principle of inclusion-exclusion.)

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Qiaochu Yuan
asked Oct 21, 2009 in Theoretical Physics by Qiaochu Yuan (385 points) [ no revision ]
retagged Dec 7, 2014

12 Answers

+ 10 like - 0 dislike

Here is an answer which would make sense to an elementary school student—if they understood what you were asking. It's Grassmann's original argument for considering anticommutativity. I don't have a reference handy, but I'm pretty sure it shows up in the intro material to one of the Ausdehnungslehre, or perhaps a summary essay.

Grassmann's goal was to find a way to "arithmetize" geometry. So, let's do that very naively. Suppose you have a line segment AB and another collinear line segment BC:

A---------B----------C

Then, through visual inspection, AB + BC = AC. However, now suppose C lies in the middle instead of B:

A---------C----------B

Writing down the obvious equation from this arrangement we get AC + CB = AB

If we solve the resulting system of two equations, we realize that BC = -CB
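
Explicitly: substituting $AB = AC + CB$ into $AB + BC = AC$ gives $(AC + CB) + BC = AC$, hence $CB + BC = 0$, that is, $BC = -CB$.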

Anticommutativity!


An additional thought for Qiaochu: have you looked at Klain and Rota's book Introduction to Geometric Probability? There are some interesting analogies there between combinatorial structures and geometry that might give you some thoughts. In particular, they link inclusion-exclusion and the Euler characteristic as the unique 0-dimensional invariant valuations in the combinatorial and geometric settings, respectively.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Gilbert Bernstein
answered Aug 26, 2010 by Gilbert Bernstein (100 points) [ no revision ]
??? Are you sure that this is Grassmann's original argument? This seems highly suspect to me: for one thing $AB$ is not a product. Can someone else confirm or, better yet, point to a reference for this?

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user José Figueroa-O'Farrill
José: He's right. For a reference, see section 7 of Fearnley-Sander's Hermann Grassmann and the Creation of Linear Algebra. Whether $AB$ is a product or not depends on what meaning you invest in it. It 'is' a product in the same sense that $v \wedge w$ in the usual picture of exterior algebra 'is' the parallelogram spanned by $v$ and $w$.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Per Vognsen
José, I don't mean that as a formal argument and neither did Grassmann from what I recall. He used the argument as motivation for studying the antisymmetric tensor product rather than just the symmetric or free products. After making the assumption, I believe he proceeded in a more formal fashion.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Gilbert Bernstein
Surely, a simpler argument along these lines (pun not intended) is that the line segment AA is the empty sum of consecutive line segments; thus, self-multiplication is zero.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Sridhar Ramesh
If one considers points $A,B,C$ as elements of affine space (or plane) $E$, then the oriented segment $AB$ identifies naturally with the exterior product $A\wedge B$ (say as the cone on the segment) in the exterior square of the vector space extension of $E$ (formal linear combinations of points with sum of coefficients not necessarily 1, with obvious relations).

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user BS.
+ 9 like - 0 dislike

Just to use a buzzword that Greg didn't, the exterior algebra is the symmetric algebra of a purely odd supervector space. So it isn't "better than a symmetric algebra"; it is a symmetric algebra.

The reason this happens is that super vector spaces aren't just $\mathbb{Z}/2$-graded vector spaces; they also have a slightly different tensor category structure (the flip map on the tensor product of two odd vector spaces is $-1$ times the usual flip map, and it is the usual flip map whenever at least one factor is even). If you look at the formulas from homological algebra for things like how to take the tensor product of two complexes, they always have a bunch of weird signs showing up; these can always be thought of as coming from the fact that you should take the tensor product of graded vector spaces inherited from super vector spaces, not the boring one.
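
To make the sign rule concrete: on homogeneous elements the flip map of super vector spaces is

$\tau(v \otimes w) = (-1)^{|v|\,|w|}\, w \otimes v,$

so the symmetric algebra of a purely odd space imposes $v \cdot w = -\,w \cdot v$ for all $v, w$, and in particular $2\, v \cdot v = 0$. Away from characteristic 2 this is exactly the exterior algebra relation $v \wedge v = 0$ (the characteristic-2 subtlety is taken up in Marc Nieper-Wißkirchen's answer below).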

Of course, this just raises the question of why supervector spaces show up so much. Greg had about as good an answer as I could give for that.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Ben Webster
answered Oct 21, 2009 by Ben Webster (150 points) [ no revision ]
Can you be a little more precise about the connection between supervector spaces and chain complexes?

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Qiaochu Yuan
Chain complexes are always naturally super vector spaces: take the direct sum of the even-degree components as the degree-0 space, and the direct sum of the odd-degree components as the degree-1 space. Then the endomorphisms of the chain complex are naturally also a super vector space, with the boundary map being an odd element. This explains the artificial-seeming procedure I mentioned for making boundaries anticommute: when trying to have $d_S$ act on $R^i \otimes S^j$, the flip map Ben mentions will introduce a minus sign if $i$ is odd.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Greg Muller
More or less this was my answer, so I'll just vote it up. [Actually, when I looked at this page, this answer had 4 votes. When I clicked the up arrow, it changed the number to 0, and then let me click it up to 1. So shrug.]

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Theo Johnson-Freyd
+ 7 like - 0 dislike

For me, the exterior algebra is the free polynomial algebra in anti-commutative variables. Of course, this begs the question, why do anti-commutative variables come up so much?

As a homological algebraist, the reason for this that jumps out at me is that the boundary map d in a complex is an anticommuting operator, which can be seen in the Koszul sign rule for commuting boundary maps across each other. Of course, this doesn't really explain all the instances of anti-commuting variables.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Greg Muller
answered Oct 21, 2009 by Greg Muller (70 points) [ no revision ]
That sounds like as good a unifying principle as any. Can you give some more details?

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Qiaochu Yuan
Sure, here's the thought process. Take two complexes $R^*$ and $S^*$, with the goal of making their tensor product a complex. This can be done by letting $(R \otimes S)^i$ be the direct sum of the $R^j \otimes S^{i-j}$. However, the desired boundary map, $d_R + d_S$, naively squares to $d_R d_S + d_S d_R$, which is $2 d_R d_S$ (because the two differentials commute). Clearly, $d_R + d_S$ would be a boundary map if the differentials instead anticommuted. The way this is usually arranged is by defining the action of $d_S$ on $R^j \otimes S^i$ to be $(-1)^j$ times the action of $d_S$ on $S^i$. This forces $d_S$ and $d_R$ to anticommute.
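
To make the sign bookkeeping explicit: with $r \in R^j$ and $s \in S^i$, the convention above reads

$d(r \otimes s) = d_R r \otimes s + (-1)^{j}\, r \otimes d_S s.$

Applying $d$ again, the $d_R^2$ and $d_S^2$ terms vanish, and the two cross terms appear with signs $(-1)^{j+1}$ and $(-1)^{j}$ (the first because $d_R r$ lives in $R^{j+1}$), so they cancel and $d^2 = 0$.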

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Greg Muller
This also gives a rather odd reason why the differential on a complex should square to zero... it anticommutes with itself, and so it should square to zero.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Greg Muller
Away from the very concrete aspects of algebraic geometry, my algebra's shaky, so I don't know if this follows from what you said, but of course anticommutativity's available wherever Lie algebras are sold.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Harrison Brown
I should add that even when doing group cohomology, where there isn't any obvious exterior algebra, the cup product behaves very much like the wedge product of, say, differential forms (you have the graded anti-commutativity, which depends on the degrees of the two things you're taking the product of).

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user David Corwin
+ 5 like - 0 dislike

I will only answer for the link between determinants, differential forms and the Grassmannian.

The fact is that the determinant, up to a sign, represents the volume of the parallelepiped having the n given vectors as its sides. The sign is determined by the orientation of this solid.

Indeed, the axioms for the determinant can be translated geometrically: for instance, the fact that the determinant vanishes when two columns are equal corresponds to the fact that a solid lying in a hyperplane has volume 0.

Now take a linear map f expressed by a matrix A: the image of the unit cube is the solid generated by the columns of A, so f stretches volumes by a factor of |det(A)|, by the previous remark.

This is the infinitesimal expression of the usual formula for the change of variables in the integral, and it is the reason why the Jacobian determinant appears there. It is just the infinitesimal factor by which you multiply volumes. I hope this gives a rough explanation why the determinant appears in this formula.
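
As a quick numerical illustration of the volume statement, here is a minimal NumPy sketch (the matrix is just an arbitrary example): the scalar triple product of the columns agrees with the determinant.

    import numpy as np

    # Columns of A are the edge vectors of a parallelepiped in R^3.
    A = np.array([[1.0, 0.5, 0.0],
                  [0.0, 2.0, 1.0],
                  [0.0, 0.0, 3.0]])
    a, b, c = A[:, 0], A[:, 1], A[:, 2]

    # Signed volume via the scalar triple product a . (b x c) ...
    signed_volume = np.dot(a, np.cross(b, c))

    # ... agrees with the determinant (both equal 6 here).
    print(signed_volume, np.linalg.det(A))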

Now to differential forms. Assume you want to integrate a quantity on a manifold, say a function. You may want to try to integrate it in local coordinates, but the result will depend on the coordinates chosen. So in order to get something well-defined you need a quantity whose local expression changes by the inverse factor (ok, I'm neglecting orientation here). This is exactly an n-form, whose local expression changes by the determinant of the Jacobian of the inverse change of coordinates.
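
Concretely, in two variables the transformation rule follows from the wedge relations $du \wedge du = dv \wedge dv = 0$ and $dv \wedge du = -\,du \wedge dv$:

$dx \wedge dy = \left(\frac{\partial x}{\partial u}\,du + \frac{\partial x}{\partial v}\,dv\right) \wedge \left(\frac{\partial y}{\partial u}\,du + \frac{\partial y}{\partial v}\,dv\right) = \left(\frac{\partial x}{\partial u}\frac{\partial y}{\partial v} - \frac{\partial x}{\partial v}\frac{\partial y}{\partial u}\right) du \wedge dv,$

which is exactly the Jacobian determinant appearing in the change-of-variables formula.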

This vague discussion should so far give an idea of why differential forms of maximal degree are apt to be integrated on oriented manifolds. Now choose a manifold M. You can integrate k-forms on M over k-dimensional subvarieties of M, so differential forms of any degree appear as dual elements of subvarieties of the corresponding dimension. Pushing this correspondence a bit explains why the complex of differential forms gives the cohomology of M. But this is a topological invariant, so it has plenty of other constructions.

So we get an analytic tool (differential forms) which describes part of the topology of M; something which is of course worth studying. Phew! If you got this far, you can understand what kind of link I see between determinants and differential forms.

As a particular case, this also gives an explanation of the link with the Grassmannian: to a given subspace A you just associate the (constant) differential forms dual to it, up to multiples; this allows you to think of a point of the Grassmannian as a point in a projective space, giving (more or less) the usual Plücker embedding. I mean: dual elements to general subvarieties are nonconstant differential forms, but if you restrict to subspaces you can just use constant differential forms.

I don't have an intuitive explanation of the link with irreducible representations of GL, and I don't know about fermions, so I can't help you there.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Andrea Ferretti
answered Oct 21, 2009 by Andrea Ferretti (50 points) [ no revision ]
I suppose part of the question I didn't explicitly ask is: why is the exterior algebra more natural to look at than the symmetric algebra?

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Qiaochu Yuan
You don't think polynomial algebras are ubiquitous? :)

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Reid Barton
Oops; I guess I meant "for the purposes of differential geometry."

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Qiaochu Yuan
Exactly for the reason I said. Sections of the maximal exterior power of the cotangent bundle have a local expression which changes by a determinant. This, combined with the formula for the change of variables in the integral, allows you to integrate such sections over oriented manifolds, something you cannot do with symmetric tensors. This is why differential forms have an immediate geometric meaning.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Andrea Ferretti
+ 5 like - 0 dislike

One good reason for the ubiquity of the exterior algebra construction is that it has nice basic properties (which if made precise will uniquely define it):

  1. It is a functor from vector spaces to (strictly) supercommutative algebras.
  2. Direct sums are taken to tensor products of algebras.
  3. It plays well with base change and descent, i.e., one can make exterior algebra bundles by gluing.
  4. It takes a line to an odd line plus an even line (used to great effect in Torsten's answer here - note the last paragraph comparing exterior with symmetric cases).

Bonus properties in the finite rank case include:

  1. There is a multiplicative determinant subfunctor to invertible objects (read: graded lines).
  2. It yields a Hopf algebra.
  3. You have a perfect "Hodge dual" pairing valued in the determinant (as pointed out by Marc Nieper-Wißkirchen).

Regarding the ways the exterior algebra is useful when the symmetric algebra is not, I think all of the applications you listed revolve around the finite rank properties - in particular, the distinguished nature of the determinant as a canonical one-dimensional tensor. The only one-dimensional symmetric tensor is the trivial one, which carries no information. Any attempt to make things like volumes, cup products, or Hodge stars requires an orientation, which can be viewed as a determinant (see e.g., my answer here).

As Wikipedia mentions, exterior algebras satisfy a universal property: for any linear map from a vector space $V$ to an associative algebra $A$ landing in the square zero subspace, there is a unique algebra homomorphism from $\bigwedge V$ to $A$ making a certain triangular diagram commute. This yields a description of the exterior algebra functor as a left adjoint to a forgetful functor from strictly supercommutative algebras. If 2 is invertible, then it is equivalent to the parity-shifted symmetric algebra functor, but in general, it represents a genuinely different functor. In particular, the fact that determinants exist in characteristic 2 is an indication that the exterior algebra is more important than the shifted symmetric algebra.
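
Spelled out, the universal property reads: given a linear map $f\colon V \to A$ into an associative algebra with $f(v)^2 = 0$ for all $v \in V$, the unique algebra homomorphism is determined by

$\tilde f(v_1 \wedge \dots \wedge v_k) = f(v_1)\, f(v_2) \cdots f(v_k),$

and the square-zero condition is exactly what makes this well defined modulo the relations $v \wedge v = 0$.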

I should emphasize that supersymmetry is different from the existence of fermions. In short, fermions are just odd fields that transform a certain way under ordinary spacetime symmetries, but supersymmetry is the specification of additional odd symmetries of spacetime. This is substantially more exotic: theories containing fermionic particles (like electrons) can exist without supersymmetry, and in fact existed happily for about 50 years before supersymmetry was hatched.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user S. Carnahan
answered Oct 21, 2009 by S. Carnahan (190 points) [ no revision ]
+ 5 like - 0 dislike

If the exterior algebra were just the symmetric algebra (up to a degree shift), it wouldn't be a useful notion by itself.

But to me it seems it isn't: consider an ordinary vector space V over a field k of characteristic 2. For simplicity, let's assume that V is one-dimensional with generator $x$. As $x \wedge x = 0$, the Graßmann algebra of V is k in weights 0 and 1, and trivial in all other weights.

Now change the parity of the vector space V, i.e. consider x to be of odd degree, and call the resulting odd vector space $W$. What is the symmetric algebra over $W$? Neglecting the parity, it is simply the polynomial algebra k[x] in one variable, which is a copy of k in every weight. The discrepancy comes from the fact that in characteristic 2, the symmetric algebra does not see the difference between even and odd elements.
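
In terms of dimensions, the discrepancy is

$\dim_k \Lambda^m V = \begin{cases} 1 & m = 0, 1 \\ 0 & m \geq 2 \end{cases} \qquad\text{while}\qquad \dim_k \operatorname{Sym}^m W = 1 \ \text{ for all } m \geq 0,$

because the relation $x \otimes x + x \otimes x = 0$ coming from the odd sign rule is vacuous when $2 = 0$, whereas $x \wedge x = 0$ is imposed directly in the exterior algebra.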

As to the ubiquity of the exterior algebra: Given an operator A acting on any space, it is quite natural to ask whether $A \circ A = 0$ (e.g. any would-be differential in homological algebra). Whenever I have a vector space of operators that has this property, the exterior algebra shows up. And such a situation looks quite common.

Furthermore, the exterior algebra enjoys a property the symmetric algebra (of the shifted space) does not have: If V is free of rank n, the natural pairing $\Lambda^p V \otimes \Lambda^{n - p} V \to \Lambda^n V$ is a perfect pairing.
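
For instance, with $n = 3$ and $p = 1$:

$e_1 \otimes (e_2 \wedge e_3) \mapsto e_1 \wedge e_2 \wedge e_3, \qquad e_1 \otimes (e_3 \wedge e_1) \mapsto 0, \qquad e_2 \otimes (e_3 \wedge e_1) \mapsto e_1 \wedge e_2 \wedge e_3,$

so in the bases $\{e_1, e_2, e_3\}$ and $\{e_2 \wedge e_3,\, e_3 \wedge e_1,\, e_1 \wedge e_2\}$ the pairing matrix is the identity, which makes the perfectness visible.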

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Marc Nieper-Wißkirchen
answered Nov 24, 2009 by Marc Nieper-Wißkirchen (50 points) [ no revision ]
I'm reasonably certain that the definition of graded-commutativity can be repaired to give the right answer in characteristic 2.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Qiaochu Yuan
My guess is that it is not the graded-commutativity that has to be modified in char 2, but rather that the definition of the symmetric algebra I am using (as the space of coinvariants under the action of the symmetric group) is simply not the right (or good) thing to look at in char 2. I am going to make this into its own question.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user Marc Nieper-Wißkirchen
+1: This answer deserves more votes (and possibly acceptance). It is both mathematically correct (in contrast to most others) and emphasizes key features like the square zero property and the perfect pairing that others have not captured.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user S. Carnahan
+ 2 like - 0 dislike

In my opinion, the unifying object behind all the cases you mentioned is the Dirac operator:

  1. The Dirac operator acts on the (possibly twisted) exterior algebra of differential forms.

  2. Some cases in representation theory (where GL(V) is a special case) can be formulated using the Kostant-Dirac operator.

  3. Fermions satisfy the Dirac equation.

  4. The (infinite-dimensional) Grassmannian in second quantization can be constructed from the (one-particle) Dirac spectrum (above and below the Dirac sea).

  5. Finally, the determinant can be seen as the Jacobian of a linear transformation of Grassmann variables, which are "classical" (in Berezin's sense) counterparts of fermions.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user David Bar Moshe
answered Oct 22, 2009 by David Bar Moshe (4,355 points) [ no revision ]
+ 2 like - 0 dislike

As Gilbert said, Grassmann's original intention was to arithmetize geometry, and I think the exterior product captures this quite nicely.

The intuition is that for any vector space $V$, the exterior product $\Lambda^k V$ corresponds to the $k$-dimensional subspaces of $V$. In other words, the amazing insight of Grassmann is that subspaces can be captured by an algebraic product, at least to some extent. Namely, consider two sets of vectors $v_1,\dots,v_k$ and $w_1,\dots,w_k$. These sets span the same $k$-dimensional subspace of $V$ if and only if

$v_1 \wedge \dots \wedge v_k = \lambda \cdot w_1 \wedge \dots \wedge w_k \neq 0$

in $\Lambda^k V$. Put differently, a pure wedge product of vectors can be identified with their linear span (if it has full dimension).
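
A tiny worked case: in $k^3$, take $v_1 = e_1$, $v_2 = e_2$ and $w_1 = e_1 + e_2$, $w_2 = e_2$, which span the same plane; indeed

$w_1 \wedge w_2 = (e_1 + e_2) \wedge e_2 = e_1 \wedge e_2 + e_2 \wedge e_2 = e_1 \wedge e_2 = v_1 \wedge v_2 \neq 0,$

so the two wedges agree up to a (here trivial) nonzero scalar, as the criterion requires.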

This view perfectly explains the first four points of the question:

  • The extra factor in the wedge product can be interpreted as a measure of volume.
  • Reformulation of the intuition above.
  • Elements of $GL(V)$ also permute the higher-dimensional subspaces of $V$.
  • Differential forms = infinitesimal subspaces + volumes.

The fermions don't fit in; their only relation to the above seems to be their antisymmetry.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Greg Graviton
answered Jun 18, 2011 by Greg Graviton (775 points) [ no revision ]
I am confused. What if $k=2$, $v_2=v_1$ and $w_1=w_2$?

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Yemon Choi
@Yemon: Ah. They must not be degenerate.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Greg Graviton
+ 2 like - 0 dislike

I think that what unifies some of the different examples where the exterior algebra occurs is that it is the structure that transforms the action of a commutative ring on a module (or, more concretely, the action of several commuting linear operators on a vector space) into a chain complex. The ubiquitous structure is really commutative rings and modules over commutative rings. The exterior algebra actually encodes commutativity, in a sense. (One might then ask why mathematicians love commutative rings, commuting operators, etc. so much.)

When tensoring a commutative action with the exterior algebra one gets the Koszul complex, and the anti-commutative nature of the exterior algebra is precisely what, when coupled with the commutative action of the ring, makes possible the definition of a Dirac operator $d$ with $d^2=0$.
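
A minimal instance of this (notation chosen here for illustration): for two commuting operators $a_1, a_2$ on a vector space $M$, the Koszul complex is $M \otimes \Lambda^\bullet k^2$ with

$d(m \otimes e_1 \wedge e_2) = a_1 m \otimes e_2 - a_2 m \otimes e_1, \qquad d(m \otimes e_i) = a_i m \otimes 1, \qquad d(m \otimes 1) = 0,$

and then $d^2(m \otimes e_1 \wedge e_2) = (a_2 a_1 - a_1 a_2)\, m \otimes 1 = 0$ precisely because the operators commute: the sign supplied by the exterior algebra is what converts commutativity into $d^2 = 0$.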

I got these ideas after thinking about Joseph Taylor's paper, "A joint spectrum for several commuting operators", J. Funct. Anal. , 1970.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Orr Shalit
answered Jun 18, 2011 by Orr Shalit (20 points) [ no revision ]
Your sentence "The exterior algebra actually encodes commutativity, in a sense" can be made precise: the exterior algebra encodes commutativity in the sense that the minimal resolution of polynomial rings "is" the exterior algebra. This is nowadays framed in the context of Koszul duality.

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Mariano Suárez-Alvarez
I have to admit I don't understand chain complexes on a completely intuitive level either. There is a circle of ideas here that is tightly linked and it seems that if I understood one idea here I would understand them all, but...

This post imported from StackExchange MathOverflow at 2014-12-07 12:38 (UTC), posted by SE-user Qiaochu Yuan
+ 1 like - 0 dislike

Somehow, I can't resist (re)formulating an answer in the following minimalistic and tautological way: the exterior algebra appears each time you consider the tensor algebra generated by a vector space (e.g., you want to define a notion of volume for n-parallelepipeds spanned by n-tuples of vectors) and you want to quotient with respect to the ideal generated by products of a vector with itself (you want flat parallelepipeds to have zero volume).

It's an "I won't repeat myself" statement which, in its peremptory simplicity, is likely to appear in the early evolutionary stages of many mathematical ideas.

This post imported from StackExchange MathOverflow at 2014-12-07 12:37 (UTC), posted by SE-user pasquale zito
answered Oct 22, 2009 by pasquale zito (10 points) [ no revision ]

user contributions licensed under cc by-sa 3.0 with attribution required
