  Is the path integral in 4 dimensions still only at the level of useful heuristic?


The question

Is the path integral in 4 dimensions still only at the level of useful heuristic?

came up as a byproduct of a somewhat unrelated discussion; the comments were moved here.

asked May 12, 2015 in Chat by SchrodingersCatVoter
edited May 12, 2015 by Arnold Neumaier

@ArnoldNeumaier: It's not that I don't like the way algebraic QFT "transcends" this limitation of a single vacuum, it's just that AQFT is not transcending anything, as I see it. It is simply defining properties of quantum fields formally, without any construction. 

The result is a formal and involved way to avoid speaking about path integration. To demonstrate--- you don't even know the vacuum manifold. There are situations where the space of vacua is not related by symmetry at all, either in fine-tuned ordinary field theory, or most naturally in SUSY theories. In this case, you can't guess the space of vacua from the broken symmetries; you need to know the moduli space from the exactly flat directions in the effective potential. No symmetry moves you along these, these are pure moduli. For a simple example, consider the statistical mechanics of a nonrelativistic 3d membrane, doing Langevin dynamics along a fourth dimension, and stuck in a potential which is not translationally invariant, but has compact support. The stochastic equation for this has superselection sectors, defined by the configuration of the membrane at infinity, which are not related by symmetry.

There is no substitute for path integration, because it actually constructs all the operators properly, and there would be no obstacle to rigor if you do rigorous mathematics correctly, that is, allowing probability arguments as equal citizens of the universe of mathematical arguments. The avoidance of probability can be traced to the probability paradoxes of the early 20th century, which involved constructions of non-measurable sets. These are avoided in modern set theories, because we have known how to construct probabilistically consistent set theories since Solovay. In these universes, measure theory simplifies, and there are no headaches in making arguments about infinite dimensional measures which you can define algorithmically. The only successful rigorous constructions of field theory, ignoring 2d where there are easier methods, from the Symanzik-inspired work onward, were heavily path-integral based.

The methods people are using to avoid path integrals are exactly analogous to the contortions people took in the 17th century to avoid infinitesimal calculus, and they are just as wrong. You want a formalism to define the path integral, not a formalism to sidestep it. For real actions, so the case of QCD, Wilson provided a sketch for such a formalism, whose only obstacles are technicalities, and the number of steps required to make it rigorous is less than the number of steps required for any substitute, because the AQFT stuff is ultimately empty without the examples that come from path integrals.

I know I will not persuade you, but in this case, it is not me that has an irrational fear of AQFT, it is the AQFT practitioners that have an irrational fear of path integrals, because the naive interpretations don't work. That's not a problem with path integrals, it is a problem with naive interpretations and with rigorous mathematics. A proper theory of measures, filtrations, and probabilistic sampling theory should have no problem with either infinite volume lattice measures or the limiting arguments that define their distributional continuum limits.

Except in 2D where there are localization techniques, path integral techniques do not construct a Hilbert space. It is just expected to be there by analogy, but no one ever found a mathematically convincing argument proving it. Hence everything remains questionable from a rigorous point of view. Of course, by sacrificing rigor one arrives far more quickly at conjectures and approximate numerical results, and this is why it is used by the majority of theoretical physicists - but it remains just that: conjectures and approximations.

No interpretation of the path integral has been found to work rigorously in 4D, not even in the probably simplest case - that of Yang-Mills theory. Otherwise the Clay millennium problem wouldn't have been posed, or would have been settled immediately after the prize was set. But you saw yourself how far away the intuition about lattice YM and path integrals is from a proof that it actually works. Balaban's work required hundreds of pages to make part of the intuition rigorous, and to continue the path he paved requires overcoming obstacles that impeded even the slightest progress in the 20 years since his work.

Some of the proper theory of measures, filtrations, and probabilistic sampling theory that you are asking for (and seem to find easy enough to try your own hand at) was created by the Fields medalist Hairer, but even he managed only to cover the superrenormalizable case - which means QFT in 3D (where rigorous constructive results were already known before him) rather than the challenging 4D case.

You don't need to persuade me, but rather the community of mathematical physicists, who set the standards.

The path integral approach simply ignores these difficulties by lowering the standards to handwaving arguments. It is like doing analysis at the level of the 18th and early 19th century, before Riemann and Lebesgue showed how to set up analysis according to modern standards of rigor.

@ArnoldNeumaier: I disagree with you fundamentally, and in a way that produces in me a sort of irrational rage. But the most productive way to channel this anger is to do some rigorous nonsense.

But first, I have to say that the path integral constructs the entire theory in those cases where you can take the continuum limit by finding a second order transition point in the lattice approximations. When it has this property, it is not at all at the level of a useful heuristic; it is a limiting conception for statistical sampling, exactly analogous to the limiting conceptions for deterministic processes in calculus. In cases where we can find the second order point and see the flows away from it, the path integral must be well defined. But, unfortunately, we as human beings are not adept at turning the limiting procedure into a rigorous proof, because of limitations in our current formal methods of proof. This is due to a flaw in our methods of proof, which I will explain below; there is no chance in heck that the limiting procedure doesn't work.
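To be concrete about what "limiting conception" means here (this is just the standard Wilsonian bookkeeping, not specific to any particular model):

$$a(g) \;=\; \frac{\xi_{\mathrm{phys}}}{\xi_{\mathrm{latt}}(g)} \;\longrightarrow\; 0 \qquad \text{as } g \to g_c,\ \ \xi_{\mathrm{latt}}(g)\to\infty,$$

where $\xi_{\mathrm{latt}}(g)$ is the correlation length in lattice units, a physical correlation length $\xi_{\mathrm{phys}}$ is held fixed, and $g_c$ is the second order point, so the lattice spacing $a(g)$ in physical units goes to zero as the bare coupling is tuned to criticality.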

The difficulties in proving anything about this limiting procedure for path integrals in this case are largely a product of human social evolution regarding mathematics, of which rigorous arguments we are adept at and accept as easy to formalize. Outside of logic, especially in geometry and probability, there is very little mathematics which is completely rigorous in the sense of actually giving a reasonable sketch of what the formal proof of the results would look like in a system like ZFC (the exceptions are textbook proofs of very simple foundational results, like the Hahn-Banach theorem, and these theorems have been input into a proof verifier). Many results in the literature are rigorous by tacit convention. Although it is clear to all that the embedding into a formal system would eventually succeed, it would take a major research effort were you to try to input Perelman's proofs, or Smale's, into Coq. This doesn't cause problems because it is just understood in mathematics that certain arguments could be made rigorous if enough effort were put in. In a similar way, it is understood in physics that path integrals emerging from second-order phase transitions could be made rigorous if enough effort were put in, even though the efforts to date have been substandard.

I identify one of the reasons for this difficulty below: it is the intuitive model people use for set theory. Within this framework, a simple paradox can be demonstrated as follows:

Consider R as a vector space over Q, and pick a basis for it (the existence of such a basis is guaranteed in ZFC). Now pick a Gaussian random real x of unit variance, and an independent identically distributed Gaussian random real y, and consider z=(3x+4y)/5. Now, x, y, z are all identically distributed Gaussian random variables of unit variance (z depends on x and y). Consider now the probability P(n) that x is made up of n basis elements. The probability that y is made up of m basis elements is also P(m), because y is identically distributed as x. The probability that x and y share basis elements is zero, because if you pick y first, the finite dimensional Q vector space spanned by y's basis elements is countable and has zero measure, so x lies outside it. So z is made up of m+n basis elements.

But z also must have probability distribution P(r) to be made up of r basis elements! This is a contradiction, because r is the sum of m and n, and it is impossible for the probability distribution of a sum of two IID integer random variables to be the same as the distribution of each summand. To see this, consider that m could be 1, and then n could be 1, but then r must be at least 2, and so on, for whatever lower bound you put on the first integer value n where P(n) is nonzero.
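Spelled out in formulas (the notation $N(\cdot)$ for the number of basis elements appearing in a number's decomposition is just shorthand for the above):

$$\operatorname{Var}(z) = \tfrac{9}{25}\operatorname{Var}(x) + \tfrac{16}{25}\operatorname{Var}(y) = 1, \qquad N(z) = N(x) + N(y) \ \text{almost surely},$$

so the common distribution $P$ would have to satisfy the convolution identity $P(r) = \sum_{m+n=r} P(m)\,P(n)$, which is impossible: if $n_0 \ge 1$ is the least integer with $P(n_0) > 0$, the right-hand side vanishes for every $r < 2n_0$, and in particular at $r = n_0$.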

This is a paradox, a conflict between two principles. One of the principles is measurability, the ability to choose numbers at random and speak about them meaningfully as elements of arbitrary sets in the universe, i.e. to apply arbitrary set operations to random variables. The principle it conflicts with is set theoretic choice applied uncountably many times, which allows you to inductively pick a basis for R over Q. The simplest contradiction is "what probability does a random number x have of belonging to a Vitali set". So one of these two principles has to go, and whichever path you choose determines which class of arguments become easy and which class of arguments become hard.

The path I personally choose is the nonstandard one: I choose to make probability work properly, and so I declare that this argument is a rigorous proof that R does not have a basis as a vector space over Q. But this becomes a rigorous proof in what formal system? Certainly not in ZFC! The opposite is true in ZFC. In this conception, to be perfectly formally rigorous, you need a set theory where every subset of the interval [0,1] is measurable. Solovay provided the natural forcing construction for such models in 1972; Solovay's forcing is simply a formalization of the logical statements that give the properties of randomly chosen numbers. The basic forcing theorem shows that one can construct these set theories with adjoined random numbers, and further, by adjoining a set to represent the universe (which is the inaccessible cardinal) and suitably truncating the extension, we can produce a set theory where every subset of R is measurable. So we know that these set theories are perfectly good logically, and in terms of logical strength, they are equiconsistent with a standard, not particularly large, large-cardinal extension of ZFC, so these set theories are equally well defined as standard set theory.
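For reference, the consistency statement usually quoted from Solovay's paper is

$$\operatorname{Con}\bigl(\mathrm{ZFC} + \text{"there is an inaccessible cardinal"}\bigr) \;\Longrightarrow\; \operatorname{Con}\bigl(\mathrm{ZF} + \mathrm{DC} + \text{"every set of reals is Lebesgue measurable"}\bigr),$$

with dependent choice (DC) retained so that ordinary analysis and the usual countable limit arguments still go through.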

But there is another path you can go on, which is to declare that this argument is invalid as a rigorous proof, because you are not allowed to speak about the random variables x and y as real numbers with definite values. Rather, if you want your argument to be rigorous, you are only allowed to ask questions of the form: what is the probability that x and y are members of a certain restricted universe of sets, the measurable sets. Then the proof above, instead of showing that R does not have a basis over Q, shows that the sets of reals with n basis elements are not measurable. This means that the operation "decompose x into a basis" does not make sense for a random variable, even though it makes sense for any real number. The random variables are simply second-class citizens of the real number line: if you have a function from R to another set S, it can't automatically be applied to a random variable.

This second path is unfortunately the path that was taken historically. This means that the natural arguments that people make with probability immediately became void, because you needed to go through a slog to show that the operations involving infinitely many choices are the kind that produce measurable sets. The struggle with integration and probability in the 1930s and 1940s was trying to make as much of the intuitive arguments of probability go through in this terribly contorted framework, where random variables are not allowed to have values, and certain operations which can be performed on real numbers, like decomposing into a basis over Q, cannot be performed on random variables, while every natural operation on real numbers, i.e. any operation you can define by predicates not involving choice functions, can be carried out.

The boundaries of rigorous arguments are therefore defined by which results of intuitive probability have to date managed to sneak past this artificial barrier. Not all the intuitive results have made it, and this makes it a pain to do probability.

To give the example where this shows up in Balaban's work, there is a notion of cluster expansion on sublattices which defines the cluster coefficients of the probability measure on a coarse lattice from the probability measure on a fine lattice. The cluster coefficients in the case of the Ising model are the coefficients of the 2n-spin products in the statistical Hamiltonian; these coefficients form a Banach space, and under iteration of RG transformations they flow away from the Wilson-Fisher fixed point, and this is how you establish the existence of the critical long-distance limit of the 3d $\phi^4$ theory. The short-distance limit is controlled by the free field behavior at short distances, or the $\lambda=0$ repelling point.
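As a cartoon of this fixed point structure, and nothing more, here is a one-coupling toy recursion; the blocking factor, one-loop coefficient, and starting value are made-up numbers, and this is of course not Balaban's Banach-space cluster expansion, just an illustration of flow between a repelling free fixed point and an attracting nontrivial one.

```python
# Toy one-coupling RG recursion u' = b**eps * u - c * u**2 (illustration only;
# b, eps, c are invented numbers, not coefficients from any actual construction).
def rg_step(u, b=2.0, eps=1.0, c=1.0):
    """One blocking step: linear growth away from u = 0, nonlinear saturation."""
    return b**eps * u - c * u**2

u = 1e-3          # start near the free (Gaussian) fixed point u = 0, which repels
trajectory = []
for _ in range(30):
    u = rg_step(u)
    trajectory.append(u)

# The flow leaves u = 0 and settles at the nontrivial fixed point
# u* = (b**eps - 1) / c = 1.0, playing the role of the Wilson-Fisher point here.
print(trajectory[:5], "...", trajectory[-1])
```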

Because the probability theory is fundamentally damaged in the way I described, this cluster expansion is only defined in Balaban and Jaffe for finite volume lattices. The coefficients in this expansion don't care at all about the volume of space-time when the volume is large; the probability measure is defined completely uniformly in the infinite volume limit, with absolutely no change in the coefficient map involved. But you can't speak about the integration theory on this space in the Balaban and Jaffe formalism, not without explicitly taking the infinite volume limit at the end, because you are not allowed to speak about the measure on infinite lattices in the same way as on finite lattices. I hate the Balaban and Jaffe constructions because they are restricted to finite lattices in this artificial way. If mathematics had developed probability properly, the cluster expansion would be defined uniformly on infinite and finite lattices both.

This obstacle is NOT the main problem in using Balaban's work to prove real existence in finite volume and a real mass gap (although it is really why Balaban's formalism isn't set up to do it--- he considers finite volume transformations so that his integrals are finite dimensional, due to the "technical difficulties" of starting with infinite volume--- the technical difficulties are the measures on infinite lattices, and they don't exist in a measurable universe). The main problem is that Balaban simply proves stability bounds, meaning that the coefficients K wander around in a compact set under RG transformations, so he is only interested in the continuum limit without knowing anything about the long-distance properties of this limit. This allows you to more or less formally prove something along the lines of the existence of a continuum limit, in a certain weak sense, but it doesn't prove anything about the infrared limit, let alone the correct result that all the coefficients of the cluster expansion converge to 0 at long distances, so that the long wavelength lattice action is 0. But although Balaban doesn't prove this, it is certainly true.

Hairer's stuff does not solve this problem of infinite volume measures; it sidesteps it, by using results from the field of stochastic PDEs. In stochastic PDEs, mathematicians, as far as I can see, by tacit convention simply never worry about measure paradoxes, simply because they can "see" that with a local differential equation where there is a local solution for real numbers, then "obviously" there is a local solution for random variables, ignoring the fact that you aren't allowed to do the same things with random variables as with real numbers in standard set theory.

If the differential equation is local, or has controlled propagation with good real-valued bounds, it doesn't matter to them that there is an infinite volume somewhere else, so they simply extend all results to infinite volume whenever the real-number theorems work, without bothering to make the argument rigorous in the set theoretic sense! If they wanted to do that, they would need to prove rigorously that the sets involved in their constructions are measurable, or they would end up with variations on the same paradox.

There is nothing in Hairer's work which constructs the filtrations, or any of the machinery required for understanding infinite volume limits in a measure theoretic sense. He is rather working in a field where the conventions for rigor simply ignore set theoretic difficulties, and he can focus only on the real issues, namely the estimates for proving that the solutions converge as the regulator is relaxed. There is nothing wrong with that, of course; non-measurable sets are an abomination. But I think it is important to redo the set theory so that these issues disappear forever.

I should say that the conventions in mathematical physics also largely ignore the set theoretic difficulties in making the arguments fully rigorous, precisely because it is too much of a hassle to treat random variables as having restricted operations, and to worry about proving that sets are measurable. People sometimes even argue using cluster expansions in infinite volume, thinking it is "obviously" the limit of the finite volume case (this is true) without really worrying about whether the cluster expansion is well defined, and the question of whether an argument involving probability will be considered rigorous or not is more or less a political crapshoot as far as I can see, depending on who you cite and what kind of language you prefer to use.

If you agree to ignore issues of measurability, then one can explain the limiting procedures in path integrals extremely straightforwardly. You define the Monte Carlo algorithm which picks a random field; this is an algorithm consuming random bits, so it defines a map from [0,1] into a space of lattice fields. The existence of a second order point guarantees a continuum limit, and the existence of an infrared attractor with 0 action guarantees a mass gap.
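To make "a map from random bits into lattice fields" concrete, here is a minimal Metropolis sketch for a scalar field on a 2d periodic lattice; the lattice size, hopping parameter, and quartic coupling are arbitrary illustrative numbers, and the case actually under discussion (4d gauge theory) would use link variables rather than site variables.

```python
# Minimal Metropolis sketch: a stream of random bits -> lattice field configurations.
# Illustration only; L, kappa, lam are arbitrary, and 4d gauge theory needs links.
import numpy as np

rng = np.random.default_rng(0)   # the source of "random bits"

L, kappa, lam = 16, 0.3, 0.5     # lattice size, hopping parameter, quartic coupling
phi = np.zeros((L, L))           # scalar field on a 2d periodic lattice

def local_action(phi, i, j, value):
    """Contribution to the action from site (i, j) if that site held `value`."""
    nbrs = (phi[(i + 1) % L, j] + phi[(i - 1) % L, j]
            + phi[i, (j + 1) % L] + phi[i, (j - 1) % L])
    return -2.0 * kappa * value * nbrs + value**2 + lam * (value**2 - 1.0)**2

def sweep(phi):
    """One Metropolis sweep through the whole lattice."""
    for i in range(L):
        for j in range(L):
            new = phi[i, j] + rng.uniform(-1.0, 1.0)
            dS = local_action(phi, i, j, new) - local_action(phi, i, j, phi[i, j])
            if dS < 0 or rng.random() < np.exp(-dS):
                phi[i, j] = new

for _ in range(100):
    sweep(phi)
print("mean field after 100 sweeps:", phi.mean())
```

The continuum and infinite volume limits are then statements about how such sampling maps behave as the couplings are tuned to the second order point, which is the sense of "limiting conception" above.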

The main issue I had with turning this rigorous using Balaban's method is that Balaban does not use gross link-variables; rather, he averages over tubes of neighboring link-paths. If this is necessary, the argument becomes exceedingly complicated, because you can't average over neighboring paths in a gauge invariant way. I am not sure if this is necessary or not.

@ArnoldNeumaier: The above comment is designed to be simply a response to your infuriating claim that the path integral is still only at the level of a useful heuristic. This claim is false after Wilson, despite the difficulties that remain in the rigorous construction of gauge theory in 4d. Your further claim is that the length of Balaban's argument shows that the amount of work required to make Wilson's arguments formal is exceedingly large. This is not precisely true, although it is true of the work done to date, because the length of Balaban's argument is not determined by the quality of Wilson's argument. It is mostly determined by the requirement of working in finite volume with a defined cluster expansion, and the separate requirement in Balaban's method of averaging over neighboring flux tubes to define the renormalized link-matrix. These things expand the length of the argument inordinately, and weaken the conclusion, without changing the fundamental Wilsonian insight at all, or its correctness. In the 3d scalar field case, where there are no link-variables, Balaban's method is simple and effective, and the difficulties introduced in 4d gauge theory are technical and have nothing to do with the soundness of the method in principle.

I don't want to point out a problem without a solution, so when I ask a self-answered question regarding this, the proper question should be "what is an appropriate set theory to embed a measure theory which treats infinite volume lattices on the same footing as finite volume lattices", or something along these lines. I have thought about constructing this set theory for a while, usually fruitlessly, but I now suspect I can give a list of axioms for this theory. If you do it right, it will be useful not just for physics, but for pure logic, to allow different models of set theory to coexist inside a universal set formalism which is completely model-agnostic, which does not make any axiomatic statements that favor one type of set-theory model over another. This requires a careful formulation, to avoid paradoxes and to prove equiconsistency with standard theories, so I am still hesitant for a short time. When I am more confident of the proper forms of the axioms, I will post it. But you can't just post axioms and expect to be taken seriously: you need to show how to embed standard models of ZFC, like V=L, so that all the classical results can be used in such a system (with proper qualification and interpretation), you need to make sure that the system is at least as intuitive as standard set theory, and that nonstandard models like Solovay's are equally easy to embed and to work with simultaneously and uniformly. Then the integration theory on the reals would allow you to apply any operation on real numbers to random variables. Most of the real mathematical work was already done by Solovay in the original paper, so this is not likely of interest to logicians, who already know that Solovay's model solves the intuitive issues with probability and set theory coexisting. But that doesn't mean that the result has escaped from the logic ghetto.

See also this question.
