# Rigor in quantum field theory

+ 11 like - 0 dislike
305 views

Quantum field theory is a broad subject and has the reputation of using methods which are mathematically dubious. For example, working with and subtracting infinities, or the use of path integrals, which in general have no mathematical meaning (at least not yet), etc. My question is a little vague, but I am interested in hearing what the status of rigor in QFT is. What is known to be mathematically rigorous and consistent, and what is known not to be rigorous? Any examples and references are welcome.

Added: Just to clarify, by rigorous I meant anything that a mathematician would find satisfactory. Also, my question wasn't a request for books with a rigorous (in some sense) approach, although those were welcome. It was about specific examples of what is considered mathematically satisfactory and what is not. For example, the quantization of free fields satisfying the Klein-Gordon equation can be done rigorously. There is no mathematical definition in general of the Feynman path integral, and so on.
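To make the free-field example concrete, here is the standard textbook construction in outline (standard notation, not taken from the post): the free scalar field of mass $m$ is an operator-valued distribution on Fock space,

```latex
\phi(x) = \int \frac{d^3k}{(2\pi)^3}\,\frac{1}{\sqrt{2\omega_{\mathbf{k}}}}
\left( a(\mathbf{k})\, e^{-ik\cdot x} + a^\dagger(\mathbf{k})\, e^{ik\cdot x} \right),
\qquad \omega_{\mathbf{k}} = \sqrt{\mathbf{k}^2 + m^2},
```

with $[a(\mathbf{k}), a^\dagger(\mathbf{k}')] = (2\pi)^3 \delta^3(\mathbf{k}-\mathbf{k}')$. The smeared fields $\phi(f) = \int \phi(x) f(x)\, d^4x$, for test functions $f$, are densely defined operators satisfying the Wightman axioms; this is the sense in which the free Klein-Gordon field is rigorous.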

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user MBN
Given that path integrals are a system for manipulating symbols that follow well defined rules, it would seem that "mathematical meaning" means "thought up by a mathematician"? Or am I just being catty?

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user dmckee
I don't know. Is that the case? Are the rules well defined? In what I have looked at, there was always talk about a measure, which does not exist the way they wanted it to.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user MBN
@Moshe: one definition would be that you can (in principle) let a computer program verify your proof. This can indeed be done for rigorous mathematical statements, and it is believed that most published proofs can be converted into that form. By contrast, what physicists do is often just hand-waving, far from mathematical standards. In my opinion, rigor is pretty well-defined and also quite ignored in physics :)

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Marek
That's an impossible standard, even for mathematics, definitely in practice but also in principle (have you heard of the incompleteness theorems?). Anyhow, I stand by my prediction that there will be a wide range of opinions on what this word really means.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user user566
@Moshe: what does incompleteness have to do with any of this? I did not say that everything can be proved. I just said that everything that mathematicians care about and believe has been proved in some paper can indeed be proved from the lowest level (axioms + logical inference). Anyhow, no mathematician would agree with you. Rigor means that, at least in principle, the statement is absolutely 100% correct (and can be proved so). If there is even the slightest chance of a loophole then it's not rigorous by any means. In physics it's just these 99%-likelihood statements most of the time.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Marek
But you could turn this post into an answer by quoting part of the accepted answer there.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Tobias Kienzler
@kakaz @Tobias Stupid or not, mods are here to fix it (=

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user mbq

+ 9 like - 0 dislike

working with and subtracting infinities ... which in general have no mathematical meaning

is not really correct, and contains a common misunderstanding. The technical difficulties of QFT do not come from infinities. In fact, ideas essentially equivalent to renormalization and regularization have been used since the beginning of math--see, e.g., many papers by Cauchy, Euler, Riemann, etc. G. H. Hardy even published a book on the topic of divergent series:

http://www.amazon.com/Divergent-AMS-Chelsea-Publishing-Hardy/dp/0821826492

There is even a whole branch of math called "integration theory" (of which Lebesgue integration is a special case) that generalizes these types of issues. So having infinities show up is not an issue at all; in a sense, they show up out of convenience.
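As a toy illustration of the classical summation methods Hardy's book covers (my own sketch; `abel_sum` is a hypothetical helper, not from the book): the Abel sum of a divergent series $\sum a_n$ is the limit of $\sum a_n x^n$ as $x \to 1^-$, which assigns Grandi's series $1 - 1 + 1 - \cdots$ the value $1/2$.

```python
# Abel summation: regularize a divergent series sum(a_n) as the
# limit of sum(a_n * x**n) as x -> 1 from below.
def abel_sum(coeffs, x):
    """Evaluate sum of coeffs[n] * x**n for 0 <= x < 1."""
    return sum(c * x**n for n, c in enumerate(coeffs))

grandi = [(-1)**n for n in range(10000)]  # 1 - 1 + 1 - 1 + ...
for x in (0.9, 0.99, 0.999):
    print(abel_sum(grandi, x))  # tends to 0.5 as x -> 1
```

Which regularization you choose can matter a great deal for less tame series, which is exactly the point raised in the comments about Hardy.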

So the idea that infinities have anything to do with making QFT axiomatic is not correct.

The real issue, from a more formal point of view, is that you "want" to construct QFTs via some kind of path integral. But the path integral, formally (i.e., to mathematicians), is an integral (in the general sense that appears in topics like "integration theory") over a rather pathological-looking infinite-dimensional LCSC function space.

Trying to define a reasonable measure on an infinite-dimensional function space is problematic (and the general properties of these spaces don't seem to be particularly well understood). You run into problems like all the reasonable sets being "too small" to have a measure, worrying about the measures of pathological sets, worrying about what properties your measure should have, worrying whether the "$\mathcal{D}\phi$" term is even a measure at all, etc...

At best, in trying to fix this problem, you'd run into a situation like the one with the Lebesgue integral: the definition gives you some mathematically interesting properties, but most of its utility is in letting you manipulate the Riemann integral in the ways you already wanted to. Actually calculating integrals from the definition of the Lebesgue integral is not generally easy. This isn't really enough to attract the attention of many physicists: we already have a definition that works, and while knowing all of its formal properties would be nice, and would certainly tell us some surprising things, it's not clear that it would be all that useful in general.
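The contrast is clearest in the one case where everything is finite-dimensional. On a finite lattice, the Euclidean "path integral measure" $e^{-S[\phi]}\mathcal{D}\phi$ for a free field is an ordinary Gaussian measure on $\mathbb{R}^N$, and all the measure-theoretic worries evaporate; only the continuum/infinite-volume limit is hard. A sketch with hypothetical parameters:

```python
import numpy as np

# On a finite lattice, exp(-S[phi]) D[phi] for a free field is just a
# Gaussian on R^N with covariance A^{-1}, A = discretized (-d^2/dx^2 + m^2).
N, m, a = 16, 1.0, 1.0          # sites, mass, lattice spacing (hypothetical)
A = np.zeros((N, N))
for i in range(N):              # periodic second-difference operator + mass
    A[i, i] = 2.0 / a**2 + m**2
    A[i, (i + 1) % N] = A[i, (i - 1) % N] = -1.0 / a**2
cov = np.linalg.inv(A)          # exact two-point function <phi_i phi_j>

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(N), cov, size=20000)
est = samples.T @ samples / len(samples)   # Monte Carlo <phi_i phi_j>
print(np.max(np.abs(est - cov)))           # small: sampling error only
```

Everything here is unambiguous; the open problems all live in the $N \to \infty$ limit.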

From an algebraic point of view, I believe you run into trouble with trying to define divergent products of operators that depend on renormalization scheme, so you need to have some family of $C^*$-algebras that respects renormalization group flow in the right way, but it doesn't seem like people have tried to do this in a reasonable way.

From a physics point of view, we don't care about any of this, because we can talk about renormalization and demand that our answers have "physically reasonable" properties. You can do this mathematically, too, but the mathematicians are not interested in merely getting a reasonable answer; what they want is a set of "reasonable axioms" from which the reasonable answers follow, so they're doomed to run into technical difficulties like the ones I mentioned above.

Formally, though, one can define non-interacting QFTs and quantum mechanical path integrals. It's probably the case that formally defining a QFT is within reach of what we could do if we really wanted to, but it's just not a compelling topic to the people who understand how renormalization fixes the solutions to the physically reasonable ones (physicists), and the formal aspects aren't well enough understood for the formalism to come "for free."

So my impression is that neither physicists nor mathematicians generally care enough to work together to solve this problem, and it won't be solved until it can be done "for free" as a consequence of understanding other things.

Edit:

I should also add briefly that CFTs and SCFTs are defined much more carefully mathematically, and so a reasonable alternative to the classic ideas I mentioned above might be to start with an SCFT and define a general field theory as some kind of "small" modification of it, done in such a way as to keep just the right things well-defined.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Mr X
answered Mar 9, 2011 by (200 points)
I've got Hardy's book and would quote it against what you've said (I just don't have it with me). Hardy was a good mathematician and knew that how you choose to "regularize" a divergent series drastically affects the resulting sum. The reason QFT gets away with it is that there's an underlying assumption that the functions involved are complex and analytic.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Carl Brannen
Yes, that's part of what I meant by saying we want to fix our answers against "physically reasonable" solutions. Although complex analytic is actually too strong an analyticity property in general for us, and you do need some extra technical assumptions to make sure things are "physically reasonable." But worrying about properties in terms of analyticity is problematic from the infinite dimensional POV (think about the topological and measure theoretic properties of analytic subsets of these infinite dimensional LCSC spaces).

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Mr X
Also, the space of paths you integrate over is the Brownian-motion-like one, whose paths aren't differentiable anywhere. But you still run into problems because spaces other than the obvious one are pathological ;). I believe you can approach ODEs and PDEs from this point of view (I don't know if much has been done with this, because it's a pretty perverse thing to do), but thinking about them brings up a whole host of problems that are only worse in this case from an analytic POV.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Mr X
very good summary; although I have to say that I find it extremely sad and discouraging when I hear bright physicists say stuff like "From a physics point of view, we don't care about any of this, because we can talk about renormalization, and demand that our answers have physically reasonable properties...but the mathematicians are not interested in getting a reasonable answer". This might be right from a numerical (maybe numerological?) perspective, but it's the completely wrong mindset to begin with. Mathematical consistency (or a clear pathway to it) is never a luxury. Avoiding it is

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user lurscher
Although, speaking as a mathematician, I feel I must correct you: the proper phrase is measure theory not "integration theory".

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Alex Nelson
+ 7 like - 0 dislike

First: there is no rigorous construction of the standard model, rigorous in the sense of mathematics (and no, there is not much ambiguity about the meaning of rigor in mathematics).

That's a lot of references that Daniel cited, I'll try to classify them a little bit :-)

Axiomatic (also called local or algebraic) QFT tries to formulate axioms for the Heisenberg viewpoint (states are static, observables are dynamic). Three sets of axioms are known:

Roughly, the Wightman axioms describe how fields relate to observables, the Osterwalder-Schrader axioms are the Wightman axioms for Euclidean field theory, and the Haag-Kastler axioms dodge fields entirely and describe the observables per se. All three sets of axioms are roughly equivalent, meaning that the equivalence has been proven, sometimes with additional assumptions that physicists deem to be irrelevant.

"PCT, Spin and Statistics, and All That" was the first introduction to the Wightman axioms.

"Local Quantum Physics: Fields, Particles, Algebras" is an introduction to the Haag-Kastler axioms, as is "Mathematical Theory of Quantum Fields".

"Perturbative Quantum Electrodynamics and Axiomatic Field Theory" is a description of QED from the viewpoint of the Haag-Kastler axioms.

"Introduction to Algebraic and Constructive Quantum Field Theory" is about the quantization of given classical equations in the spirit of Haag-Kastler.

"Quantum Physics: A Functional Integral Point of View" uses the Osterwalder-Schrader axioms.

2D conformal field theory can be axiomatized using the Osterwalder-Schrader axioms, for example.

Functorial quantum field theory axiomatizes the Schrödinger viewpoint; see e.g. the nLab on FQFT.

This includes, for example, topological quantum field theories, which essentially describe theories with finitely many degrees of freedom. This branch has had a lot of impact in mathematics, especially with regard to differential geometry, and in particular the theory of 3D and 4D smooth manifolds. I'd put

Daniel S. Freed, Karen K. Uhlenbeck: "Geometry and Quantum Field Theory"

in this category.

"Geometry and Quantum Field Theory"

Quantization of classical field theories: note that the axiomatic approaches don't depend on classical field theories that need to be quantized; they open the door to a direct construction of quantum systems without a classical counterpart. The Lagrangian approach to QFT is an example of an ansatz that starts with a classical field theory that then needs to be quantized, for which different means can be used.

Ticciati: "Quantum Field Theory for Mathematicians" is actually a quite canonical introduction to Lagrangian QFT, without much ado.

There is a lot of material about the geometry of classical field theories and variants to quantize them, like "geometric quantization".

The book Welington de Melo, Edson de Faria: "Mathematical Aspects of Quantum Field Theory" is an example of this.

Much more advanced is "Quantum Fields and Strings: A Course for Mathematicians (2 vols)"

For the path integral there are two points of view:

• The path integral - along with the Feynman rules - is a bookkeeping device for a game called renormalization, which lets you calculate numbers according to arcane rules,

• the path integral is a mathematical construct like a "measure" - but not a measure in the sense of measure theory known today - that needs to be discovered and defined appropriately.

AFAIK there has not been much progress with the second viewpoint, but there are people working on it, for example the authors of the book "Mathematical Theory of Feynman Path Integrals: An Introduction". You can find a lot more material about the mathematical theory of path integrals on the nLab here.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Tim van Beek
answered Mar 9, 2011 by (745 points)
I thought the Osterwalder-Schrader axioms described the Euclidean path integral approach...not the Heisenberg picture. Also, there are some ambiguities in quantizing a classical field (even in quantum mechanics there are ambiguities in the quantization procedure; see, e.g., the Groenewold-van Hove "no-go" theorem).

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Alex Nelson
+ 7 like - 0 dislike

Here is my answer from a condensed matter physics point of view:

Quantum field theory is a theory that describes the critical point (massless QFT) and the neighborhood of the critical point (massive QFT) of a lattice model. (Lattice models do have a rigorous definition.)

So to rigorously define/classify quantum field theories is to classify all the possible critical points of lattice models, which is a very important and very hard project.

(One may replace "lattice model" in the above by "non-perturbatively regulated model")
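To illustrate this correspondence in the simplest solvable case (my sketch, not part of the answer): the transfer matrix of the 1D Ising model gives a correlation length $\xi$ that diverges as the coupling $K \to \infty$, the model's $T = 0$ critical point; the continuum field theory describing the neighborhood of the critical point has mass $\sim 1/\xi$ in lattice units.

```python
import numpy as np

# 1D Ising chain: transfer matrix T = [[e^K, e^-K], [e^-K, e^K]].
# The correlation length xi = 1/ln(lambda_1/lambda_2) diverges as K -> inf,
# which is this model's critical point; the QFT "mass" is 1/xi.
def correlation_length(K):
    T = np.array([[np.exp(K), np.exp(-K)],
                  [np.exp(-K), np.exp(K)]])
    lam = np.sort(np.linalg.eigvalsh(T))[::-1]   # lambda_1 >= lambda_2 > 0
    return 1.0 / np.log(lam[0] / lam[1])

for K in (0.5, 1.0, 2.0, 4.0):
    print(K, correlation_length(K))  # grows rapidly toward criticality
```

This reproduces the known closed form $\xi = -1/\ln\tanh K$, so the numerics can be checked against the exact answer.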

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Xiao-Gang Wen

answered May 30, 2012 by (3,475 points)
edited Apr 5, 2014
Thanks. Can you point out a general exposition or overview article about lattice models and QFT, or any source that can give me an idea?

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user MBN
This is the same answer as physics.stackexchange.com/questions/4068/…

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user mbq
+ 6 like - 0 dislike

There are several books that approach QFT (and/or Gauge Theory) from different levels of 'mathematical rigor' (for some definition of "mathematical rigor" — that Moshe would approve ;-).

So, let me give you some sort of 'preliminary list'… it's by no means complete, and it's in no particular order either, but I think it may pave the way to further work.

In any case… there's much more out there, not only in terms of topics (renormalization, etc) but also in terms of articles, books and so on.

So, there's plenty of "mathematical rigor" in QFT (and String Theory, for that matter), including different 'levels' of it, that should please and suit varied tastes.

PS: There are other topics here that deal with this topic in a form or another, e.g., Haag's Theorem and Practical QFT Computations. So, don't be shy and take a look around. :-)

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Daniel
answered Mar 9, 2011 by (685 points)
There is not "plenty" of mathematical rigor, as the rigorous work is utter crap, barely capable of repeating the stuff that was current in the 1950s. One central problem is that mathematicians are stupid with regard to defining measures, so that the field of "measure theory" is wrong. They need to reaxiomatize the field to make every set of reals measurable before they can do path integrals, and they won't do that, so tough luck.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
+ 5 like - 0 dislike

I'll give a reference that I haven't (yet) managed to finish myself. But it looks really rigorous:

N. N. Bogoliubov, A. A. Logunov, A. I. Oksak, I. T. Todorov (1990): General Principles of Quantum Field Theory. (ISBN 0-7923-0540-X. ISBN 978-0-7923-0540-8.)

It's what I call "Bogoliubov approach" to QFT.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Kostya
answered Mar 8, 2011 by (310 points)
I'll take a look. I've heard about it, and it is the one approach I have never looked at. Not that I have read much about the others; I have a very superficial knowledge of the subject.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user MBN
This is not rigorous at all. I read it. Nothing by physicists is rigorous except for Glimm and Jaffe.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
+ 1 like - 0 dislike

I would like to point out that there are several different problems coming from different points of view on the subject. It would be very complicated to comment on all of them, so let me restrict to a particular one.

As a first remark, I have to state that nobody who works in mathematics can have any doubt about what "rigorous" means. I will not comment on this, since it seems that it was already explained in a clear manner.

Concerning your question, I would like to state that QFT is not a "unique" theory, but a bunch of several different ones which are more or less related to each other through some intrinsic descriptions. For instance, the "behavior" and construction of the (real or complex) scalar field theory and of gauge theory are rather different. This is a natural consequence of the fact that Classical Field Theory (ClFT) (which is rigorous up to some extent, even though it still contains several nontrivial problems) is also a collection of several different theories, which share a general geometrical description but have their own particular difficulties: as particular settings of ClFT we may obtain classical mechanics, electromagnetism, or even nonabelian gauge theory, etc. Let me also add that the general philosophy underlying ClFT appears, in some sense, as the only manner of constructing relativistic extensions of the free situation, in major contrast with classical mechanics, in which you may add any constraint to a free particle without breaking any fundamental principle of the theory. I'm only rephrasing what P. Deligne and D. Freed state in the first volume of "QFT and Strings for Mathematicians", which was already mentioned.

Concerning now the problem of quantizing each of the particular settings you may consider in ClFT, there are several problems to deal with. Let me consider two different aspects: perturbative and nonperturbative QFT. We may say that the former is (morally) a shadow of the latter. Moreover, perturbative QFT (pQFT) can be developed in a mathematically rigorous manner in lots of situations. You may see the article by R. Borcherds on the arXiv, "Renormalization and quantum field theory" (even though some of the ideas were already present in other texts in the literature, and, in my opinion, they are lurking behind some of the constructions and proofs by the author; see for instance the articles by O. Steinmann, which were also considered by R. Brunetti, K. Fredenhagen, etc.). In this situation he defines in a rigorous manner an object which behaves like the Feynman measure ("via the Riesz theorem"), and he gives a very complete account of how pQFT should be described in several situations. The problem remains, however, of giving a correct formulation of nonperturbative QFT. This is a major problem, and only a few rigorous constructions up to dimension 2 (also dimension 3, but really few, as far as I know; it would be nice to hear from the experts on this point) have been performed. You may see the book by J. Glimm and A. Jaffe, "Quantum physics - a functional integral point of view". In fact the major problem comes when trying to quantize gauge theory, as a subcollection of situations in QFT. The lack of such a general picture means that we actually do not know what a quantum field gauge theory really looks like (or just is, if you want).
In particular (I state this because some people argue that the following is a consequence of having only a perturbative description), two major claims of physicists about the standard model (which are in some sense related), the mass gap and quark confinement, are not proved (the former in fact constitutes one of the Millennium Prize Problems). Needless to say, none of the physical heuristic arguments is clearly sufficient.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Estanislao
answered Jun 28, 2012 by (10 points)
Mathematicians are very silly when it comes to "rigorous" regarding measure theory and this is why they are stuck. The problem starts when you have to axiomatize measure theory to define random picks. There should be no hard work involved in defining a constructive measure (a picking you can do on a computer, or a limit thereof), but there is.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
What I mean by that is the following: "A free quantum field theory: consider picking every fourier transform value f(k) of a random function to be a Gaussian with a (specific) variance $\sigma(k)$. This is the (imaginary time) quantum field." Did I just define free quantum fields? Not for mathematicians, because a random picking algorithm, no matter how convergent, does not define a measure. You need a sigma algebra to define a measure. You can't say "the measure of a set is the probability that this random function lands in the set" because this only makes sense in a Solovay universe.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
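Ron Maimon's prescription in the previous comment can be sketched on a periodic 1D lattice (hypothetical parameters; this illustrates the "random picking" itself, not the continuum limit, which is where the measure-theoretic issues live):

```python
import numpy as np

# Draw each Fourier mode of a Euclidean free field as an independent
# Gaussian with variance sigma(k) = 1/(k_hat^2 + m^2), then transform back.
N, m = 64, 1.0
n = np.arange(N)
k_hat2 = 2.0 * (1.0 - np.cos(2.0 * np.pi * n / N))  # lattice momentum squared
sigma = 1.0 / (k_hat2 + m**2)                       # mode variance (propagator)

rng = np.random.default_rng(1)
nconf = 20000
z = rng.normal(size=(nconf, N)) + 1j * rng.normal(size=(nconf, N))
phi = np.fft.ifft(np.sqrt(N * sigma) * z, axis=1).real  # real Gaussian fields

G0_est = np.mean(phi**2)    # Monte Carlo estimate of <phi(x)^2>
G0_exact = np.mean(sigma)   # exact lattice value: (1/N) * sum_k sigma(k)
print(G0_est, G0_exact)     # agree up to sampling error
```

With this normalization the sampled field's variance at a point equals the lattice propagator at coincident points, so the "algorithmic" definition and the finite-dimensional Gaussian measure agree, exactly as the comment claims they should.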
I'm not sure I understand what you're trying to say, because in my limited experience Solovay's theorem (further extended by Krivine, Shelah, etc.) is just a way of stating that the construction of non-measurable Lebesgue sets depends on the axiom of choice. All these results are far from simple, in my opinion. In any case, this discussion seems to me somewhat misleading, because which measure-like objects are needed is not completely hidden: measures are in some sense too restrictive, and prodistributions seem to be much better adapted objects, as studied by P. Cartier and C. DeWitt-Morette.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Estanislao
(2nd part) In fact, both authors have shown that the setting of prodistributions (generalizing the restricted situation given by measures or even promeasures) gives the physically desired explanations if we are working in the rather restrictive (but already interesting) situation of paths. I would like to stress, however, what I think is the main nontrivial problem: even if a measure-like theoretic formulation of npQFT may be given, it must still give an answer to the mass gap problem or quark confinement, which seems to be (really much) more than a straightforward computation.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Estanislao
I completely agree that the nontrivial problem is proving the properties of the measure, like the mass gap, and that what I am saying focuses attention on something more primitive, and so might be misleading with regard to the main point. But I am sure that any technique for proving a mass gap is one which shows the Euclidean theory has decaying correlation functions, and this is something like a probability coupling which relates the probability distribution on the fields in one regularization (lattice) to those in a coarser regularization (lattice), and takes the limit (renormalization).

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
The idea of the proof is simply that there is a logically consistent way of defining a new real at random (so not in any current countable model), and this real gives measure to everything previous, just by the probability of landing in it (this much was already known to Cohen). But adding a new real adjoins lots of new sets, but you can choose a second random real, and again everything gets a measure, but you adjoin new sets, etc. The point is that you can consistently end this process, something which is intuitively obvious, because the probabilistic picking is obviously consistent.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
Once you know probability is consistent, so that you can choose things at random without contradiction, you can do probability on any set, even a set of distributions, just by defining an algorithm which picks distributions at random. Physicists use this implicitly all the time, for constructing the Ising model on infinite lattices (for example) something which is not obvious in mathematics, because you need a ridiculous sigma algebra construction the moment the lattice is infinite. The baggage of measure theory is onerous, it blocks you from making intuitive arguments about field theory.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Ron Maimon
+ 0 like - 0 dislike

QFT's reputation for using methods which are mathematically unsound isn't really deserved these days. Certainly, not everything is under perfect analytic control, but the situation isn't really much worse than it is in fluid dynamics.

In particular, the 'subtraction of infinities' thing isn't really considered to be an issue anymore. The mathematicians who've looked at it recently (like Borcherds & Costello) have basically come to the conclusion that Wilsonian effective field theory resolves these difficulties. You can make all computations solely in terms of long-distance 'effective' quantities, which are the things left behind when physicists subtract infinities. Short-distance infinities therefore don't present a problem for defining correlation functions; there's nothing inconsistent about the basic path integral formalism.

This is really the same conclusion the constructive field theorists came to, studying lower dimensional examples in the 70s & 80s.
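The Wilsonian picture has one setting where "integrating out short distances" can be done exactly by hand, and it makes a useful toy example here (standard textbook result, not from the answer): decimating every other spin of the 1D Ising chain with coupling $K$ yields a chain with coupling $K'$ satisfying $\tanh K' = \tanh^2 K$.

```python
import math

# One exact Wilsonian RG step for the 1D Ising chain: sum over every
# other spin; the surviving spins interact with coupling K' where
# tanh(K') = tanh(K)**2.  Short-distance modes are integrated out exactly,
# leaving only the long-distance effective coupling.
def decimate(K):
    return math.atanh(math.tanh(K) ** 2)

K = 2.0
flow = [K]
for _ in range(6):
    K = decimate(K)
    flow.append(K)
print(flow)  # K flows toward 0: the 1D chain is disordered at any T > 0
```

Everything in this flow is finite at every step; the "effective" couplings are the only quantities that ever appear, which is the point of the Wilsonian resolution.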

The challenge in rigorous QFT is dealing with infrared divergences. If your spacetime has infinite volume, then your field system can have degrees of freedom of arbitrarily large size. Coupling to these degrees of freedom can give you infinities. There are real mathematical problems here, but they're more like describing the solutions of an equation than describing the equation itself. (Really non-trivial things can happen. In QCD, for example, there is confinement: many of the observables you'd naively expect to be integrable with respect to the path integral measure -- like the observable representing a free quark or a free gluon -- aren't. Instead, the integrable observables are complicated mixtures of quarks and gluons, like protons, neutrons, and glueballs.) Most of the heavy lifting in Glimm & Jaffe, for example, comes not from constructing the 2d $\phi^4$ path integral measure, but from proving that its $n$-point correlation functions actually exist.

Naturally, this means that most computations of observable expectation values -- like in lattice gauge theory -- are not under tight analytic control. Convergence in simulation is mostly a matter of good judgement, for now.

Saying anything rigorously about this stuff almost certainly will require mathematicians to get a better grip on renormalization in non-perturbative settings (i.e., on the lattice). There are a good number of mathematicians actively working on this stuff. Geometers and topologists are getting more sophisticated about topological field theory, while the analysts have taken up statistical field theory.

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user user1504
answered Jun 28, 2012 by (1,110 points)
+ 0 like - 1 dislike

Absolute lack of rigor in this respect: integration over an infinite volume or an infinite time?
It may be correct math, but it is physically incorrect.
Einstein, relativity, the light cone... should shed some light on this issue.
For instance, the computation of the Lagrangian of a single, immobile particle goes something like this: kinetic energy = 0, potential energy = a value growing in time, because the field cannot be set up in ALL of space at once (ever since the birth of the particle, the field/energy, able to produce work, has been spreading away at speed c).
We make a blind approximation when we integrate over infinity, because in this way the Lagrangian appears constant, which it is not, and this is wrong because one should always respect the limits of integration.
To make the total energy a constant value, we have to put the internal energy of the particle into the Lagrangian, and this energy must decrease through time as much as the potential energy grows through time. By this reasoning one must conclude that the matter content of particles is always decreasing. We are not able to measure this effect in the lab because... you guessed it, the lab is shrinking too.
The belief that space is expanding (without probable cause) has been widespread for ages. The alternative, matter shrinking, offers a probable cause and an explanation for the measured 'space expansion'; 'dark energy' becomes nothing at all, and the same goes for the beloved cosmological constant, the inflation period, ...
The cosmological redshift is caused by the bigger atoms of the past, not by galaxies rushing away.
Maybe relativity is closer to QM than we suspected.
"A self-similar model of the Universe unveils the nature of dark energy" formally explains this shift of paradigm (with my modest contribution).

This post imported from StackExchange Physics at 2014-04-04 15:41 (UCT), posted by SE-user Helder Velez
answered Jun 29, 2012 by (-130 points)
