# Quantum Fields From Cluster-Decomposition Principle


I would like help proving Weinberg's claim (quoted below) that quantum fields are an unavoidable consequence of merging particle-based quantum mechanics with both Lorentz invariance and the cluster-decomposition principle. I'm hoping it can be done directly using a (time-ordered) Dyson-series solution of the Schrödinger equation:

$$\Psi(t) = Te^{\frac{-i}{\hbar}\int_{t_0}^{t}H(t')dt'}\Psi_0 = T\sum_{n=0}^\infty \frac{(-i / \hbar)^n}{n!}\Big(\int_{t_0}^{t}H(t')dt'\Big)^n \Psi_0$$

The method is to expand the operator $H$ into a sum of products of creation and annihilation operators, which Weinberg says (below) involves no physics and is just a mathematical fact that applies to any operator. Assuming one is comfortable with this step (I am not), there should come a point where inserting a momentum-conservation delta function becomes unavoidable. Could someone help me with this?
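For concreteness, the kind of expansion Weinberg means can be sketched as follows (a schematic version of Weinberg, *Quantum Theory of Fields* Vol. 1, Sec. 4.2, with spin and species labels suppressed): any operator can be written as a sum of normal-ordered products of creation and annihilation operators with c-number coefficient functions,

$$H = \sum_{N=0}^{\infty}\sum_{M=0}^{\infty} \int d^3p'_1\cdots d^3p'_N\, d^3p_1\cdots d^3p_M\; h_{NM}(\mathbf{p}'_1,\ldots,\mathbf{p}'_N;\mathbf{p}_1,\ldots,\mathbf{p}_M)\; a^\dagger(\mathbf{p}'_1)\cdots a^\dagger(\mathbf{p}'_N)\, a(\mathbf{p}_M)\cdots a(\mathbf{p}_1).$$

Choosing the $h_{NM}$ appropriately reproduces any given operator; the physics enters only through the constraints one then places on these coefficient functions.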

To Motivate the Question:

As a hint that this should be possible, I came across Brown's QFT book, where he says that, once quantum field theory has been developed, the cluster-decomposition principle is just the factorization $$Z(p_1 + p_2) = Z(p_1)Z(p_2)$$ of probability amplitudes, where $p_1$ and $p_2$ are source functions, by which he means $$Z(p) = \int [d \psi] \exp\Big(\tfrac{i}{\hbar}\int d^n x \, ( \mathcal{L} + p_0 \psi)\Big)$$

(I don't know why he writes $p_0$ and not $p$, and I do not see how doing something like $p_0 = p_1' + p_2'$ allows for this factorization anyway).

Something about the decomposition should naturally motivate inserting delta functions into that $H$ so as to guarantee the factorization $Z(p_1 + p_2) = Z(p_1)Z(p_2)$, and should imply quantum fields, but I do not see how manipulating the $H$ in $\exp (-\tfrac{i}{\hbar}\int Hdt)\Psi_0$ leads to anything like $Z(p_1 + p_2) = Z(p_1)Z(p_2)$. Encouragingly, nothing in the derivation of the Schrödinger equation or this Dyson-series solution tells us whether we are working with particles or fields. Could someone help me with this?
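For reference, the constraint Weinberg alludes to (his Vol. 1, Sec. 4.3) is that, when $H$ is written as a power series in $a^\dagger$ and $a$ with coefficient functions $h_{NM}$, cluster decomposition holds precisely when each coefficient contains exactly one overall momentum-conservation delta function,

$$h_{NM}(\mathbf{p}'_1,\ldots;\mathbf{p}_1,\ldots) = \delta^3\!\Big(\sum_{i=1}^{N}\mathbf{p}'_i - \sum_{j=1}^{M}\mathbf{p}_j\Big)\, \tilde h_{NM}(\mathbf{p}'_1,\ldots;\mathbf{p}_1,\ldots),$$

with $\tilde h_{NM}$ a sufficiently smooth function containing no further delta functions. This is the delta function whose unavoidable appearance the question is after.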

Origin of the question:

In Weinberg's essay, "What is Quantum Field Theory, and What Did We Think It Is?", he talks about the rationale for QFT:

> In the course of teaching quantum field theory, I developed a rationale for it, which very briefly is that it is the only way of satisfying the principles of Lorentz invariance plus quantum mechanics plus one other principle. Let me run through this argument very rapidly. The first point is to start with Wigner's definition of physical multi-particle states as representations of the inhomogeneous Lorentz group. You then define annihilation and creation operators $a(\vec{p}, \sigma, n)$ and $a^\dagger(\vec{p}, \sigma, n)$ that act on these states. There's no physics in introducing such operators, for it is easy to see that any operator whatever can be expressed as a functional of them. The existence of a Hamiltonian follows from time-translation invariance, and much of physics is described by the S-matrix... This should all be familiar. The other principle that has to be added is the cluster decomposition principle, which requires that distant experiments give uncorrelated results. In order to have cluster decomposition, the Hamiltonian is written not just as any functional of creation and annihilation operators, but as a power series in these operators with coefficients that (aside from a single momentum-conservation delta function) are sufficiently smooth functions of the momenta carried by the operators.

An interesting consequence:

Is the Lagrangian formulation just a mathematical construct allowing one to actually enact the decomposition of $H$ into creation and annihilation operators and this weird delta function, and nothing more? In other words, there may be other ways to do the same thing.

References:

• Weinberg, Quantum Theory of Fields, Vol. 1, Ch. 4.
• Brown, Quantum Field Theory, Ch. 6.

This post imported from StackExchange Physics at 2015-03-03 15:13 (UTC), posted by SE-user bolbteppa

asked Feb 26, 2015
edited Mar 3, 2015

## 1 Answer


First of all, a formula of the form $Z(p_1+p_2)=Z(p_1)Z(p_2)$ is invalid in general. The correct formula resembling this is

$$\lim_{y\to\infty} Z(J_1+\tau_y J_2)=Z(J_1)\,Z(J_2),$$

where $\tau_y$ is space-time translation by $y$.
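As a toy illustration of this limit (my own sketch, not from the references): for a free Euclidean field on a lattice the generating functional is Gaussian, $\log Z(J) = -\tfrac12 J^{T}\Delta J$, and the cross term $J_1^{T}\Delta\,\tau_y J_2$ vanishes as the translation $y$ grows, because the propagator decays; hence $Z(J_1+\tau_y J_2)\to Z(J_1)Z(J_2)$. The exponential kernel below is an assumed stand-in for a massive propagator.

```python
import numpy as np

# 1D lattice of N sites; Delta[i, j] ~ exp(-m * |i - j|) mimics the
# exponential decay of a massive Euclidean propagator.
N, m = 400, 0.5
idx = np.arange(N)
Delta = np.exp(-m * np.abs(idx[:, None] - idx[None, :]))

def logZ(J):
    """log of the Gaussian generating functional, log Z(J) = -1/2 J^T Delta J."""
    return -0.5 * J @ Delta @ J

# Two localized sources; tau_y translates J2 by y lattice sites.
J1 = np.zeros(N); J1[10:15] = 1.0
J2 = np.zeros(N); J2[10:15] = 1.0

def cluster_defect(y):
    """log Z(J1 + tau_y J2) - log Z(J1) - log Z(tau_y J2): should -> 0 as y grows."""
    J2y = np.roll(J2, y)
    return logZ(J1 + J2y) - logZ(J1) - logZ(J2y)

# Nearby sources fail to factorize; well-separated sources factorize.
print(cluster_defect(5), cluster_defect(200))
```

The defect equals the cross term $-J_1^{T}\Delta\,\tau_y J_2$, which is of order one at small separation but exponentially small once the sources are far apart.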

Second, contrary to what Weinberg says in his book, it is possible to get covariance and cluster decomposition without a field-based approach. There is a big survey by Keister and Polyzou on the subject:

B.D. Keister and W.N. Polyzou,
Relativistic Hamiltonian Dynamics in Nuclear and Particle Physics,
in: Advances in Nuclear Physics, Volume 20,
(J. W. Negele and E.W. Vogt, eds.)
Plenum Press 1991.

(see also the entry ''Is there a multiparticle relativistic quantum mechanics?'' from Chapter B1 of my theoretical physics FAQ.)

This means that any argument you'd like to propose would imply that the multiparticle approach given in the above reference is equivalent to a field theory. This would be a quite nontrivial and interesting result. Good luck!

answered Mar 3, 2015 by (14,009 points)
edited Mar 3, 2015

@JiaYiyang: on p.169 of his Vol. 1, end of the second paragraph. He asserts this with the justification that ''such efforts have always run into troubles'', which was true early on but, as my answer shows, no longer by the time he published his book.

corrected; thanks!

@JiaYiyang: You may wish to add this to your list of errata to Weinberg's books.

@Arnold, I planned to do so the moment I saw your answer! But I need to do some minimal reading on the references you gave, perhaps tomorrow.

I just took a look at Keister and Polyzou; I'm a bit confused by their definition of cluster decomposition (or, as they call it, cluster separability or macroscopic causality):

"It assumes that observables associated with regions of spacetime that have a sufficiently large (as opposed to arbitrarily small) spacelike separation commute." (p. 17)

How is this not a consequence of microscopic causality?


This is difficult to tell as only asymptotic matrix elements survive in the conventional approach and the fields disappear (apart from the free fields used in Wick's theorem). At the present stage of knowledge, the renormalization limit is mathematically too ill-defined to say what the renormalized field is - but it is known that renormalized fields and bare fields live in different Hilbert spaces, so expressing one in terms of the other is strictly speaking impossible. Maybe upon further reflection I can find a plausible story to tell that is not too false; but this may take time....

Now it seems to me that the renormalized field should indeed be local; this is the assumption made in Wightman's system of axioms. (Though it is not clear whether 4D theories satisfy this, it is verified in various lower-dimensional theories.) Thus the conflict must be somewhere else. Need to think more....
