
  Why isn't the path integral rigorous?

+ 8 like - 0 dislike
15032 views

I've recently been reading Path Integrals and Quantum Processes by Mark Swanson; it's an excellent and pedagogical introduction to the Path Integral formulation. He derives the path integral and shows it to be: $$\int_{q_a}^{q_b} \mathcal{D}p\,\mathcal{D}q\,\exp\left\{\frac{i}{\hbar}\int_{t_a}^{t_b} \mathcal{L}(p, q)\,dt\right\}$$

This is clear to me. He then likens it to a discrete sum $$\sum\limits_{\text{paths}}\exp\left(\frac{iS}{\hbar}\right)$$ where $S$ is the action functional of a particular path.

Now, this is where I get confused. He claims that, because some of these paths are discontinuous or non-differentiable, and because these "un-mathematical"1 paths cannot be disregarded, the sum is not mathematically rigorous, and thus the transition amplitude described by the path integral is not rigorous either. Please correct me if I am wrong here.

Furthermore, he claims that this can be alleviated through the development of a suitable measure. There are two things that I don't understand about this. First, why isn't the integral rigorous? Though some of the paths might be difficult to handle mathematically, they aren't explicitly mentioned at all in the integral. Why isn't the answer that it spits out rigorous? And, second, why would a measure fix this problem?
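For reference, here is a sketch of the kind of time-slicing such a derivation has in mind (written in the Lagrangian form for a single coordinate, with the free-particle prefactor quoted from Feynman and Hibbs in one of the answers below; Swanson's phase-space version differs in detail): $$\langle q_b, t_b|q_a, t_a\rangle=\lim_{N\to\infty}\left(\frac{m}{2\pi i\hbar\epsilon}\right)^{N/2}\int\prod_{j=1}^{N-1}dq_j\,\exp\left\{\frac{i}{\hbar}\,\epsilon\sum_{j=0}^{N-1}\left[\frac{m}{2}\left(\frac{q_{j+1}-q_j}{\epsilon}\right)^{2}-V(q_j)\right]\right\},\qquad \epsilon=\frac{t_b-t_a}{N},\; q_0=q_a,\; q_N=q_b.$$ Each choice of the intermediate points $q_1,\dots,q_{N-1}$ is one "path", so the "sum over paths" really means this $N\to\infty$ limit of ordinary multiple integrals.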


1 Note: this is not the term he uses

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user jimmy
asked Mar 11, 2015 in Theoretical Physics by jimmy (40 points) [ no revision ]
Short answer: To define an integral rigorously, it's not enough to just say "and now take the limit $N \to \infty$". You need to prove that your discrete sum converges to something, and that it doesn't matter how you take the limit.

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user Javier
@Javier Badia does this have to do with non-differentiable paths or is it a separate issue?

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user jimmy
Can't one make everything work with proper regularization, and doesn't this regularization allow all the cases actually relevant to physics (as opposed to all possible mathematical corner cases)?

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user DanielSank
It's 'excellent and pedagogical' compared to what other presentation?

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user NikolajK
It is indeed folklore that the path integral is not mathematically rigorous, or more precisely, that the rigorous mathematics has not yet been developed. This is typical in physics. But the real problem is that many people do not know they are hand waving when they are doing it.

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user Jiang-min Zhang
@NikolajK Just in general. It is my first introduction to path integrals, but I am finding that the book is not too difficult to follow.

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user jimmy

See also this discussion.

3 Answers

+ 6 like - 0 dislike

There are several points:

  • The first is that for the usual self-adjoint Hamiltonians of the form $H=-\Delta +V(x)$, with a common densely defined domain (and I am being very pedantic here mathematically; you may just ignore that remark), the limit process is well defined and gives a meaning to the formal expression

    $\int_{q_a}^{q_b} \mathcal{D}p\,\mathcal{D}q\,\exp\left\{\frac{i}{\hbar}\int_{t_a}^{t_b} \mathcal{L}(p, q)\,dt\right\}$

    by means of the Trotter product formula and the corresponding limit of discrete sums. So the object is meaningful most of the time, as long as we see it as a limit. Nevertheless, it would be desirable to give a more direct mathematical interpretation as a true integral over paths. This would allow for generalizations and flexibility in its use.

  • It turns out that a suitable notion of measure on the space of paths can be given, using stochastic processes such as Brownian motion (there is a whole branch of probability theory that deals with such stochastic integration, built around the Itô integral). To relate this notion to our situation at hand there is, however, a necessary modification to make: the factor $-it$ in the quantum evolution has to be replaced by $-\tau$ (i.e. it is necessary to pass to "imaginary time"). This makes it possible to single out the correct Gaussian factors that now come from the free part of the Hamiltonian, and to recognize the correct Wiener measure on the space of paths. From a mathematical standpoint, the rotation back to real time is possible only in a few special situations; nevertheless this procedure gives a satisfying way to mathematically define Euclidean-time path integrals of quantum mechanics and field theory (at least the free ones, and also in some interacting cases). There are recent works of very renowned mathematicians in this context, most notably the work of the Fields Medalist Martin Hairer (see e.g. this paper and this one, or the recent work by A. Jaffe that gives an interesting overview; a more physical approach is given by Lorinczi, Gubinelli and Hiroshima among others).

  • The precise mathematical formulation of the path integral in QM is called the Feynman-Kac formula, and the precise statement is the following:

    Let $V$ be a real-valued function in $L^2(\mathbb{R}^3)+L^\infty(\mathbb{R}^3)$, $H=H_0+V$ where $H_0=-\Delta$ (the Laplacian). Then for any $f\in L^2(\mathbb{R}^3)$, for any $t\geq 0$: $$(e^{-tH}f)(x)=\int_\Omega f(\omega(t))e^{-\int_0^t V(\omega(s))ds}d\mu_x(\omega)\; ;$$ where $\Omega$ is the set of paths (with suitable endpoints, I don't want to give a rigorous definition), and $\mu_x$ is the corresponding Wiener measure w.r.t. $x\in\mathbb{R}^3$.
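To see the Feynman-Kac formula at work numerically, here is a minimal sketch (in one dimension rather than $\mathbb{R}^3$, with a made-up harmonic potential and initial function; nothing here is from the original post). Note that with the convention $H_0=-\Delta$ used above, the Brownian increments must have variance $2\,\Delta t$:

```python
import numpy as np
from scipy.linalg import expm

# Feynman-Kac check in 1D: compare (e^{-tH} f)(x0) with
# E_x0[ f(W_t) * exp(-int_0^t V(W_s) ds) ].
# With H = H0 + V and H0 = -d^2/dx^2, the process W generated by -H0 is a
# Brownian motion whose increments have variance 2*dt (not dt).

V = lambda x: 0.5 * x**2        # example potential (assumption for illustration)
f = lambda x: np.exp(-x**2)     # example initial function
t, x0 = 0.5, 0.0

# Left-hand side: e^{-tH} f via a finite-difference discretization of H.
L, n = 8.0, 801
x = np.linspace(-L, L, n)
dx = x[1] - x[0]
lap = (np.diag(np.ones(n - 1), 1) - 2 * np.eye(n) + np.diag(np.ones(n - 1), -1)) / dx**2
H = -lap + np.diag(V(x))
lhs = np.interp(x0, x, expm(-t * H) @ f(x))

# Right-hand side: Monte Carlo average over discretized Wiener paths.
rng = np.random.default_rng(0)
n_paths, n_steps = 200_000, 200
dt = t / n_steps
w = np.full(n_paths, x0)
S = np.zeros(n_paths)           # accumulates int_0^t V(W_s) ds along each path
for _ in range(n_steps):
    S += V(w) * dt              # left-endpoint rule for the time integral
    w += np.sqrt(2 * dt) * rng.standard_normal(n_paths)
rhs = np.mean(f(w) * np.exp(-S))

print(lhs, rhs)                 # the two numbers should agree to within about a percent
```

The agreement (up to discretization and Monte Carlo error) is exactly the "limit of discrete sums" of the first bullet, carried out in imaginary time.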

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user yuggib
answered Mar 11, 2015 by yuggib (360 points) [ no revision ]
thanks, great answer

This post imported from StackExchange Physics at 2015-05-13 18:56 (UTC), posted by SE-user jimmy
+ 1 like - 0 dislike

The Lagrangian involves derivatives but the differentiable functions have measure zero in any useful definition of a measure on a function space (for example the Wiener measure), so they would integrate to zero. This makes the introduction of the approximate path integral mathematically dubious.

The second dubious point is that the sum is highly oscillatory and not absolutely convergent. Hence, unlike for a Riemann integral, the sum doesn't have a sensible limiting value as you remove the discretization. The sum depends on how one orders the pieces and (as is well known for conditionally convergent alternating series) can take any desired value depending on how you arrange the terms. Note that there is no mathematically natural ordering on the set of paths. Thus the recipe doesn't give a fixed result.
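As a toy illustration of this ordering dependence (a sketch using the alternating harmonic series and Riemann's rearrangement theorem, not the path integral itself):

```python
import math

# Rearranging a conditionally convergent series to hit an arbitrary target:
# greedily add positive terms 1, 1/3, 1/5, ... while below the target and
# negative terms -1/2, -1/4, ... while above it. The partial sums then
# converge to the target, so the "value" depends entirely on the ordering.
def rearranged_sum(target, n_terms=100_000):
    pos, neg, total = 1, 2, 0.0   # next odd / even denominators
    for _ in range(n_terms):
        if total < target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_sum(0.0), rearranged_sum(3.14159))  # ~0.0 and ~3.14159
print(math.log(2))  # the "natural" ordering 1 - 1/2 + 1/3 - ... gives log 2 instead
```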

Therefore the informal introduction is only an illustration - an informal extrapolation of what can be made well-defined in finite dimensions (by treating $i$ as a variable and analytically continuing from real $i$ to $i=\sqrt{-1}$). In quantum field theory it is just used as a (very useful) formal tool to be applied with caution, since it may give wrong results, and only practice teaches what caution means.

Making the path integral well-defined in the setting discussed requires far more mathematical background and has been rigorously achieved only for quadratic Lagrangians in any dimension, and in dimension $<4$ for some special classes of nonquadratic ones. The mathematically simplest case is in $d=2$, where the construction of the measure (for real $i$) and the analytic continuation to $i=\sqrt{-1}$ are rigorously derived - it takes a whole book (Glimm and Jaffe) to prepare for it.

answered May 17, 2015 by Arnold Neumaier (15,787 points) [ revision history ]
edited May 17, 2015 by Arnold Neumaier
Most voted comments

The derivative term $\dot{x}$ in the path integral does not mean that the function $x(t)$ is differentiable. It means something else, namely $\frac{x(t+\epsilon)-x(t)}{\epsilon}$ for the $\epsilon$ of your regularization. This quantity does not commute with $x(t)$ in the path integral, because $x(t+\epsilon)\dot{x} - x(t)\dot{x}$ is $\frac{(x(t+\epsilon)-x(t))^2}{\epsilon}$, which is a fluctuating quantity that averages to 1 as a distribution in the standard quadratic path integrals.
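A minimal numerical sketch of this statement in its Euclidean (Wiener-measure) form, with $m=\hbar=1$ so that $\langle(x(t+\epsilon)-x(t))^2\rangle=\epsilon$, while the naive "velocity" $(x(t+\epsilon)-x(t))/\epsilon$ blows up as $\epsilon\to 0$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretized Wiener increments (Euclidean action (1/2) * int xdot^2 dt, m = hbar = 1):
# dx is Gaussian with variance eps, so (dx)^2/eps averages to 1 no matter how small
# eps is, while |dx|/eps (the naive derivative) diverges like eps^{-1/2}.
for eps in [1e-1, 1e-2, 1e-3, 1e-4]:
    dx = np.sqrt(eps) * rng.standard_normal(1_000_000)
    print(eps,
          np.mean(dx**2 / eps),       # -> 1: the finite "commutator" discussed above
          np.mean(np.abs(dx) / eps))  # grows like eps^{-1/2}: typical paths are not differentiable
```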

This is not a problem in defining the path integral rigorously, because Feynman never expected that the typical $x(t)$ would be differentiable when writing down $\dot{x}$ in the path integral! He explicitly said it would be nondifferentiable, and he identified the non-differentiability as the path-integral origin of the Heisenberg commutation relation, whose imaginary time form I gave above.

There is never an expectation, when you write a path integral, that the differentiation operations are being applied to differentiable functions. When you are done with the continuum limit, the differentiation operators are applied in the distributional sense to distributions; the typical paths in the path integral are well defined as distributions in nearly all cases of interest. Further, the products of distributions in the path integral which have coinciding singularities are not interpreted as pure products either (these would not make sense), but rather as products in a regularization $\epsilon$, with subtractions which make them well defined in the limit of small $\epsilon$. The two ideas together, of operator products and distributional derivatives, give meaning to every term in the Lagrangian, without any a priori assumptions on the character of the path, other than that it can be sampled by Monte Carlo sampling (which is true numerically, and defines a probabilistic algorithm for making complete sense of the procedure).

The proper interpretation of the path integral is by a limiting procedure which introduces an $\epsilon$ and takes a limit at the end, i.e. a regularization and renormalization. This is also true in 0+1d quantum mechanics, as I tried to make clear using the noncommutativity of $\dot{x}$ and $x$. In 1d, there is no remaining problem with products in the path integral, other than this finite noncommutativity, and mathematicians have already defined a rigorous calculus, Ito calculus, for the imaginary time formulation of 0+1d. In field theory, there are also regularization issues with coinciding products, and the commutation relations expand to OPE relations, and you can't use the same calculus. This is also true for Levy field theory, where you replace the $\dot{x}$ term with the log of a Levy distribution between consecutive steps, to describe a particle making a Levy flight. This has a continuum limit also, but it is not described by Ito calculus, rather by a different calculus which has not been described by mathematicians.

The only issue with the path integral is how to take limits of statistical sampling for small lattice spacings, or relaxing any other regulator. The limit is universal for small $\epsilon$: it doesn't depend on the discretization, just as the limits of discrete difference equations are universal in the limit of small $\epsilon$ and give you the differential equations of calculus. Even in calculus, it is difficult to prove convergence of arbitrary discretizations; the standard proofs of existence/uniqueness of differential equations iterate the integral form of the equation and don't bound the convergence of Runge-Kutta schemes rigorously, as the iterated integral equation already takes a limit implicitly inside the integral, and the convergence proof is made easier. You can't use such tricks in quantum field theory, except when you formulate the theory as an SDE (which is what Hairer does, and he iterates an integral equation to find his solutions).

The formal mathematics for describing the limits of measures to continuum measures is complicated by the difficulties mathematicians have in making a natural measure-friendly set theory, as current set theory makes any discussion of measure a minor nightmare, because you have to constantly keep in mind which sets are measurable and which are not when using all the classical theorems. This is not acceptable, and the better solution is to work in a universe where every subset of [0,1] is automatically measurable, and to keep in mind that there are exceptions to the Hahn-Banach theorem, to the basis-existence theorem, and to the prime ideal theorem in cases where there is an uncountable choice required in intermediate steps. Since all the actual mathematical applications require only countably many consecutive choices, the exceptions are isolated from mathematical practice by a brick wall, and measure paradoxes hardly ever bother probabilists. When they have a construction which they prove is true for all real numbers, they will lift it to "the values" of random variables without worrying.

@RonMaimon The content of my previous comment was rather different from what you say. You are thinking about very elaborate constructions, which may be interesting nonetheless; I am talking about a very simple observation.

The Solovay model and the ZFC one are, concerning some assertions, in contradiction; in addition, some theorems of ZFC are not provable (or maybe false) in the Solovay model. This is a fact, and cannot be disputed. Another fact is that physical descriptions of the world do not use only probability theory and path integrals, but also very different mathematical constructions (and there is an equivalent, from a physicist's point of view, formulation of QM and QFT that does not use path integrals, so one may argue whether that point of view is an unavoidable physical requirement, or just a convenient tool). I strongly believe that it would be very desirable to be able to describe the physical world by means of a single coherent mathematical model, and not with two (or many, or infinitely many) that are in contradiction, where one has to choose the appropriate one each time to prove a result (in a setting where in the meantime other results are false).

Now, if you believe that the correct model should have all sets of reals measurable and so on, it is your choice, and of course you have to be able to develop your theory in a consistent fashion and provide the necessary results to describe the "world". Mathematicians do that using ZFC, and so are in contradiction with you, but not with themselves. And I am sure that many results of the probability theory that is done by mathematicians around the world using ZFC are correct, and take into account the difficulties implied by the axiom of choice.

And if you prove a theorem in a model, that theorem is true, and that is by the very definition of proving a theorem. Given ZFC and the Solovay model (or any other model you want), it is up to physicists to choose the one that has the theorems better suited for their purposes, but they have to choose one nonetheless, for having multiple theories to describe one world is logically contradictory (and for human ontology, as you call it, unacceptable). So my suggestion is: arguing about philosophy is very interesting as a hobby (and I enjoy discussing with you), but if you would like your point of view to become a valid mathematical alternative you should make axioms and prove new theorems (within your framework, and that means you have to redo most of the proofs). If the proofs are mathematically acceptable, those theorems by themselves would be of interest in mathematics. To make the physicists (mathematical physicists at least, who use mathematical rigor) shift from the model of ZFC to your model, you have to prove a huge number of theorems, namely all those that they use in physical modelling. Otherwise, they will stay with the well-established and very powerful ZFC, for many, many results are already available there.

@ArnoldNeumaier: Okay, fair enough, that's how you feel, but you must remember that thousands of physicists didn't learn the ropes and they end up producing results of more value than the mathematicians that did. Those thousands of mathematicians are proving extremely primitive theorems compared to the state-of-the-art knowledge in physics, and their methods are unreadable and arcane to anyone who doesn't learn their lore. I don't accept a situation where a good intuitive argument is used by thousands of people, is logically consistent within its own universe, but is not considered rigorous because of the conventional universe. I insist that the convention must change to allow the intuitive argument to go through, and I will write whatever I need to make it happen if I can find it within my limited strength.

That's exactly why there are two distinct theoretical communities in physics - the theoretical physicists and the mathematical physicists. 

The physicists you mention produce valuable results at the level of theoretical physics but only conjectures at the level of mathematical physics. I have nothing against doing theoretical physics in a formal, nonrigorous manner, but it shouldn't be called rigorous when it isn't.

Conventions are a matter of social agreement, hence involve many people, not just one who is discontent with the existing conventions. If you want conventions to change you must write enough papers and books that demonstrate to the satisfaction of others in your target community that what you do is fully rigorous on the basis of the Solovay model. As mentioned already, it will probably mean 1000 pages rather than 20 paragraphs of handwaving. It is only this sort of insistence that changes conventions (and even then only in the long run) - continually writing papers that drive home the point, in a convincing way.

@ArnoldNeumaier: I do not accept that mathematical physics can operate with the current framework of probability. You simply have no idea how simple probability becomes in a universe where every subset of R is measurable, and once you see it, even a little, you can't go back. It's like unlearning calculus and going back to Riemann sums. The constructions in a measurable universe bear more of a relation to the arguments of theoretical physicists than to the arguments of mathematical physicists. It's the mathematical physicists who are doing things all wrong in this case, not the theoretical physicists.

Many mathematicians, perhaps a large minority, perhaps a majority, are already dissatisfied with the inability to speak intuitively about probability in ZFC. This is reflected in the set-theoretic paradoxes I linked, which you haven't read - these are all conflicts with intuitive probability (for example, here: https://cornellmath.wordpress.com/2007/09/13/the-axiom-of-choice-is-wrong/ ; related articles are "Axiom of Choice and predicting the future", and a linked MathOverflow question regarding an analogous, even more counterintuitive issue with mathematicians unjustifiably guessing one of countably many real numbers in boxes put in front of them). These probability paradoxes are the main intuition conflict that led to issues with choice. You can read grumbling about it all over the literature. Conway complains about non-measurable sets in his book on Real Analysis. Connes complains about the fiction of non-measurable sets in his book about noncommutative geometry. Lebesgue complained about them from the 1930s until he died, and Cohen's forcing is accompanied by clear intuition straight from intuitive probability. Solovay's paper by itself was nearly enough to completely change the consensus in 1972; there was a (small) movement to do it in the 1970s, but it didn't gather momentum. And Solovay's paper did not run to 1000 pages; it was a dozen or two relatively difficult pages (but they've been internalized and digested now, so that they're easy).

Why didn't the "revolution" run to completion? Why haven't all the measure theory books been rewritten so that every set is measurable? I believe for two reasons - first, Solovay politically sold out a little, and explained in his introduction that "of course, the axiom of choice is true", and then gave some pieties to support conventional wisdom. He was not seeking a clean break with set-theoretic measure theory, like Lebesgue, Connes, Conway, and everyone else who knows intuitive probability. He was motivated simply by the desire to find a nifty model of ZF+DC, not by the desire to give a better model of "mathematical reality".

The other reason is that Solovay's model is not exactly a model of naive probability either. Sure, every subset of R is measurable, and the foundations of real analysis and functional analysis go through, so you can pick countably many numbers in [0,1], pick countable sequences of random variables and lift results from realizations, so that you can do the ordinary stuff physicists do for a path integral; but there are also intuitions that fail. Uncountable ordinals don't embed in R; the theory is still ZF underneath, so the intuition from intuitive probability that R is enormously big, bigger than any ordinal, is not preserved in Solovay's model - it can't be in any model of ZF.

For this same reason, preserving the powerset axiom, his construction is extremely complicated to follow: it requires collapsing a whole universe to consist entirely of countable sets (this is a Levy collapse of an inaccessible to omega-1; it was explained to me recently what this collapse does), and this required a large-cardinal hypothesis. The large cardinal is overkill for intuitive probability; you don't need it. You need it to make the powerset axiom true, because now the powersets and the uncountable ordinals are totally separate sequences (and you need powersets for the ordinals, and so on). This is the real headache - staying consistent with powerset.

The results of mathematical physicists are unacceptably weak, and unacceptably obfuscated. I will repeat: there is nothing wrong with intuitive probability, it is all just fear mongering. There is nothing really wrong with the axiom of choice either. It's the axiom of powerset that is causing difficulty, and this axiom is used to define R as a set, and to define functions, and so on. The whole point of redoing the foundations is to make R a proper class, as advocated recently by Nik Weaver in his forcing book.

Most recent comments

@RonMaimon: The Solovay model allows you to say it this way.

But to call it rigorous requires that someone has first reliably (and with the agreement of the mathematical community) checked that the thousands of other little arguments needed to build a complete theory of stochastic processes are indeed (as you claim without pointing to a proof) valid in the Solovay model. Until someone writes a book on stochastic processes where this is carefully done, none of your above arguments is mathematically rigorous.

Moreover, your arguments do not apply to constructions where the operations are truly singular - then you need to analyze the singularity even in the Solovay model. But quantum field theory is full of singularities, and it is these that cause the trouble, not the lack of measurability.

By the way, you consistently write Weiner in place of Wiener - please edit this in your posts to this thread.

@ArnoldNeumaier: Without the nonmeasurable sets, there just aren't thousands of little arguments to do regarding any ordinary probability stuff; after the set-theoretic headaches are resolved, there's nothing difficult left. The construction of white noise, or of free fields, or of models which are defined by a precise Nicolai map, or by a stochastic equation requiring no renormalization, or continuous limits of Levy variables (this is not done yet AFAIK), or defining Ising model measures, or whatever (always excluding renormalization), is as straightforward as in elementary calculus. This simplification of ideas in intuitive probability is precisely why they put nonmeasurable sets on page one of any rigorous probability book; it's there to scare students away from intuitive probability, as mathematicians were scared in the 1920s, so that they think intuitive probability is logically inconsistent (false), just because it is inconsistent within ZFC (true).

Using the intuitive arguments, elementary probability is a piece of cake, which is why physicists were able to relatively easily and correctly develop sophisticated path-integral constructions of interest to pure mathematicians, usually without ever taking a single course in measure theory. It's not because the physicists are using "heuristics" or "imprecise constructions"; it's because the path integral belongs to intuitive probability, which is perfectly logically consistent, and mathematicians can't handle intuitive probability. The Axiom of Choice was just a gift from pure mathematics to the physics department; it means the mathematicians have to play catch-up for nearly 100 years.

The intuitive arguments in probability go like this: for example, to "prove" that all subsets of [0,1] are measurable, consider a set S in [0,1] and sample [0,1] countably many times. For each sample, determine if it is in your set or not, and define the measure to be the limit of the number of generated random picks that land in S over the total number of picks. This argument is obvious, it works intuitively, and one feels that it "should" work as a rigorous argument in some way, but it can't be made rigorous in ZFC. It is circular in ZFC, because the definition of random variables is through measure theory, and the end result of translating the argument just becomes a justification for using the word "measure" as "probability". This non-argument then only works to show that measurable sets have the property that random variables land in them in proportion to their measure, or rather, more precisely (since only measurable sets can be "landed in", so the concept of "landing in" is not precise in ZFC), that the probability of the event of a random variable being an element of a measurable set S is equal to the measure of S. But this random-picking argument is not circular (or at least not obviously circular) when the definition of a random pick is by random forcing, and you have some sensible external framework in which to speak about adjoining new elements to R as you randomly generate them, and a good characterization of how to assign them membership to sets predefined by their generative properties (so that you understand what the "same interval" means in different models - this is an essential part of Solovay's construction).
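For what it is worth, here is that sampling procedure written out numerically for a set that is measurable, so the frequency can be compared with the Lebesgue measure (a made-up example; the entire discussion is about what happens when the same recipe is applied to an arbitrary subset of [0,1]):

```python
import numpy as np

rng = np.random.default_rng(2)

# "Measure by sampling": draw uniform points in [0,1] and count the fraction that lands in S.
# Here S is a union of two intervals, of Lebesgue measure 0.15 + 0.30 = 0.45.
in_S = lambda u: ((0.1 <= u) & (u < 0.25)) | ((0.6 <= u) & (u < 0.9))

u = rng.random(2_000_000)
print(np.mean(in_S(u)))   # -> approximately 0.45
```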

Continuing in this manner, if you have any way of taking countably many random variables (in a sensible intuitive probability universe) and forming a convergent sequence of distributions, you define a measure on any set of distributions which includes the image of the random variables with probability 1. The measure of any set X is just the probability that the construction ends up in X, which can be defined by doing it again and again, and asking what the fraction of throws that land in X is. This type of construction obviously doesn't qualify as a rigorous argument when there are nonmeasurable sets around, but it is completely precise when there aren't.

Each separate construction is one simple argument which requires no effort to remember or reproduce. It's like when you develop calculus, you don't need a separate argument with estimates for Riemann sums for the integral of 1/(x^2+1) as opposed to 1/x. You use a unified formalism.

For me, I don't care whether a construction is certified by a community as rigorous. I call a construction rigorous when there is a reasonable formal system which corresponds to some model of a mathematical universe, an equiconsistency proof of that system with a well-accepted one or a reasonable reflection of it, and a sketch of a proof that the formalization would go through in this system. The system I have in mind is not really Solovay's, because Solovay's is complicated. But let's take Solovay's to start, because it works for this type of stuff: ZF + (dependent choice) + (Lebesgue measurability of every subset of R). Also, it exists for sure, and is known to be equiconsistent with a well-established, relatively intuitive large-cardinal hypothesis that is in no way controversial (possibly only because Grothendieck universes are not controversial now, maybe because people get large-cardinal hypotheses better, I don't know). I don't say this is the optimal model, because Solovay's model is complicated to construct, and complicated to prove stuff about, because of a silly issue in the construction regarding collapsing the inaccessible, which is only required to be consistent with the axiom of power set.

I prefer a completely different system, which I just made up, which replaces the axiom of powerset with a different, to me more intuitive, thing. The basic idea is to make all countable models of ZFC (normal ZFC, including powerset), and speak about the universe as a container set theory for all these countable models, where the powersets are all proper classes containing all the powersets of all the different models. Then the notion of measure is for the class, not for the sets (although there is a notion of measure on the sets also, as they are consistent with ZFC). The ambient theory has the axiom of choice, but it doesn't have an axiom of powerset, so it doesn't lead to measure paradoxes. It is just a theory of countable sets, so there is no issue with choice leading to nonmeasurable sets. The uncountability of R is the uncountability of R as a class, not as a set. I'll explain it in an answer to a self-answered question, but since I know it's not already out there, I want to run it past a logician who hates me first, because they might tear it apart correctly.

+ 0 like - 1 dislike

...He derives the path integral and shows it to be: $$\int_{q_a}^{q_b}\mathcal{D}p\,\mathcal{D}q\,\exp\left\{\frac{i}{\hbar}\int_{t_a}^{t_b} \mathcal{L}(p, q)\,dt\right\}$$

This is clear to me. He then likens it to a discrete sum $$\sum\limits_{\text{paths}}\exp\left(\frac{iS}{\hbar}\right)$$ where $S$ is the action functional of a particular path.

Now, this is where I get confused.

At this point I think it will be helpful to make an analogy with an ordinary Riemann integral (which gives the area under a curve).

The area $A$ under a curve $f(x)$ from $x=a$ to $x=b$ is approximately proportional to the sum $$ A\sim\sum_i f(x_i)\;, $$ where the $x_i$ are chosen to be spaced out from $a$ to $b$, say in intervals of $h$. The greater the number of $x_i$ we choose, the better the approximation we get. However, we have to introduce a "measure" to make the sum converge sensibly. In the case of the Riemann integral that measure is just $h$ itself: $$ A=\lim_{h\to 0}h\sum_i f(x_i)\;. $$
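A quick numerical sketch of this point (a made-up example: $f(x)=x^2$ on $[0,1]$, whose exact area is $1/3$): the bare sum $\sum_i f(x_i)$ grows without bound as $h\to 0$, while $h\sum_i f(x_i)$ converges.

```python
import numpy as np

f = lambda x: x**2                         # example integrand; exact area on [0,1] is 1/3

for h in [0.1, 0.01, 0.001]:
    x = np.arange(0.0, 1.0, h)             # sample points spaced by h
    print(h, f(x).sum(), h * f(x).sum())   # the bare sum blows up; h * sum -> 1/3
```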

In analogy, in the path integral theory of quantum mechanics, we have the kernel $K$ to go from $a$ to $b$ being proportional to the sum over paths $$ K\sim\sum\limits_{\text{paths}}\exp\left(\frac{iS_{\tt path}}{\hbar}\right) $$

In this case too, it makes no sense to just consider the sum alone, since it does not have a sensible limit as more and more paths are added. We need to introduce some measure to make the sum approach a sensible limit. We did this for the Riemann integral simply by multiplying by $h$. But there is no such simple process in general for the path integral, which involves a rather higher order of infinity in the number of paths to contend with...

To quote Feynman and Hibbs: "Unfortunately, to define such a normalizing factor seems to be a very difficult problem and we do not know how to do it in general terms." --Quantum Mechanics and Path Integrals, p. 33

In the case of a free particle in one dimension, Feynman and Hibbs show that the normalization factor is $$ \left(\frac{m}{2\pi i\hbar\epsilon}\right)^{N/2}\,, $$ where there are $N$ steps of size $\epsilon$ from $t_a$ to $t_b$, and $N-1$ integrations over the intermediate points between $x_a$ and $x_b$.
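One can check the analogous statement numerically in imaginary time, where the factor becomes $(m/2\pi\hbar\epsilon)^{N/2}$ and each step contributes a real Gaussian kernel: composing $N$ short-time kernels (i.e. carrying out the $N-1$ intermediate integrations on a grid) reproduces the exact free-particle kernel for the total time. A sketch with made-up parameters:

```python
import numpy as np

# Euclidean free particle: compose N short-time kernels
#   K_eps(x, y) = sqrt(m / (2*pi*hbar*eps)) * exp(-m*(x - y)**2 / (2*hbar*eps)),
# doing the N-1 intermediate integrations as matrix products on a grid, and
# compare with the exact kernel for the total time T = N*eps.
m = hbar = 1.0
T, N = 1.0, 50
eps = T / N

x = np.linspace(-10.0, 10.0, 1201)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")

K_eps = np.sqrt(m / (2 * np.pi * hbar * eps)) * np.exp(-m * (X - Y) ** 2 / (2 * hbar * eps))
M = K_eps * dx                                    # one time step, integration measure dx included
K_num = np.linalg.matrix_power(M, N) / dx         # N steps; divide out the one leftover dx

K_exact = np.sqrt(m / (2 * np.pi * hbar * T)) * np.exp(-m * (X - Y) ** 2 / (2 * hbar * T))

i, j = np.searchsorted(x, 0.0), np.searchsorted(x, 1.5)
print(K_num[i, j], K_exact[i, j])                 # should agree to several digits away from the grid edges
```

Without the $(m/2\pi\hbar\epsilon)^{1/2}$ factor per step, the composed integrals would tend to zero as $N\to\infty$; the prefactor is precisely the "measure" that keeps the limit finite.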

Again, quoting from Feynman and Hibbs regarding these normalization measures: "...we do know how to give the definition for all situations which so far seem to have practical value."

So, that should make you feel better...

This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user hft
answered Mar 11, 2015 by hft (-10 points) [ no revision ]

user contributions licensed under cc by-sa 3.0 with attribution required

Your rights
...