...He derives the path integral and shows it to be:
$$\int_{q_a}^{q_b}\mathcal{D}p\,\mathcal{D}q\,\exp\left\{\frac{i}{\hbar}\int_{t_a}^{t_b}
\mathcal{L}(p, q)\,dt\right\}$$
This is clear to me. He then likens it to a discrete sum
$$\sum_{\text{paths}}\exp\left(\frac{iS}{\hbar}\right)$$ where
$S$ is the action functional of a particular path.
Now, this is where I get confused.
At this point I think it will be helpful to make an analogy with an ordinary Riemann integral (which gives the area under a curve).
The area $A$ under a curve $f(x)$ from $x=a$ to $x=b$ is approximately proportional to the sum
$$
A\sim\sum_i f(x_i)\;,
$$
where the $x_i$ are spaced out from $a$ to $b$, say at intervals of $h$. The more $x_i$ we choose, the better the approximation. However, we have to introduce a "measure" to make the sum converge sensibly. In the case of the Riemann integral that measure is just $h$ itself:
$$
A=\lim_{h\to 0}h\sum_i f(x_i)\;,
$$
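The role of the measure $h$ is easy to check numerically. A minimal sketch (my own example, using $f(x)=x^2$ on $[0,1]$, which is not from the text):

```python
# The measure h is what makes the Riemann sum converge.
# Approximate the area under f(x) = x**2 on [0, 1]; the exact value is 1/3.
def riemann_area(f, a, b, n):
    h = (b - a) / n                      # the "measure" h
    return h * sum(f(a + i * h) for i in range(n))

for n in (10, 100, 1000):
    print(n, riemann_area(lambda x: x * x, 0.0, 1.0, n))
# The bare sum sum(f(x_i)) grows without bound as n increases;
# multiplying by h makes it approach 1/3.
```

Without the factor of $h$, the bare sum $\sum_i f(x_i)$ simply grows as more points are added; with it, the sum settles down to the area.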
In analogy, in the path-integral formulation of quantum mechanics, the kernel $K$ to go from $a$ to $b$ is proportional to the sum over paths
$$
K\sim\sum_{\text{paths}}\exp\left(\frac{iS_{\text{path}}}{\hbar}\right)
$$
In this case, too, it makes no sense to consider the sum alone, since it does not have a sensible limit as more and more paths are added. We need to introduce some measure to make the sum approach a sensible limit. For the Riemann integral we did this simply by multiplying by $h$, but there is no such simple prescription in general for the path integral, which has a rather higher order of infinity of paths to contend with...
To quote Feynman and Hibbs:
"Unfortunately, to define such a normalizing factor seems to be a very difficult problem and we do not know how to do it in general terms."
--Quantum Mechanics and Path Integrals, p. 33
In the case of a free particle in one dimension, Feynman and Hibbs show that the normalization factor is
$$
\left(\frac{m}{2\pi i\hbar\epsilon}\right)^{N/2}\;,
$$
where there are $N$ steps of size $\epsilon$ from $t_a$ to $t_b$, and $N-1$ integrations over the intermediate points between $x_a$ and $x_b$.
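One can see this normalization doing its job numerically. The sketch below is my own illustration, not from Feynman and Hibbs: to avoid the oscillatory integrals I use the Wick-rotated (imaginary-time) version, $t\to -i\tau$, where each short-time kernel becomes a real Gaussian with normalization $(m/2\pi\hbar\epsilon)^{1/2}$ per step; units with $m=\hbar=1$ are assumed. Composing $N$ short-time kernels (i.e. performing the $N-1$ intermediate integrations, approximated here by matrix multiplication times the grid spacing) reproduces the exact free-particle kernel:

```python
import numpy as np

# Illustration (assumes units m = hbar = 1, imaginary time tau):
# compose N normalized short-time Gaussian kernels and compare with
# the exact Euclidean free-particle kernel.
m, hbar = 1.0, 1.0
tau, N = 1.0, 20            # total imaginary time, number of steps
eps = tau / N

x = np.linspace(-6.0, 6.0, 401)   # spatial grid
dx = x[1] - x[0]

# One-step kernel, carrying the (m / 2 pi hbar eps)^(1/2) normalization
norm = np.sqrt(m / (2.0 * np.pi * hbar * eps))
K_step = norm * np.exp(-m * (x[:, None] - x[None, :])**2 / (2.0 * hbar * eps))

# N steps = N - 1 integrations over intermediate points; each integral
# is approximated by a matrix product times the grid spacing dx
K = K_step
for _ in range(N - 1):
    K = K @ K_step * dx

# Exact Euclidean kernel from x_a = 0 to x_b, after total time tau
exact = np.sqrt(m / (2.0 * np.pi * hbar * tau)) * np.exp(-m * x**2 / (2.0 * hbar * tau))
row = K[:, np.argmin(np.abs(x))]  # column for x_a = 0
print(np.max(np.abs(row - exact)))  # small: the normalized sum converges
```

Dropping the `norm` factor makes the composed kernel blow up or vanish as $N$ grows, which is precisely the "no sensible limit" problem the normalization cures.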
Again, quoting from Feynman and Hibbs regarding these normalization measures:
"...we do know how to give the definition for all situations which so far seem to have practical value."
So, that should make you feel better...
This post imported from StackExchange Physics at 2015-05-13 18:55 (UTC), posted by SE-user hft