Suppose we have normalized states $|n(\vec{R})\rangle$ indexed by a continuous parameter $\vec{R}$. Then, fixing our choice of gauge and ignoring the dynamical phase, the phase difference between two states is the Berry phase:
$$\langle n(\vec{R}_0)|n(\vec{R}_0+\Delta\vec{r})\rangle = e^{i\gamma},$$
where, if $C$ is some curve that goes from $\vec{R}_0$ to $\vec{R}_0+\Delta\vec{r}$,
$$\gamma = i\int_C \langle n(\vec{R})|\nabla_{\vec{R}}|n(\vec{R})\rangle \cdot d\vec{R}.$$
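For concreteness, here is a quick numerical sketch of this line-integral definition, using the usual spin-1/2-in-a-magnetic-field example (the `spin_up_state` parametrization, the loop at fixed $\theta$, and the discretization are just one illustrative choice, not part of the question itself):

```python
import numpy as np

def spin_up_state(theta, phi):
    """Spin-1/2 eigenstate aligned with the direction (theta, phi)."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

# Closed curve C: theta fixed, phi swept once from 0 to 2*pi.
theta = np.pi / 3
N = 2000
phis = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
states = np.array([spin_up_state(theta, p) for p in phis])

# gamma = i * integral_C <n|grad_R n> . dR, discretized as minus the
# accumulated phase of the overlaps between neighboring states on the curve.
overlaps = np.einsum('ij,ij->i', states.conj(), np.roll(states, -1, axis=0))
gamma = -np.sum(np.angle(overlaps))

# Analytic value for this loop: -pi * (1 - cos(theta)),
# i.e. minus half the solid angle swept out by the field direction.
print(gamma, -np.pi * (1 - np.cos(theta)))
```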
If $\Delta\vec{r}$ is small, then
$$\gamma \approx i\,\langle n(\vec{R}_0)|\nabla_{\vec{R}}|n(\vec{R})\rangle\big|_{\vec{R}=\vec{R}_0} \cdot \Delta\vec{r}.$$
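One step I'm using implicitly: because the states are normalized, $\langle n(\vec{R})|\nabla_{\vec{R}}|n(\vec{R})\rangle$ is purely imaginary, so the $\gamma$ defined above is real:
$$0 = \nabla_{\vec{R}}\langle n(\vec{R})|n(\vec{R})\rangle = \langle \nabla_{\vec{R}}\, n|n\rangle + \langle n|\nabla_{\vec{R}}\, n\rangle = 2\,\mathrm{Re}\,\langle n(\vec{R})|\nabla_{\vec{R}}|n(\vec{R})\rangle.$$
This is also what makes the exponential appearing in the expansion below a pure phase.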
However, we can directly calculate this as well:
$$\langle n(\vec{R}_0)|n(\vec{R}_0+\Delta\vec{r})\rangle \approx \langle n(\vec{R}_0)|n(\vec{R}_0)\rangle + \langle n(\vec{R}_0)|\nabla_{\vec{R}}|n(\vec{R})\rangle\big|_{\vec{R}=\vec{R}_0} \cdot \Delta\vec{r}$$
$$\approx 1 + \langle n(\vec{R}_0)|\nabla_{\vec{R}}|n(\vec{R})\rangle\big|_{\vec{R}=\vec{R}_0} \cdot \Delta\vec{r} \approx \exp\!\left[\langle n(\vec{R}_0)|\nabla_{\vec{R}}|n(\vec{R})\rangle\big|_{\vec{R}=\vec{R}_0} \cdot \Delta\vec{r}\right],$$
and therefore
$$\langle n(\vec{R}_0)|n(\vec{R}_0+\Delta\vec{r})\rangle \approx e^{-i\gamma}.$$
There's a minus sign now! What am I doing wrong here?
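In case the Taylor expansion itself is the problem, I also checked it numerically for the same spin-1/2 state as above (the derivative is a central finite difference; the particular state and displacement are again just an illustrative choice), and the phase of the exact overlap does match the connection term:

```python
import numpy as np

def spin_up_state(theta, phi):
    """Spin-1/2 eigenstate aligned with the direction (theta, phi)."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

theta0, phi0, dphi = np.pi / 3, 0.4, 1e-4   # small displacement along phi only
n0 = spin_up_state(theta0, phi0)

# Phase of the exact overlap <n(R0)|n(R0 + dr)>.
overlap_phase = np.angle(np.vdot(n0, spin_up_state(theta0, phi0 + dphi)))

# <n(R0)|grad_R|n(R)>|_{R=R0} . dr, with the phi-derivative taken by a
# central finite difference.
h = 1e-6
dn = (spin_up_state(theta0, phi0 + h) - spin_up_state(theta0, phi0 - h)) / (2 * h)
connection_term = np.vdot(n0, dn) * dphi

# The real part is ~0 (normalization), so exp[connection_term] is a pure
# phase; that phase, Im(connection_term), should equal overlap_phase.
print(overlap_phase, connection_term.imag)
```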
This post imported from StackExchange Physics at 2014-03-24 04:13 (UCT), posted by SE-user ChickenGod