Just to be clear: by "amplitude" you mean the amplitude of a classical electromagnetic wave -- that is, the peak value of the electric field -- right? In that case, the answer is that the amplitude goes down.
For definiteness, let's consider a wave packet of electromagnetic radiation with some fairly well-defined wavelength. At some early time, it has a wavelength $\lambda_1$ and energy $U_1$. (I'm not calling it $E$ because I want to reserve that for the electric field.) After the Universe has expanded for a while, it has a longer wavelength $\lambda_2$ and a smaller energy $U_2$. (Fine print: wavelengths and energies are measured by a comoving observer -- that is, one who's at rest in the natural coordinates to use.) In fact, the ratios are both just the factor by which the Universe has expanded:
$$
{\lambda_2\over\lambda_1}={U_1\over U_2}={a_2\over a_1}\equiv 1+z,
$$
where $a$ is the "scale factor" of the Universe. $1+z$ is the standard notation for this ratio, where $z$ is the redshift.
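If a numerical sanity check helps, here is a minimal Python sketch of that relation (the function name and the sample numbers are mine, purely illustrative, and not from the original post):

```python
def redshift_packet(lambda_1, U_1, a_1, a_2):
    """Scale a wave packet's wavelength and energy from scale factor a_1 to a_2."""
    stretch = a_2 / a_1            # this ratio is 1 + z
    lambda_2 = lambda_1 * stretch  # wavelength is stretched with the expansion
    U_2 = U_1 / stretch            # energy drops by the same factor
    return lambda_2, U_2

# Example: a 500 nm packet carrying 4e-19 J, after the Universe expands by a factor of 3 (z = 2)
print(redshift_packet(500e-9, 4e-19, 1.0, 3.0))  # -> (1.5e-06, ~1.33e-19)
```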
The physical extent of the wave packet is also stretched by the expansion, and not just along the direction the wave is traveling: all three dimensions get stretched by the factor $1+z$, so the packet's volume goes up by $(1+z)^3$. Combining that with the drop in energy, the energy density in the wave packet goes down by a factor $(1+z)^4$. (This is just the usual statement that the energy density of radiation scales as $1/a^4$.)
What does that mean about the amplitude of the wave? The energy density in the wave packet is proportional to the square of the electric field amplitude. So if the energy density has gone down by $(1+z)^4$, the electric field amplitude must have gone down by $(1+z)^2$.
Specifically, if the Universe doubles in size, the wavelength of any given wave packet doubles, and the amplitude (peak value of ${\bf E}$) drops by a factor of 4.
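To make the chain of factors explicit, here is a short Python sketch (again illustrative only; the function name is my own) that goes from the packet's energy and volume to the amplitude, using the fact that the energy density is proportional to $E^2$:

```python
import math

def amplitude_ratio(z):
    """Ratio E_2 / E_1 of the peak electric field after redshift z."""
    stretch = 1.0 + z                             # a_2 / a_1
    energy_ratio = 1.0 / stretch                  # U_2 / U_1
    volume_ratio = stretch ** 3                   # each dimension stretched by 1 + z
    density_ratio = energy_ratio / volume_ratio   # u_2 / u_1 = (1 + z)**-4
    return math.sqrt(density_ratio)               # u is proportional to E^2

# Universe doubles in size: z = 1, wavelength doubles, peak E falls to 1/4 of its old value
print(amplitude_ratio(1.0))  # -> 0.25
```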
This post imported from StackExchange Physics at 2014-04-01 16:44 (UCT), posted by SE-user Ted Bunn