the width $\sigma$ of a Gaussian distribution

\begin{equation}

p(x)\mathrm{d}x =

\frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{x^2}{2\sigma^2}\right)\mathrm{d}x

\end{equation}

can be obtained either by integrating the second moment, $\langle x^2\rangle = \int\mathrm{d}x\:p(x)x^2 = \sigma^2$, or by locating the inflection points of $p(x)$, i.e. by solving $\mathrm{d}^2p/\mathrm{d}x^2=0$, which yields $x=\pm\sigma$. Can you show that this property is unique to the Gaussian distribution?
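A quick numerical sanity check of the two routes, sketched in Python with an arbitrary test width $\sigma=1.7$ (the grid ranges are likewise chosen just for illustration), confirms that both the second moment and the inflection point recover $\sigma$:

```python
import numpy as np

sigma = 1.7  # arbitrary test width, chosen for illustration

# sample the normalised Gaussian on a wide, fine grid
x = np.linspace(-12 * sigma, 12 * sigma, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# route 1: the second moment <x^2> = integral of x^2 p(x) dx, equal to sigma^2
second_moment = np.sum(x**2 * p) * dx

# route 2: the second derivative p''(x) = p(x) (x^2/sigma^4 - 1/sigma^2)
# changes sign at the inflection point x = sigma on the positive axis
d2p = p * (x**2 / sigma**4 - 1 / sigma**2)
pos = x > 0
crossing = np.where(np.diff(np.sign(d2p[pos])) != 0)[0][0]
inflection = x[pos][crossing]

print(np.sqrt(second_moment), inflection)  # both approximately 1.7
```

The second derivative is taken analytically here; the sign-change search on the positive axis avoids having to resolve the zero exactly on the grid.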

Mmh, CQW thinks this question is actually a tough one. Showing that both methods yield the variance $\sigma^2$ is straightforward, but the uniqueness of this property for the Gaussian can perhaps (and we'd really be interested in a better solution) be shown like this: the Gaussian can be written as the Fourier transform of its characteristic function, which is again a Gaussian,

\begin{equation}

p(t) = \exp(-t^2\sigma^2/2)

\end{equation}

such that the second derivative for finding the variance reads:

\begin{equation}

\frac{\mathrm{d}^2p}{\mathrm{d}x^2} =

-\frac{1}{2\pi}\int\mathrm{d}t\:t^2\exp(-t^2\sigma^2/2)\exp(\mathrm{i}tx),

\end{equation}

which should vanish at $x=\pm\sigma$: this can be shown by completing the square in the exponent. Setting $x=\sigma$ and substituting $y=\sigma t-\mathrm{i}$ gives $t^2=(y^2+2\mathrm{i}y-1)/\sigma^2$; the odd term integrates to zero, so up to a nonzero prefactor the condition becomes

\begin{equation}

\int\mathrm{d}y\:(y^2-1)\exp(-y^2/2) = 0

\end{equation}

which is actually true for a Gaussian, because of the orthogonality of the Hermite polynomial $He_2(y) = y^2-1$ with $He_0(y) = 1$: the Hermite polynomials are constructed to be orthogonal with a Gaussian as the weighting function, and they are uniquely determined by the Gram-Schmidt procedure.
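As a cross-check, NumPy's `numpy.polynomial.hermite_e` module implements exactly these probabilists' Hermite polynomials $He_n$, together with the matching Gauss quadrature rule for the weight $\exp(-y^2/2)$, so the orthogonality relation used above is easy to verify numerically:

```python
import numpy as np
from numpy.polynomial.hermite_e import HermiteE, hermegauss

# probabilists' Hermite polynomials, expressed in the He basis
He0 = HermiteE([1])        # He_0(y) = 1
He2 = HermiteE([0, 0, 1])  # He_2(y) = y^2 - 1

# Gauss-HermiteE quadrature integrates f(y) exp(-y^2/2) dy exactly
# for polynomial f up to degree 2*deg - 1, so deg = 5 is more than enough
y, w = hermegauss(5)
inner = np.sum(w * He2(y) * He0(y))
print(inner)  # approximately 0: He_2 is orthogonal to He_0 under the Gaussian weight
```

Since the quadrature weights absorb the Gaussian factor, the vanishing of this sum is precisely the statement $\int\mathrm{d}y\:(y^2-1)\exp(-y^2/2)=0$.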