The easiest approach is to look at characteristic functions and cumulants: for a random variable T with PDF p(t) we write T ~ p and define
φ_T[f] = ∫ dt p(t) exp(-2πift) = ⟨ exp(-2πifT) ⟩
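As a concrete check of this definition, we can estimate φ_T[f] by Monte Carlo and compare against a closed form. This is just a sketch; the Exponential(1) distribution, sample size, and test frequencies below are illustrative choices of mine, not from the argument:

```python
import numpy as np

rng = np.random.default_rng(0)

def char_fn(samples, f):
    """Monte Carlo estimate of phi_T[f] = <exp(-2*pi*i*f*T)>."""
    return np.exp(-2j * np.pi * f * samples).mean()

# T ~ Exponential(1), whose closed-form transform is phi[f] = 1 / (1 + 2*pi*i*f)
t = rng.exponential(1.0, size=200_000)
for f in [0.0, 0.1, 0.5]:
    est = char_fn(t, f)
    exact = 1.0 / (1.0 + 2j * np.pi * f)
    print(f"f={f}: estimate {est:.4f}  exact {exact:.4f}")
```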
If two variables X ~ r and Y ~ s are independent then you can prove [from the factorization ⟨g(X) h(Y)⟩ = ⟨g(X)⟩ ⟨h(Y)⟩, or from the convolution X+Y ~ q where q(z) = ∫ dx r(x) s(z − x)] that their sum has a characteristic function which is the product
φ_{X+Y}[f] = φ_X[f] φ_Y[f]
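A quick numerical sanity check of this product rule (again a sketch of mine; X exponential and Y uniform are arbitrary, any two independent distributions would do):

```python
import numpy as np

rng = np.random.default_rng(1)

def char_fn(samples, f):
    """Monte Carlo estimate of phi[f] = <exp(-2*pi*i*f*X)>."""
    return np.exp(-2j * np.pi * f * samples).mean()

x = rng.exponential(1.0, size=500_000)   # X ~ r
y = rng.uniform(0.0, 1.0, size=500_000)  # Y ~ s, drawn independently of X

f = 0.3
lhs = char_fn(x + y, f)              # phi_{X+Y}[f]
rhs = char_fn(x, f) * char_fn(y, f)  # phi_X[f] * phi_Y[f]
print(f"phi_(X+Y) {lhs:.4f}  phi_X * phi_Y {rhs:.4f}")  # agree to MC error
```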
And therefore the “sample mean” M = (X₁ + … + Xₙ)/n of n IID variables, each with characteristic function φ, is itself a random variable with characteristic function
φ_M[f] = ( φ[f/n] )^n.
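To see this identity numerically (an illustrative sketch of mine, using Exponential(1) draws so that φ[g] = 1/(1 + 2πig) is available in closed form):

```python
import numpy as np

rng = np.random.default_rng(2)

def char_fn(samples, f):
    """Monte Carlo estimate of phi[f] = <exp(-2*pi*i*f*X)>."""
    return np.exp(-2j * np.pi * f * samples).mean()

n = 10
draws = rng.exponential(1.0, size=(200_000, n))
m = draws.mean(axis=1)                         # sample mean M of n IID draws

f = 0.4
lhs = char_fn(m, f)                            # phi_M[f], estimated directly
rhs = (1.0 / (1.0 + 2j * np.pi * f / n)) ** n  # (phi[f/n])^n, closed form
print(f"phi_M[f] {lhs:.4f}  (phi[f/n])^n {rhs:.4f}")
```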
So we find that
log φ_M[f] = n log φ[f/n] ≈ 0 + i a f − b f²/n + O(f³/n²).
The coefficients of this expansion of the log of the characteristic function are (up to factors of 2π) the cumulants: a is set by the mean, b by the variance, and the constant term vanishes because φ[0] = 1. Each higher cumulant carries an extra power of 1/n, so for large n those terms shrink to zero, and log φ_M is, to first order in 1/n, quadratic in f; that is, the characteristic function is a Gaussian.
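This cumulant scaling can be watched directly in simulation (my own illustrative choice of Exponential(1), whose second and third cumulants are 1 and 2): the variance of M falls as 1/n while the third cumulant falls as 1/n², which is exactly why only the Gaussian part survives.

```python
import numpy as np

rng = np.random.default_rng(3)

# For M = mean of n IID draws, the k-th cumulant scales as kappa_k / n^(k-1),
# so the variance (k=2) survives at O(1/n) while skewness (k=3) dies at O(1/n^2).
for n in [1, 10, 100]:
    m = rng.exponential(1.0, size=(400_000, n)).mean(axis=1)
    c = m - m.mean()
    k2 = np.mean(c ** 2)   # second cumulant; exactly 1/n for Exponential(1)
    k3 = np.mean(c ** 3)   # third cumulant; exactly 2/n^2 for Exponential(1)
    print(f"n={n:4d}: kappa2={k2:.5f} (1/n={1/n:.5f})  "
          f"kappa3={k3:.6f} (2/n^2={2/n**2:.6f})")
```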
The characteristic function is the Fourier transform of the PDF, so an inverse Fourier transform recovers it:
p(t) = ∫ df φ_T[f] exp(2πift)
But the Fourier transform of a Gaussian is just a Gaussian, so the PDF of the sample mean M is, for large n, itself a Gaussian: this is the central limit theorem.
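To close the loop numerically (a sketch under my own choice of mean and variance), write down the Gaussian characteristic function in this convention, φ[f] = exp(−2πifμ − 2π²σ²f²), and invert it on a grid to recover the Gaussian PDF:

```python
import numpy as np

mu, sig = 1.0, 0.5  # illustrative mean and standard deviation

def phi(f):
    """Gaussian characteristic function under the exp(-2*pi*i*f*t) convention."""
    return np.exp(-2j * np.pi * f * mu - 2 * np.pi**2 * sig**2 * f**2)

# Inverse transform p(t) = integral df phi[f] exp(2*pi*i*f*t), as a Riemann sum;
# the integrand decays like exp(-2*pi^2*sig^2*f^2), so |f| <= 10 is ample.
f = np.linspace(-10.0, 10.0, 20_001)
df = f[1] - f[0]
for t in [0.0, 1.0, 2.0]:
    p = (phi(f) * np.exp(2j * np.pi * f * t)).sum().real * df
    exact = np.exp(-(t - mu) ** 2 / (2 * sig**2)) / (sig * np.sqrt(2 * np.pi))
    print(f"t={t}: inverted {p:.6f}  exact {exact:.6f}")
```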