Wanted: Gaussian Random Number Generator

der Mouse mouse at mcgill-vision.UUCP
Sun Oct 1 14:00:45 AEST 1989


In article <1820002 at hpsad.HP.COM>, cj at hpsad.HP.COM (Chris Johnson) writes:
> I'm looking for a random number routine which will generate numbers
> with a Gaussian distribution.  Rand(3C) and other such programs
> produce (semi) even distributions.

They also produce integers.
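
One way to get a floating-point generator out of such an integer one is
roughly the following -- just a sketch, assuming an ANSI-style rand()
and RAND_MAX from <stdlib.h>; check your own rand(3C)'s range:

	#include <stdlib.h>

	/* uniform deviate in [0,1), built on the integer rand() */
	double
	uniform(void)
	{
		return (double)rand() / ((double)RAND_MAX + 1.0);
	}

The sketches further down assume some such uniform().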

If you have a routine that produces floating-point random numbers in
the range [0,1), Knuth has an algorithm which produces nice Gaussian
numbers.  Here's an extract from the comment header for the Gaussian
random number routine from a local library:

 * Algorithm is the polar method, from Knuth _Art_Of_Computer_Programming_
 * Volume 2 (Seminumerical Algorithms), chapter 3 (random numbers).
 * This is Algorithm P in section 3.4.1.
 *
 * Algorithm P (Polar method for normal deviates)
 *  This algorithm calculates two independent normally distributed
 *  variables X1 and X2, given two independent uniformly distributed
 *  variables U1 and U2.
 *
 * P1. [Get uniform variables.]
 *	Generate U1 and U2.
 *	Set V1 <- 2U1 - 1, V2 <- 2U2 - 1.
 * P2. [Compute S.]
 *	Set S <- (V1*V1) + (V2*V2).
 * P3. [Is S >= 1?]
 *	If S >= 1, return to step P1.
 * P4. [Compute X1, X2.]
 *	Set X1 <- V1 * t, X2 <- V2 * t, where t is sqrt((-2 ln(S))/S).

The code itself is in assembly, which is why I'm not posting it.

There are two points that perhaps need comment.  One is that the numbers
are generated in pairs.  Knuth includes a proof that if the underlying
uniform generator produces independent numbers when called to generate
pairs this way, the resulting Gaussian numbers will be independent.
The other point is that the above is a theoretical algorithm.  In
particular, S==0 can occur with non-zero probability in an
implementation, while the algorithm as stated appears to assume that since
the probability of S being 0 is mathematically zero, the issue can be
ignored.  (Note that the computation of t involves dividing by S.)
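
In C, assuming a uniform() returning doubles in [0,1) as sketched
above, the routine might look roughly like this (my sketch, not the
library code; note that S==0 is rejected along with S>=1):

	#include <math.h>

	extern double uniform(void);	/* uniform [0,1) deviate, assumed */

	/*
	 * One N(0,1) deviate per call, via the polar method (Algorithm P).
	 * Deviates are generated in pairs; the second of each pair is
	 * saved and returned on the next call.
	 */
	double
	gaussian(void)
	{
		static int have_saved = 0;
		static double saved;
		double v1, v2, s, t;

		if (have_saved) {
			have_saved = 0;
			return saved;
		}
		do {
			v1 = 2.0 * uniform() - 1.0;	/* P1 */
			v2 = 2.0 * uniform() - 1.0;
			s = v1 * v1 + v2 * v2;		/* P2 */
		} while (s >= 1.0 || s == 0.0);		/* P3, plus the S==0 case */
		t = sqrt(-2.0 * log(s) / s);
		saved = v2 * t;				/* X2, kept for next call */
		have_saved = 1;
		return v1 * t;				/* X1 (P4) */
	}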

It also may be unacceptably expensive; a square root and a natural
logarithm are required for each pair of numbers.

Depending on what you want them for, you may be able to cheat and just
add together a handful of uniform numbers.  This produces near-Gaussian
distributions relatively cheaply.  (Adding N independent uniform [0,1]
numbers together produces a pseudo-Gaussian of mean N/2 and variance
N/12.  A reasonable choice of N is 12, thus automatically producing a
variance of 1.)  These pseudo-Gaussians are missing the long tails, but
for some applications (eg, adding Gaussian noise to a picture) the
tails don't matter much.
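
In code, again just a sketch on top of the same assumed uniform():
summing twelve [0,1) deviates gives mean 6 and variance 1, and
subtracting 6 centers the result at zero:

	extern double uniform(void);	/* uniform [0,1) deviate, assumed */

	/*
	 * Cheap pseudo-Gaussian: the sum of 12 uniform [0,1) deviates
	 * has mean 6 and variance 1; subtracting 6 centers it at zero.
	 * The tails are clipped at +-6.
	 */
	double
	cheap_gaussian(void)
	{
		double sum = 0.0;
		int i;

		for (i = 0; i < 12; i++)
			sum += uniform();
		return sum - 6.0;
	}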

					der Mouse

			old: mcgill-vision!mouse
			new: mouse at larry.mcrcim.mcgill.edu


