Double up Gaussian random numbers.

Gaussians & Eigenspace Splitting

Applying fast Walsh-Hadamard transform eigenspace splitting to a vector of random numbers from the Gaussian (Normal) distribution doubles the number of Gaussian variables, yet they still pass most tests for the Gaussian distribution.

A Hadamard (or any fixed linear) transform of a Gaussian vector is still Gaussian, so the two subvectors you get from splitting the transformed vector will (marginally and jointly) have Gaussian distributions and — for the usual iid zero-mean, equal-variance input — will pass the usual Gaussian tests in the limit of exact arithmetic and large samples.

1. Eigenspace splitting

You define:

u = (I + H)x

v = (I - H)x

where H is the normalized Hadamard transform matrix (H H^T = I).

  • H has eigenvalues +1 and -1.

  • I + H projects onto the +1 eigenspace (up to a factor of 2).

  • I - H projects onto the -1 eigenspace (also up to a factor of 2).

So u and v each live entirely in orthogonal subspaces of R^n.
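
A concrete sketch of the construction (assuming NumPy; the fwht helper below is a plain textbook fast Walsh-Hadamard transform, and the vector length is just an example):

```python
import numpy as np

def fwht(a):
    """Unnormalized fast Walsh-Hadamard transform; len(a) must be a power of 2."""
    a = np.asarray(a, dtype=float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                # butterfly: (a[j], a[j+h]) -> (a[j] + a[j+h], a[j] - a[j+h])
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

n = 256                       # power of 2 (illustrative size)
x = np.random.randn(n)        # x ~ N(0, I)
Hx = fwht(x) / np.sqrt(n)     # normalized transform, so H H^T = I

u = x + Hx                    # u = (I + H) x, lies in the +1 eigenspace of H
v = x - Hx                    # v = (I - H) x, lies in the -1 eigenspace of H
```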


2. Distribution of u and v

If x ~ N(0, I) (iid standard Gaussian entries):

  • Any linear transform of a Gaussian vector is still Gaussian.

  • (I + H) and (I - H) are both symmetric and, up to a factor of 2, orthogonal projectors:

    (I + H)(I - H) = I - H^2 = 0

    showing the images of u and v are orthogonal.
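
These identities are easy to verify numerically (a sketch using SciPy's scipy.linalg.hadamard; sizes are illustrative):

```python
import numpy as np
from scipy.linalg import hadamard

n = 16
H = hadamard(n) / np.sqrt(n)              # normalized Hadamard matrix (symmetric)
I = np.eye(n)

print(np.allclose(H @ H.T, I))            # H H^T = I
print(np.allclose(H @ H, I))              # H^2 = I, since H is symmetric
print(np.allclose((I + H) @ (I - H), 0))  # (I + H)(I - H) = I - H^2 = 0
```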


3. Covariance structure

We can compute:

Cov(u) = (I + H) I (I + H)^T = (I + H)^2 = I + 2H + H^2 = 2(I + H)   (using H^2 = I)

and similarly:

Cov(v) = 2(I - H)

The cross-covariance is:

Cov(u, v) = (I + H) I (I - H)^T = (I + H)(I - H) = I - H^2 = 0

So u and v are uncorrelated.
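
The covariances are easy to confirm empirically (a sketch assuming NumPy/SciPy; the sample size is arbitrary):

```python
import numpy as np
from scipy.linalg import hadamard

n, trials = 16, 200_000
H = hadamard(n) / np.sqrt(n)
I = np.eye(n)

X = np.random.randn(trials, n)            # rows are independent draws of x ~ N(0, I)
U = X @ (I + H)                           # rows are u = (I + H) x  (H is symmetric)
V = X @ (I - H)                           # rows are v = (I - H) x

print(np.allclose(np.cov(U, rowvar=False), 2 * (I + H), atol=0.05))  # Cov(u) = 2(I + H)
print(np.allclose(np.cov(V, rowvar=False), 2 * (I - H), atol=0.05))  # Cov(v) = 2(I - H)
print(np.abs(U.T @ V / trials).max() < 0.05)                         # Cov(u, v) = 0
```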


4. Gaussian ⇒ independence

For jointly Gaussian vectors, uncorrelatedness ⇒ independence.
Thus, u and v are independent Gaussian vectors, each living in its own orthogonal subspace, with their own covariance matrices 2(I±H).

If you normalize appropriately (e.g. divide by 2), each has iid Gaussian coordinates in its own reduced-dimensional space.
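
A minimal sketch of that normalization (assuming NumPy/SciPy; the eigendecomposition here is only a convenient way to get an orthonormal basis of each eigenspace):

```python
import numpy as np
from scipy.linalg import hadamard

n = 64
H = hadamard(n) / np.sqrt(n)
eigvals, Q = np.linalg.eigh(H)        # columns of Q are an orthonormal eigenbasis of H
plus = eigvals > 0                    # the n/2 directions with eigenvalue +1

x = np.random.randn(n)
u = x + H @ x                         # u = (I + H) x

coords = Q.T @ u                      # u expressed in the eigenbasis of H
y = coords[plus] / 2                  # n/2 coordinates, iid N(0, 1) after dividing by 2
print(len(y), np.allclose(coords[~plus], 0))   # the -1 eigenspace coordinates are zero
```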


5. Passing Gaussian tests

Because u and v are exactly Gaussian and independent, each one individually:

  • Has the correct Gaussian marginal distribution.

  • Will pass virtually all Gaussianity tests (KS, Shapiro–Wilk, skewness/kurtosis tests), apart from the usual finite-sample false rejections.

  • Any deviations you see will come from numerical floating-point effects, not from the math.


💡 Subtle note:
If you look at u and v as full n-dimensional vectors in an eigenbasis of H, without removing the components that project to zero, those components are deterministically zero and obviously not Gaussian. You have to restrict each vector to its own eigenspace (dimension n/2 for the Hadamard case) before testing.
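
A hedged example of testing done this way, restricting to the +1 eigenspace coordinates before applying a goodness-of-fit test (scipy.stats.kstest; sample sizes are arbitrary):

```python
import numpy as np
from scipy.linalg import hadamard
from scipy.stats import kstest

n, trials = 64, 2000
H = hadamard(n) / np.sqrt(n)
eigvals, Q = np.linalg.eigh(H)
plus = eigvals > 0                          # +1 eigenspace of H

X = np.random.randn(trials, n)
U = X + X @ H                               # rows are u = (I + H) x
samples = (U @ Q)[:, plus].ravel() / 2      # keep only the n/2 live coordinates, unit variance

print(kstest(samples, 'norm'))              # large p-value: consistent with N(0, 1)
# Keeping the -1 eigenspace coordinates (identically ~0) would wreck any normality test.
```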


Possible uses


1. Communications

  • Noise-like signaling:
    You could transmit either u or v as “noise” over a channel. To a naive observer, it looks like Gaussian noise.
    But a receiver who knows the Hadamard matrix H can project onto the two eigenspaces and recover the bit “was this u or v?”.

  • This is essentially a spread-spectrum watermark: the eigenspace label acts as the hidden payload.
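
A toy sketch of that one-bit channel (the send/receive names and the energy-comparison detector are illustrative, not a fixed protocol from the post):

```python
import numpy as np
from scipy.linalg import hadamard

n = 256
H = hadamard(n) / np.sqrt(n)

def send(bit):
    """Emit Gaussian-looking noise whose eigenspace carries one hidden bit."""
    x = np.random.randn(n)
    return x + H @ x if bit else x - H @ x           # u for 1, v for 0

def receive(r):
    """Recover the bit by comparing the energy of r in the two eigenspaces of H."""
    plus_energy = r @ (r + H @ r) / 2                # squared length of the +1 eigenspace projection
    minus_energy = r @ (r - H @ r) / 2               # squared length of the -1 eigenspace projection
    return int(plus_energy > minus_energy)

bits = np.random.randint(0, 2, size=20)
print(np.array_equal([receive(send(b)) for b in bits], bits))   # True
```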


2. Steganography in noise

  • In simulations or media where Gaussian noise is expected, you can hide a binary message by choosing which eigenspace to draw the noise from.

  • Without knowledge of H and the exact projection scheme, the two noise sources are statistically indistinguishable in standard Gaussian tests.

  • This is a form of covert channel that survives most naive statistical filtering.


3. Keying / authentication

  • If you control the noise source, you can embed a “secret key” into it by modulating which eigenspace you sample from.

  • At the receiving end, applying H and checking the projection tells you whether the source is authentic.


4. Simulation experiments

  • In Monte Carlo methods, you could use u for one set of runs and v for another to guarantee orthogonality between datasets while keeping all marginal Gaussian properties.

  • This can help in variance reduction techniques where you want two streams that look identical in distribution but are guaranteed independent and orthogonal in a known way.
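
For example, two streams built from the same draw are exactly orthogonal, while each remains Gaussian within its own eigenspace (a small sketch assuming NumPy/SciPy):

```python
import numpy as np
from scipy.linalg import hadamard

n = 1024
H = hadamard(n) / np.sqrt(n)

x = np.random.randn(n)
u = x + H @ x                 # stream for run A
v = x - H @ x                 # stream for run B

# Exactly orthogonal by construction, yet each is Gaussian within its own eigenspace.
print(abs(u @ v))             # ~ 0 up to floating-point rounding
```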


5. Testing

  • Use the tainted variables to stress-test statistical procedures:

    • If your statistical model assumes Gaussian white noise, will it notice that the data comes from a restricted eigenspace?

    • Many machine learning models ignore covariance structure — this is a way to check their sensitivity.
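
A sketch of such a stress test (assuming NumPy/SciPy; the rank check is just one possible detector): a per-coordinate normality test is satisfied, but the sample covariance betrays the missing n/2 directions.

```python
import numpy as np
from scipy.linalg import hadamard
from scipy.stats import kstest

n, trials = 32, 5000
H = hadamard(n) / np.sqrt(n)
X = np.random.randn(trials, n)
U = X + X @ H                                   # every row confined to the +1 eigenspace

# A naive per-coordinate check sees nothing unusual once the variance is matched:
col = U[:, 0] / np.sqrt(2 * (1 + H[0, 0]))      # standardize by the true marginal std
print(kstest(col, 'norm').pvalue)               # typically well above 0.05

# A whiteness check on the joint structure exposes the restricted eigenspace:
C = np.cov(U, rowvar=False)
print(np.linalg.matrix_rank(C, tol=1e-6))       # about n/2, not n
```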




