On 11/24/2012 1:16 PM, Dan Asimov wrote:
Hmm, so I wonder what eigenvalue-sign frequencies would arise if instead of the uniform distribution on [-1,1], one used the standard normal, independently for p, q, r, s.
(Also: is there a simple explanation for why Edwin found almost identical frequencies for two as for zero positive eigenvalues?)
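A quick Monte Carlo sketch of the frequencies in question. The ensemble is an assumption on my part, since the earlier messages aren't reproduced here: a general real 2x2 matrix [[p, q], [r, s]] with the four entries drawn independently, counting positive real eigenvalues (if the matrices were meant to be symmetric, set r = q). The function names and sample size are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def positive_eig_freqs(sampler, n=1_000_000):
    """Frequencies of 0, 1, or 2 positive real eigenvalues among n random
    2x2 matrices [[p, q], [r, s]] with i.i.d. entries drawn by `sampler`."""
    m = sampler((n, 2, 2))               # each matrix gets its own p, q, r, s
    ev = np.linalg.eigvals(m)            # shape (n, 2); complex if a pair is
    pos = np.sum(np.isreal(ev) & (ev.real > 0), axis=1)
    return np.bincount(pos, minlength=3) / n

uniform = lambda size: rng.uniform(-1.0, 1.0, size)
normal = lambda size: rng.standard_normal(size)

print("uniform on [-1,1]:", positive_eig_freqs(uniform))
print("standard normal:  ", positive_eig_freqs(normal))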
Of course that's easy. If e1 and e2 are the eigenvalues of a matrix, then the matrix with all its entries sign-changed has eigenvalues -e1 and -e2, and under any measure that is symmetric under sign changes of the entries these two matrices are equally probable. So having 0 positive eigenvalues out of 2 is exactly as probable as having 2 out of 2.

What is puzzling is why these probabilities are not 1/4 each. The implication is that if one eigenvalue e1 > 0, then the other eigenvalue e2 is more likely to be < 0. But suppose you generate the random Hermitian matrices by choosing the eigenvalues and then a random unitary matrix to rotate them into a random basis. In that case you would clearly get the same measure for e1, e2 as for e1, -e2. I realize this is not a proof, because the rotation doesn't necessarily give you an Hermitian matrix, only a normal matrix. But I don't think that affects the conclusion that if you restrict the random matrices to the Hermitian ones you must still get equal measure for e1, e2 and e1, -e2.

It is not correct to argue that this condition fails just because the eigenvalues are not independent. It is only necessary that the *signs* be independent.

Brent
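For concreteness, a minimal sketch of the eigenvalues-then-rotation construction described above, in the real symmetric 2x2 case: choosing (e1, e2) or (e1, -e2) and then applying the same random rotation gives symmetric matrices with exactly those spectra. The eigenvalue law and the helper rotate() are illustrative choices, not taken from the earlier messages.

import numpy as np

rng = np.random.default_rng(1)

def rotate(e1, e2, theta):
    """R(theta) @ diag(e1, e2) @ R(theta).T -- symmetric, with eigenvalues e1, e2."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ np.diag([e1, e2]) @ R.T

# Pick the eigenvalues first (any law symmetric about 0 will do; the normal
# is just an illustrative choice), then rotate into a random basis.
e1, e2 = rng.standard_normal(2)
theta = rng.uniform(0.0, 2.0 * np.pi)

M = rotate(e1, e2, theta)        # built from (e1, e2)
M_flip = rotate(e1, -e2, theta)  # built from (e1, -e2)

print(np.allclose(M, M.T))                              # True: symmetric
print(np.linalg.eigvalsh(M), np.sort([e1, e2]))         # same spectrum
print(np.linalg.eigvalsh(M_flip), np.sort([e1, -e2]))   # sign of e2 flipped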