One note is that PCA gives you essentially the same results on a matrix as on its transpose: the nonzero singular values are identical either way. However, the running time can be very different in the two cases, because an eigendecomposition-based implementation of PCA on an MxN design matrix forms an NxN covariance matrix. If you have, say, 100M samples with 100 features each, the correct orientation produces a nice 100x100 matrix, but the wrong one produces a 100Mx100M matrix.
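A minimal sketch of both points in NumPy (the matrix sizes here are illustrative, not from the original text): the singular values of a centered matrix and its transpose match, while the covariance matrix you would form has a side length equal to the number of columns you happen to pass in.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall design matrix: 1000 samples (rows), 10 features (columns).
X = rng.standard_normal((1000, 10))
Xc = X - X.mean(axis=0)  # center before PCA

# The nonzero singular values of X and X.T are identical,
# so PCA "sees" the same spectrum either way.
s1 = np.linalg.svd(Xc, compute_uv=False)
s2 = np.linalg.svd(Xc.T, compute_uv=False)
assert np.allclose(s1, s2[: len(s1)])

# An eigendecomposition-based PCA forms a square matrix whose side
# is the number of columns: 10x10 here, but 1000x1000 if you pass
# the transpose by mistake.
cov_right = Xc.T @ Xc   # 10 x 10     -- cheap
cov_wrong = Xc @ Xc.T   # 1000 x 1000 -- same nonzero spectrum, far more work
print(cov_right.shape, cov_wrong.shape)
```

With 100M rows the "wrong" matrix would not even fit in memory, which is why the orientation matters long before the eigensolver runs.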
Another trick for datasets with high dimensionality is to randomly project the data to a lower-dimensional space using random Gaussian vectors. By the Johnson-Lindenstrauss lemma, such a projection approximately preserves pairwise distances, so the projected dataset has statistically similar properties under PCA.
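A sketch of the random-projection trick, again with illustrative dimensions of my own choosing: project a 10,000-dimensional dataset down to 256 dimensions with a scaled Gaussian matrix and compare the leading singular values before and after. They agree only approximately, with distortion on the order of 1/sqrt(k).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic high-dimensional data with a few dominant directions.
n, d, k = 500, 10_000, 256
X = rng.standard_normal((n, 5)) @ rng.standard_normal((5, d))
X += 0.01 * rng.standard_normal((n, d))

# Random Gaussian projection to k dimensions; the 1/sqrt(k) scaling
# preserves squared norms in expectation (Johnson-Lindenstrauss).
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

# The leading singular values -- and hence the energy of the principal
# subspace -- are approximately preserved after projection.
s_full = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)[:5]
s_proj = np.linalg.svd(Y - Y.mean(axis=0), compute_uv=False)[:5]
print(np.round(s_full, 1))
print(np.round(s_proj, 1))
```

The payoff is that the SVD now runs on an n x k matrix instead of n x d, which is a large saving when d is in the tens of thousands.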