Up to now, when running pyPCGA two or more times with the same parameters, the results are slightly different each time.
I attribute this to the low-rank approximation of the covariance matrix relying on scipy.sparse.linalg.eigsh: because the initial vector v0 is not provided, it is chosen randomly.
The solution would be to give the user the option to set a seed (aka random_state) used to generate a reproducible v0.
See: https://stackoverflow.com/a/52403508
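A minimal sketch of the idea, assuming the user-supplied seed is only used to draw the starting vector v0 (the function name and the covariance matrix Q are illustrative, not pyPCGA's actual API):

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def reproducible_eigsh(Q, k, seed=None):
    """Compute the k largest eigenpairs of a symmetric matrix Q.

    If a seed is given, the starting vector v0 is drawn from a seeded RNG,
    which makes the Lanczos iteration, and hence the low-rank
    approximation, reproducible across runs.
    """
    v0 = None
    if seed is not None:
        rng = np.random.RandomState(seed)
        v0 = rng.rand(Q.shape[0])
    return eigsh(Q, k=k, which='LM', v0=v0)
```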
Thank you Antoine, and you are right that the low-rank approximation will not give users unique vectors. I believe I implemented it with an oversampling parameter (i.e., the number of eigenvectors computed is k + p, where p is an oversampling parameter, so that later we keep only the k eigenmodes; this technique is commonly used in randomized low-rank approximation) so that users see less variability in the results, but I am not completely sure. A user-specified random seed would be a great option for reproducible results. I will take a look at it and merge your PR. Happy holidays!
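For reference, a hedged sketch of the oversampling idea described above (k, p, Q and the function name are illustrative, not pyPCGA's actual implementation): compute k + p eigenpairs, then retain only the k largest.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def oversampled_eigsh(Q, k, p=5, seed=None):
    """Compute k + p eigenpairs of Q, then keep only the k largest.

    Oversampling by p modes is a common trick in randomized low-rank
    approximation to reduce variability in the retained eigenmodes.
    """
    v0 = None
    if seed is not None:
        v0 = np.random.RandomState(seed).rand(Q.shape[0])
    vals, vecs = eigsh(Q, k=k + p, which='LM', v0=v0)
    # eigsh returns eigenvalues in ascending order; keep the k largest.
    idx = np.argsort(vals)[::-1][:k]
    return vals[idx], vecs[:, idx]
```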