I'm using classical PCA from scikit-learn, followed by a LogisticRegression classifier, on a dataframe of 22 000 rows × 45 000 columns. The data are scaled. Both steps started having convergence issues after I upgraded Python from 3.6 to 3.7 and scikit-learn from 0.19 to 0.23. The PCA step now fails with:

raise LinAlgError("SVD did not converge") LinAlgError: SVD did not converge

There are no NaN values in my dataframe, so I wonder whether it could be a memory issue instead.
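
This is roughly the sanity check I ran (assuming X_main is a pandas DataFrame of floats):

import numpy as np

# quick sanity check: no missing and no infinite values in the data
print(X_main.isna().sum().sum())             # prints 0 for me
print(np.isfinite(X_main.to_numpy()).all())  # prints True for me

The PCA step itself is: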

import pandas as pd
from sklearn.decomposition import PCA

# keep one component per five samples (22 000 / 5 = 4 400 components)
n_components = int(X_main.shape[0] / 5)

pca = PCA(n_components=n_components)
principalComponents = pca.fit_transform(X_main)   # <- this is where the LinAlgError is raised

principalDf = pd.DataFrame(data=principalComponents)
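
The reduced data then go into LogisticRegression, roughly like this (simplified; y stands for my label vector, which is not shown above, and max_iter=1000 is only an example value, not my exact configuration):

from sklearn.linear_model import LogisticRegression

# y is my label vector (not shown above); max_iter is an illustrative value
clf = LogisticRegression(max_iter=1000)
clf.fit(principalDf, y)   # this step also reports convergence warnings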

I changed nothing else in my code. I worked around the error by changing some hyper-parameters (the kind of change is sketched below), but I wonder why it appears at all: when I roll back to the older versions, the convergence error disappears.
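
For example, the kind of change I mean is forcing the approximate randomized SVD solver instead of the full LAPACK SVD (illustrative only, not necessarily the exact change that fixed it for me):

# illustrative only: use the randomized solver rather than a full SVD
pca = PCA(n_components=n_components, svd_solver='randomized', random_state=0)
principalComponents = pca.fit_transform(X_main)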

I don't know why this happens. Does the newer version add a convergence check or warning, does it compute the SVD differently, or is there another reason?

Thanks for your advice :)

w_fabien
  • Please add a snippet of the code that you are running – skibee Oct 25 '20 at 21:03
  • The answers here provide some other reasons this error is raised: https://stackoverflow.com/questions/21827594/raise-linalgerrorsvd-did-not-converge-linalgerror-svd-did-not-converge-in-m – JAV Oct 26 '20 at 00:42
  • Does this answer your question? [raise LinAlgError("SVD did not converge") LinAlgError: SVD did not converge in matplotlib pca determination](https://stackoverflow.com/questions/21827594/raise-linalgerrorsvd-did-not-converge-linalgerror-svd-did-not-converge-in-m) – JAV Oct 26 '20 at 00:42
  • yep I've read that. I'm thinking there is a memory error, but I don't understand why it is raised when I upgrade my libraries and disappears when I roll back – w_fabien Oct 28 '20 at 08:04
