

We know that PCA is used to remove redundant or linearly dependent features/dimensions (e.g. km and inch features) from the original data set. Furthermore, the eigenvalues, in descending order, act as weights, telling us which new feature/dimension in the new orthogonal space is the most/least important.
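For example, here is a minimal sketch of what I mean, assuming scikit-learn's PCA and a small made-up data set in which the first two columns are the same distance in km and in inches (so they are near-linearly dependent):

    import numpy as np
    from sklearn.decomposition import PCA

    # Toy data: column 0 and column 1 are the same quantity in km and in inches,
    # so they are (nearly) linearly dependent; column 2 is unrelated noise.
    rng = np.random.default_rng(0)
    km = rng.uniform(0.0, 100.0, size=200)
    inch = km * 39370.1 + rng.normal(scale=1e-3, size=200)  # ~39370.1 inches per km
    other = rng.normal(size=200)
    X = np.column_stack([km, inch, other])

    pca = PCA().fit(X)

    # Eigenvalues of the covariance matrix, already sorted in descending order;
    # they weight the new orthogonal dimensions by importance, and the last one
    # is (almost) zero because of the redundant column.
    print(pca.explained_variance_)
    print(pca.explained_variance_ratio_)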

However, it seems that the eigenvalues have no explicit relation to the old features/dimensions in the original data set.

Here is my question: after applying PCA, can we look back and find out which specific features in the original data set are redundant (near-linearly dependent)?

Here is a similar question that was asked previously, but no exact answer has been provided so far. Hence, any help will be greatly appreciated.


1 Answer


No, but you don't need PCA for that. Just look at the variance in your original dimensions.
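A minimal sketch of that idea, assuming the original features are the columns of a NumPy array X (the array below is just made-up placeholder data):

    import numpy as np

    # Hypothetical data matrix: rows are samples, columns are the original features.
    X = np.array([
        [1.0, 2.0, 10.0],
        [2.0, 2.1, 20.0],
        [3.0, 1.9, 30.0],
        [4.0, 2.0, 40.0],
    ])

    # Sample variance of each original dimension, no PCA needed.
    print(X.var(axis=0, ddof=1))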

Don Reba