
Suppose I have an $n \times p$ data matrix $X$ with $p \gg n$. To reduce the dimension of the data, I use principal component analysis as follows: I compute the (thin) SVD $X = UDV$, where $U$ is $n \times r$, $V$ is $r \times p$, and $D$ is an $r \times r$ diagonal matrix. I then reduce the dimension of $X$ using the matrix $V$, i.e., I use the PC scores $Z = XV^{\prime}$. My question is: does a 'restricted isometry'-type property hold for the projected data points? In particular, if the rows of $X$ are generated independently from some distribution, what are the sharpest bounds $(m, M)$ for which the following holds:

$$ m \| x \|^2 \leq \| Vx \|^2 \leq M \| x \|^2 ?$$
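For concreteness, here is a small NumPy sketch (the random data and dimensions are my own illustration; `Vt` below plays the role of $V$, following the output convention of `numpy.linalg.svd`). Because the rows of $V$ from an SVD are orthonormal, $\| Vx \|^2 \leq \| x \|^2$ always, so $M = 1$; equality holds only when $x$ lies in the row space of $X$, and a fresh draw from the same distribution generically falls outside it, so $m < 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 500                      # p >> n, as in the question
X = rng.standard_normal((n, p))     # rows drawn i.i.d. (illustrative choice)

# Thin SVD: X = U @ diag(d) @ Vt, where Vt is r x p with orthonormal rows
U, d, Vt = np.linalg.svd(X, full_matrices=False)
r = Vt.shape[0]

# Rows of Vt are orthonormal: Vt @ Vt.T = I_r, hence ||Vt x|| <= ||x||
assert np.allclose(Vt @ Vt.T, np.eye(r))

# A vector in the row space of X is preserved exactly (ratio = 1) ...
x_in = Vt.T @ rng.standard_normal(r)
assert np.isclose(np.linalg.norm(Vt @ x_in), np.linalg.norm(x_in))

# ... but a new point from the same distribution loses norm (ratio < 1)
x_new = rng.standard_normal(p)
ratio = np.linalg.norm(Vt @ x_new) ** 2 / np.linalg.norm(x_new) ** 2
assert 0.0 < ratio < 1.0
print(ratio)
```

This suggests the interesting part of the question is the lower bound $m$ for out-of-sample points, which depends on the data-generating distribution.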

  • Nice question but you should probably ask it at https://math.stackexchange.com/ – harold Jul 21 '17 at 15:42
  • Thank you. I should do that. – user2660120 Jul 21 '17 at 15:52
  • Since it is related to machine learning, I think it has its place on SO too. It's borderline though, yes. It's a shame that we can't use Latex directly on SO, but because of that, please just use different formatting (like putting your variables between backticks, etc) next time. – Ash Jul 21 '17 at 16:50
  • 2
    Actually [Cross Validated](https://stats.stackexchange.com/) is the place in my opinion. – Royi Jul 21 '17 at 16:55
  • Sorry about the post and the script too. I have posted the same on math.stackexchange.com. Maybe I will post it on Cross Validated too. – user2660120 Jul 21 '17 at 19:30

0 Answers