Channel: Math.NET Numerics

New Post: Cut cols and rows (or factor the matrix)

cuda wrote:
That really doesn't matter much, since a 500Kx500K dense matrix would take up about 2 TB of memory, way above the 2 GB maximum object size. We'd have to introduce a different storage scheme to handle dense matrices of that size (assuming the machine had 8 TB of memory).
Well, if the original data is 500Kx500K and 99% sparse, the memory consumption with a proper data structure will be only about 20 GB (1% of 2 TB).
And that is somewhat manageable even if the resulting matrices are dense, since they are already truncated to rank K (with small K, of course).
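The back-of-envelope numbers above can be sketched out explicitly. This is an illustrative calculation only: it assumes 8-byte double values and, for the sparse case, a CSR-style layout with 4-byte column indices and row pointers, which is not necessarily Math.NET's actual storage format.

```python
# Memory estimate for a 500K x 500K matrix at 99% sparsity.
# Assumptions (not Math.NET's actual layout): 8-byte doubles,
# CSR storage with 4-byte column indices and row pointers.
n = 500_000
nnz = n * n // 100                        # 1% non-zeros = 2.5e9 entries
dense_bytes = n * n * 8                   # full dense double storage
values_bytes = nnz * 8                    # non-zero values alone
csr_bytes = nnz * (8 + 4) + (n + 1) * 4   # values + col indices + row pointers

print(dense_bytes / 1e12)   # 2.0 TB, the figure quoted above
print(values_bytes / 1e9)   # 20.0 GB for the values alone
print(csr_bytes / 1e9)      # ~30 GB once index overhead is included
```

Note that the "20 GB" figure counts only the non-zero values; a real sparse format also stores indices, which pushes the total somewhat higher.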

However, I do not know whether it is possible at all to build the approximation into the SVD itself, without needing to keep four 500Kx500K matrices in memory during the actual decomposition. I mean that maybe the decomposition math requires the whole matrix to be in memory to compute the results.
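For what it's worth, iterative (Lanczos-type) SVD solvers do exactly this: they compute only the top-k singular triplets and touch the matrix solely through matrix-vector products, so the full matrix never has to be dense in memory. A minimal sketch, in Python/SciPy rather than Math.NET (`scipy.sparse.linalg.svds` is outside this thread's library and used here purely for illustration):

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# A small sparse matrix standing in for the huge one; it is never densified.
A = sparse_random(2000, 800, density=0.01, random_state=42)

k = 10                    # number of singular triplets to keep
U, s, Vt = svds(A, k=k)   # iterative solver: touches A only via products A @ v
print(U.shape, s.shape, Vt.shape)   # (2000, 10) (10,) (10, 800)
```

So the decomposition math does not inherently require all four full matrices in memory; it depends on the algorithm, and whether a given library exposes such a solver.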

It is also worth mentioning that 500Kx500K is an extreme case. In my case I need to handle 500 000 x 20 000 records. However, in dense form that consumes as much as ~55 GB of RAM, which is unacceptable. Again, that's why I am looking for a way to reduce memory consumption during the SVD by cutting the un-needed columns and rows of the resulting matrices.
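A rough sizing for this 500 000 x 20 000 case shows why cutting the factors down to rank k helps so much: only U_k (m x k), the k singular values, and V_k^T (k x n) need to be kept. The figures below assume 8-byte doubles and an arbitrary example rank k = 100, so the dense total differs from the ~55 GB quoted above.

```python
# Hypothetical sizing: truncated SVD factors vs. the full dense matrix.
# Assumes 8-byte doubles; k = 100 is an arbitrary example rank.
m, n, k = 500_000, 20_000, 100
full_dense_gb = m * n * 8 / 1e9                 # full dense doubles
truncated_gb = (m * k + k + k * n) * 8 / 1e9    # U_k + s_k + Vt_k
print(full_dense_gb)            # 80.0 GB
print(round(truncated_gb, 2))   # 0.42 GB
```

The truncated factors are two orders of magnitude smaller than the dense matrix, which is exactly the saving the question is after.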

Best regards, Alex.

