When I try to save large files (about 70 GB each) in Rust, the process aborts with a core dump. At first I suspected a lack of disk space, but after checking, that is not the cause. Here is the error I got:
memory allocation of 73603432908 bytes failed
/var/spool/uge/at163/job_scripts/12220153: line 8: 46483 Aborted
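As far as I understand, this message comes from Rust's global allocator failing and aborting the process, not from a failed disk write. For illustration only (this is a minimal sketch of my own, not part of the real program), the same kind of abort can usually be triggered by a single allocation far larger than available memory, depending on the system's overcommit settings:

fn main() {
    // Requesting far more memory than the machine has makes the default
    // allocation-error handler print "memory allocation of N bytes failed"
    // and abort; the byte count below is the one from my error message.
    let _huge: Vec<u8> = vec![0u8; 73_603_432_908];
}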
Here is the toolchain I am using:
$ rustup --version
rustup 1.23.1 (3df2264a9 2020-11-30)
info: This is the version for the rustup toolchain manager, not the rustc compiler.
info: The currently active `rustc` version is `rustc 1.50.0 (cb75ad5db 2021-02-10)`.
The program uses the ndarray crate, and the output is saved in .npy format using ndarray-npy. The code I use to write the data is as follows:
use ndarray::Array3; // ndarray 0.14.0
use ndarray_npy::write_npy; // ndarray-npy 0.7.1

struct Features {
    input_o: Array3<f32>,  // ~70 GB
    input_c: Array3<f32>,  // ~70 GB
    input_ca: Array3<f32>, // ~70 GB
    input_n: Array3<f32>,  // ~70 GB
    target_o: Array3<f32>,
    target_c: Array3<f32>,
    target_ca: Array3<f32>,
    target_n: Array3<f32>,
}

fn main() {
    // ... `features`, `dir`, and `fname` are built here (omitted) ...
    write_npy(
        &(dir.to_string() + &fname + "_input_C.npy"),
        &features.input_c,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_input_CA.npy"),
        &features.input_ca,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_input_N.npy"),
        &features.input_n,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_input_O.npy"),
        &features.input_o,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_target_C.npy"),
        &features.target_c,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_target_CA.npy"),
        &features.target_ca,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_target_N.npy"),
        &features.target_n,
    )
    .unwrap();
    write_npy(
        &(dir.to_string() + &fname + "_target_O.npy"),
        &features.target_o,
    )
    .unwrap();
}
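For reference, here is roughly how I estimate the in-memory size of one of these arrays: element count times the size of f32. This is a self-contained sketch with a tiny placeholder shape and an illustrative helper (approx_bytes), not my real data; with the real shapes the result should land near the ~70 GB per array mentioned above, which matches the byte count in the abort message.

use ndarray::Array3;
use std::mem::size_of;

// Rough in-memory footprint of an Array3<f32>: number of elements times 4 bytes.
fn approx_bytes(a: &Array3<f32>) -> usize {
    a.len() * size_of::<f32>()
}

fn main() {
    // Tiny placeholder shape (2 x 3 x 4 = 24 elements -> 96 bytes).
    let a = Array3::<f32>::zeros((2, 3, 4));
    println!("{} bytes", approx_bytes(&a));
}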