
I am new to Rust. In the past few days I have learnt a lot from this community. With your help, I have written code that reads 5000 files, does some transformations, and writes 700 files. However, I am still having performance issues and am trying to optimise. I can see that writing the files takes a noticeable amount of time, so I wanted to get an opinion on how to optimise it.

use rayon::prelude::*;
use std::fs;

// file_holder is a HashMap; each value is a custom struct
// containing three HashMap<String, f32> fields.
file_holder.par_iter().for_each(|(name, file)| {
    let f = fs::File::create(name.to_string() + ".json").expect("Failed to create file");
    serde_json::to_writer(&f, &file).expect("Failed writing");
});
  • Does this answer your question? [What's the de-facto way of reading and writing files in Rust 1.x?](https://stackoverflow.com/questions/31192956/whats-the-de-facto-way-of-reading-and-writing-files-in-rust-1-x) – E_net4 May 09 '20 at 16:16
  • As described in the answer, you can consider wrapping the file with a buffered writer. Any other kind of analysis would require more details than the ones described here. – E_net4 May 09 '20 at 16:17
  • Thanks for the response. I saw that question earlier, but I am not sure how to implement it with serde; I like the serde way of serialising a struct. One thought was to convert the struct into a JSON string and then use a buffered writer, but that might still not be optimal. – Selva May 09 '20 at 17:16
  • Also, those answers were for single files. In my case I have 700 files. – Selva May 09 '20 at 17:35
  • 1
    The buffered writer can be used with Serde. It would be something like: `serde_json::to_writer(BufWriter:new(fs::File::create(...`. – rodrigo May 09 '20 at 19:09

1 Answer


Thanks to rodrigo, it worked: wrapping the file in a BufWriter before handing it to serde_json did the trick.

use rayon::prelude::*;
use std::{fs, io::{BufWriter, Write}};

file_holder.par_iter().for_each(|(k, file)| {
    let f = fs::File::create(k.to_string() + ".json").expect("Unable to create file");
    // BufWriter batches serde's many small writes into fewer large syscalls.
    let mut bw = BufWriter::new(f);
    serde_json::to_writer(&mut bw, &file).expect("Failed writing :(");
    bw.flush().expect("Failed to flush"); // surface I/O errors instead of losing them on drop
});
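
If the files are small, a possible alternative (my own sketch, not from the thread above; it assumes the same `file_holder` map) is to serialize each struct into an in-memory `Vec<u8>` with `serde_json::to_vec` and write each file in a single call:

use rayon::prelude::*;
use std::fs;

file_holder.par_iter().for_each(|(k, file)| {
    // Serialize into a byte buffer first, then write the whole file at once,
    // so no buffered writer is needed.
    let bytes = serde_json::to_vec(&file).expect("Failed to serialize");
    fs::write(k.to_string() + ".json", bytes).expect("Failed writing");
});

This trades one temporary allocation per file for a single write call per file; for 700 small JSON files it should perform comparably to the BufWriter version.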