
I am using a C# program to save about 10 screen captures of size 660x330 per second.

This C# program is used as a .dll script for the game GTA V.

Bitmap catchBmp = new Bitmap(300, 660, System.Drawing.Imaging.PixelFormat.Format32bppRgb);
Graphics g = Graphics.FromImage(catchBmp);
// Copy a 300x660 region of the primary screen into the bitmap
g.CopyFromScreen(new Point(Screen.AllScreens[0].Bounds.Width / 3 + 170, Screen.AllScreens[0].Bounds.Height / 6 + 60), new Point(0, 0), new Size(300, 660));
// Build the output folder and file name for this capture
string tmp = Path.Combine(path, modNum.ToString(), location);
if (!Directory.Exists(tmp))
{
    Directory.CreateDirectory(tmp);
}
string tmp_filename = tmp + "\\" + location + "_" + weather + "_" + time + "_" + anim + "_" + view_angle + "_" + modNum + "_" + picNum.ToString() + ".jpg";
catchBmp.Save(tmp_filename);
UI.ShowSubtitle(tmp_filename);
Wait(100);
catchBmp.Dispose();
g.Dispose();

The code above runs in a loop, about 10 times per second.

After the script has run for about six hours, my hard drive reaches maximum usage and the script crashes.

Is there a way to prevent this? I would like to run this script for days at a time.

Would using a thread to save the images be helpful? Or implementing a queue for the disk writes?

J.W Ngo
  • When you say "maximum usage", I assume you mean the speed that you can write to disk isn't enough to keep up with writing the images, and not that your hard disk runs out of space, right? – ProgrammingLlama Apr 08 '21 at 04:58
  • Yes, that is correct. The disk cannot keep up with the writing of multiple images and the program crashes. I tried writing to my SSD instead and did not face any problems, except for it running out of space after a day. – J.W Ngo Apr 08 '21 at 04:59
  • I thought of writing to the SSD as a cache before writing to the HDD and deleting it from the SSD, but I would like to prevent unnecessary reads/writes to my SSD if possible, to extend its lifespan – J.W Ngo Apr 08 '21 at 05:01
  • Besides the hard drive, was any other hardware resource being maxed out (ram, cpu)? – Hayden Apr 08 '21 at 05:03
  • That means `864,000` images per day. What are you going to do with that :)? Maybe build a sub-folder per hour? Handling this all seems a little overwhelming anyway. Maybe dump it all to a Video format? A zipped archive? Distribute to NAS storage? Async upload to web storage? – Jimi Apr 08 '21 at 05:05
  • @Hayden No other resources are maxed out – J.W Ngo Apr 08 '21 at 05:07
  • @Jimi :) I'm collecting a synthetic dataset for computer vision, I actually already have multiple subfolders, with each holding only 576 images at one time – J.W Ngo Apr 08 '21 at 05:08
  • Correct me if I'm wrong, but I would imagine saving directly to NAS storage would be slower than saving to disk. My issue isn't a lack of space but the disk being a bottleneck, which I am hoping to solve in C#. – J.W Ngo Apr 08 '21 at 05:11
  • 10 images per second are not a problem if you choose a form of remote storage. Needs some caching and throttling of the tasks, but it's quite doable. Much better than squashing a poor SSD (it shouldn't even contain the System Temp folder(s)). – Jimi Apr 08 '21 at 05:13
  • So, for example, saving it into a memory buffer in C# and moving it to a NAS storage directly? Is it possible to move it directly to a NAS storage in C# code? I guess this is one solution I could try, but in the best case scenario I would prefer for no throttling. – J.W Ngo Apr 08 '21 at 05:16
  • Well, you have IO operations, which are async by nature. So you usually enqueue work items and dequeue in worker Tasks, depending on how many different storage units you have available. This creates a sort of *cache* that lets the main Thread work on its main task (collecting Images) and the worker Tasks distribute the results to the available resources (a sketch of this pattern follows these comments). -- There are some very good examples of this kind of procedure here (you can find many that use HttpClient, as an example. The logic is the same anyway) – Jimi Apr 08 '21 at 05:29
  • By the way, you should be disposing `catchBmp` and `g` with `using` blocks – Charlieface Apr 08 '21 at 09:20
  • Are you aware that while you call the image `.jpg`, a save without parameters will save as png format? – Nyerguds Apr 08 '21 at 14:55
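
A minimal sketch of the queue-and-worker pattern described in the comments, assuming the capture loop produces Bitmap objects; the queue name, capacity, and tuple shape here are illustrative, not part of the original code:

using System.Collections.Concurrent;
using System.Drawing;
using System.Drawing.Imaging;
using System.Threading.Tasks;

// Bounded queue: the capture thread blocks if the writer falls more than 100 frames behind.
var saveQueue = new BlockingCollection<(Bitmap Image, string FileName)>(boundedCapacity: 100);

// Worker task: drains the queue and performs the slow disk I/O off the capture thread.
Task writerTask = Task.Run(() =>
{
    foreach (var item in saveQueue.GetConsumingEnumerable())
    {
        item.Image.Save(item.FileName, ImageFormat.Jpeg);
        item.Image.Dispose();
    }
});

// In the capture loop:  saveQueue.Add((catchBmp, tmp_filename));
// On shutdown:          saveQueue.CompleteAdding(); writerTask.Wait();

Note this only smooths out bursts; if the disk is slower than the capture rate on average, the bounded queue will eventually fill and the capture loop will stall.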

1 Answer


This seems to be a simple issue of the disk not keeping up.

Edit: As Nyerguds pointed out in the comments, you need to specify the image format when saving, i.e. catchBmp.Save(tmp_filename, ImageFormat.Jpeg);. Otherwise PNG compression will be used, which requires more bandwidth to the disk.

One option would be to use a more efficient storage format, i.e. save to a video file that can use the difference between frames for better compression. See for example video writing in AForge. While this should decrease the size of the data being written, it will increase the CPU load of the system.
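
As a rough sketch of that approach, assuming the AForge.Video.FFMPEG package (the output file name, codec choice, and the `capturedFrames` source are illustrative; the frame size and rate match the question's captures):

using System.Drawing;
using AForge.Video.FFMPEG;

// Write one video file per session instead of thousands of individual images.
using (var writer = new VideoFileWriter())
{
    // 300x660 frames at 10 fps, as in the question's capture loop.
    writer.Open("captures.avi", 300, 660, 10, VideoCodec.MPEG4);
    foreach (Bitmap frame in capturedFrames)  // capturedFrames: however the captures are collected
    {
        writer.WriteVideoFrame(frame);
        frame.Dispose();
    }
    writer.Close();
}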

Another option would be to reduce the quality of the image files, either by reducing the resolution or by adjusting the compression parameters. You could also switch image formats. JPEG is an old standard and tends to produce low-quality images at low bitrates; a more modern format like JPEG XR might work better.
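
For example, JPEG quality can be set through GDI+'s encoder parameters (the quality value 50 here is illustrative; lower values mean smaller files):

using System.Drawing.Imaging;
using System.Linq;

// Look up the built-in JPEG encoder and save with an explicit quality setting.
ImageCodecInfo jpegEncoder = ImageCodecInfo.GetImageEncoders()
    .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
var encoderParams = new EncoderParameters(1);
encoderParams.Param[0] = new EncoderParameter(Encoder.Quality, 50L);
catchBmp.Save(tmp_filename, jpegEncoder, encoderParams);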

A third option would be to simply get a faster disk, and/or to ensure the program has exclusive access to the disk.

Using a queue or thread will probably not help much, since the code is not blocking anything else and only runs as fast as it can. I would, however, recommend disposing the graphics before the bitmap, and moving the wait after the disposal.
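
For reference, a sketch of the question's loop body with those fixes applied; nested using blocks dispose the graphics before the bitmap automatically, and tmp_filename is assumed to be built as in the question:

using (Bitmap catchBmp = new Bitmap(300, 660, System.Drawing.Imaging.PixelFormat.Format32bppRgb))
using (Graphics g = Graphics.FromImage(catchBmp))
{
    g.CopyFromScreen(new Point(Screen.AllScreens[0].Bounds.Width / 3 + 170, Screen.AllScreens[0].Bounds.Height / 6 + 60), new Point(0, 0), new Size(300, 660));
    catchBmp.Save(tmp_filename, System.Drawing.Imaging.ImageFormat.Jpeg);  // explicit JPEG format
    UI.ShowSubtitle(tmp_filename);
}  // the inner using disposes g first, then the outer one disposes catchBmp
Wait(100);  // wait moved after disposal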

JonasH
  • Note that it's not even actually saving as jpeg. The code gives no type parameter to the save, which makes it default to png. – Nyerguds Jul 16 '21 at 10:21
  • @nyerguds I believe that Image.Save will use the file extension to determine the encoder if not explicitly set, and only use .png if it cannot find an encoder. – JonasH Jul 16 '21 at 10:52
  • It does not. It doesn't care at all about the file name you give. – Nyerguds Jul 17 '21 at 12:27
  • @Nyerguds, You are correct, I have updated the answer. – JonasH Jul 19 '21 at 08:12