> the memory usage is about 1,400 MB
It isn't yet clear whether that's the working set size or the VM size of the process. If you're reading the number from Task Manager then you are typically looking at the working set, the amount of RAM the program is currently using.
In which case 1,400 megabytes most certainly puts you squarely in the danger zone for a 32-bit process. Allocations start to fail when the VM size of a process creeps towards the maximum amount of virtual memory that a 32-bit process can address, 2 gigabytes. You are almost certainly very close to that limit.
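If you want to see both numbers for yourself, the Process class reports them directly. A minimal sketch; Task Manager's default memory column corresponds to the working set:

```csharp
using System;
using System.Diagnostics;

// Both numbers for the current process. Working set is RAM in use,
// VM size is the committed virtual address space; the latter is the
// one that runs into the 2 GB wall in a 32-bit process.
var p = Process.GetCurrentProcess();
Console.WriteLine($"Working set: {p.WorkingSet64 / (1024 * 1024)} MB");
Console.WriteLine($"VM size:     {p.VirtualMemorySize64 / (1024 * 1024)} MB");
```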
The kind of allocations that will fail are large ones; there just isn't a hole big enough left in the address space to fit the requested size. A Bitmap is certainly a very good candidate, since bitmaps can consume a lot of VM space for their pixel data.
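To put numbers on it, here's a back-of-the-envelope sketch of the pixel-data cost, assuming the common 32bpp ARGB format; the sizes are just illustrations:

```csharp
using System;

// Pixel data must fit in ONE contiguous hole in the address space.
// At 32bpp that's 4 bytes per pixel:
static long PixelBytes(int width, int height) => (long)width * height * 4;

Console.WriteLine(PixelBytes(465, 465));    // ~0.86 MB, a modest request
Console.WriteLine(PixelBytes(9000, 9000));  // ~324 MB, very hard to place
                                            // in a fragmented 2 GB space
```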
You'd expect an OutOfMemoryException in this case. But GDI+ generates crappy exceptions; it assumes that the real problem is that you asked for a bitmap that's too large. That's not entirely without merit, but it gets to be a bit of a tough sell when you create a bitmap of 465 x 465 pixels. Anyhoo, it generates the "Parameter is not valid" exception message instead of OOM, blaming your requested size.
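For illustration, here's roughly how that failure surfaces in code; the size is hypothetical, and the point is that the ArgumentException masks what is really an out-of-memory condition:

```csharp
using System;
using System.Drawing;

try
{
    // Fails inside GDI+ when the address space can't supply the pixel data.
    using var bmp = new Bitmap(465, 465);
}
catch (ArgumentException ex)
{
    // "Parameter is not valid." In a 32-bit process running on fumes,
    // this is out-of-memory in disguise.
    Console.WriteLine(ex.Message);
}
```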
It isn't very clear how you ended up using so much memory, but it is a very common outcome when you use the Bitmap class in .NET code; it's the one .NET class where you can no longer ignore the Dispose() method. A strong hint that you don't take care of this is your remark that you pre-allocate it as a 200x200 bitmap. That's a very bad idea; you almost certainly don't dispose that one.
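A minimal sketch of the usual leak and its fix, assuming a typical WinForms setup; FrameViewer, RenderFrame() and the sizes are hypothetical stand-ins for your own code:

```csharp
using System.Drawing;
using System.Windows.Forms;

class FrameViewer : Form
{
    readonly PictureBox pictureBox = new PictureBox { Dock = DockStyle.Fill };

    public FrameViewer() => Controls.Add(pictureBox);

    // Stand-in for whatever produces your bitmaps.
    Bitmap RenderFrame() => new Bitmap(465, 465);

    void ShowNextFrame()
    {
        // Leaky: pictureBox.Image = RenderFrame();
        // The old bitmap's pixel data is unmanaged, so the GC feels no
        // memory pressure and never collects it in time.

        // Fixed: dispose the previous bitmap before replacing it.
        var old = pictureBox.Image;
        pictureBox.Image = RenderFrame();
        old?.Dispose();
    }
}
```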
So the very first thing you need to do is thoroughly review your code and dispose your bitmaps when they are no longer in use. Use a memory profiler if necessary to find the leaks. Changing the Platform target setting of your EXE project from x86 to AnyCPU is a very simple way to get oodles of VM space on a 64-bit operating system. Long term, you'll probably end up replacing this code anyway with WPF's BitmapSource and WriteableBitmap classes.
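In the meantime, the routine fix for temporary bitmaps is a using statement, which releases the pixel data deterministically; a minimal sketch, with a hypothetical file name:

```csharp
using System.Drawing;

// Both the bitmap and the Graphics object are disposed when the
// block ends, even if an exception is thrown in between.
using (var bmp = new Bitmap(465, 465))
using (var g = Graphics.FromImage(bmp))
{
    g.Clear(Color.White);
    // ... draw into g ...
    bmp.Save("frame.png");   // hypothetical output path
}
```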