Intro: let's take a set of points (x, y) in the plane and a function F(x, y) returning a double. The points are generated randomly inside a user-defined rectangle (i.e. the variables are constrained). Let N be the number of generated points. The set is then triangulated (Delaunay triangulation). Think of this triangulation as a piecewise-linear approximation of the surface of F(x, y): values vary linearly along the edges (so each edge is just a segment between points (x, y, z = F(x, y))), the vertices are the interpolation nodes, and the triangles themselves are the interpolation planes.
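To make the "values vary linearly along edges" part concrete, here is a minimal sketch of interpolating F along one edge of the triangulation. The `Vertex` type and `LerpAlongEdge` helper are illustrative names, not part of the actual program:

```csharp
// A vertex of the triangulated surface: Z = F(X, Y).
struct Vertex
{
    public double X, Y, Z;
    public Vertex(double x, double y, double z) { X = x; Y = y; Z = z; }
}

static class EdgeInterp
{
    // Value of the approximated surface at parameter t in [0, 1]
    // along the edge from a to b: z varies linearly between the
    // two vertex values, matching the scheme described above.
    public static double LerpAlongEdge(Vertex a, Vertex b, double t)
        => a.Z + (b.Z - a.Z) * t;
}
```

For example, `EdgeInterp.LerpAlongEdge(new Vertex(0, 0, 1), new Vertex(1, 0, 3), 0.5)` gives 2.0, the midpoint value on that edge.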
I draw the triangulation on a standard PictureBox. The edges are drawn with a LinearGradientBrush whose colors reflect the value of F(x, y): brown for the maximum, aqua for the minimum, green for the middle.
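The color ramp just described (aqua at the minimum, green at the middle, brown at the maximum) can be sketched roughly like this; the helper names and the choice of a two-segment linear ramp are my assumptions, not the exact code used:

```csharp
using System.Drawing;

static class ColorRamp
{
    // Channel-wise linear blend between two colors (assumed helper).
    static Color Lerp(Color a, Color b, double t) => Color.FromArgb(
        (int)(a.R + (b.R - a.R) * t),
        (int)(a.G + (b.G - a.G) * t),
        (int)(a.B + (b.B - a.B) * t));

    // Map a value of F to a vertex color: aqua -> green on the lower
    // half of the range, green -> brown on the upper half.
    public static Color ValueToColor(double v, double min, double max)
    {
        double t = (v - min) / (max - min);              // normalize to [0, 1]
        return t < 0.5
            ? Lerp(Color.Aqua, Color.Green, t * 2)       // below the middle
            : Lerp(Color.Green, Color.Brown, (t - 0.5) * 2); // above the middle
    }
}
```

These per-vertex colors are then fed to the gradient brush for each edge, as in the paint code below.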
It all works until I set N >= M, where M varies from run to run between ~30k and 50k. At that point I get an Out of memory exception. Since Task Manager shows no shortage of RAM, I suspect video memory is involved — is that right?
I can draw the triangulation with N = 100k without color gradients (plain solid one-color edges), so I suspect the problem lies either in the use of LinearGradientBrush or in the creation of the GDI objects Pen and LinearGradientBrush on every iteration.
Some code:
private void pboxTriangulation_Paint(object sender, PaintEventArgs e)
{
    e.Graphics.SmoothingMode = SmoothingMode.AntiAlias;
    if (null != dt)
        PaintWithGradient(e.Graphics);
}

private void PaintWithGradient(Graphics g)
{
    // here I erased the code block
    // where I find the min, max and middle values of F(x, y) on the set
    foreach (Triangle t in dt.triangles)
    {
        System.Drawing.PointF[] ps = new System.Drawing.PointF[3];
        Color[] colors = new Color[3];
        for (int i = 0; i < 3; i++)
        {
            // here, for every i-th point, I find its display coordinates - ps[i]
            // and the color of the vertex - colors[i]
        }
        for (int i = 0; i < 3; i++)
        {
            using (LinearGradientBrush b = new LinearGradientBrush(ps[i], ps[(i + 1) % 3], colors[i], colors[(i + 1) % 3]))
            {
                using (Pen pen = new Pen(b))
                {
                    g.DrawLine(pen, ps[i], ps[(i + 1) % 3]);
                }
            }
        }
    }
}
Examples: N = 10k; N = 100k (built as x86); N = 200k (built as x86).
Question: what is the problem, and how can I solve it?
Please don't suggest other drawing tools as a solution (e.g. OpenGL or other heavyweight libraries); if you want, leave them as a comment on the question — I'll keep them in mind, as they may be useful in the future. This question is not about how to draw a triangulation with gradient edges in general, but about the root cause of the exception in my case.
UPD: the program above was built as x64. Built as x86 it draws 200k (I haven't tried higher N) and seems fine; 300k failed after some time with ~1.8 GB allocated.