
In short, I'm making a simulation with a bunch of creatures that can see each other. The idea is to capture an area around each creature and feed it to its neural network, so the creatures evolve to recognize their surroundings. I'm coding this with libGDX, and I don't plan on taking a screenshot every single frame, since I can imagine that's already a very poor idea. The problem is that I don't know how to get the pixels inside a defined square without capturing the entire screen and then cherry-picking what each creature needs, which would cause a MASSIVE lag spike: the area these creatures will be in is 2000x2000, i.e. 4 million pixels (12 million RGB values).

Each creature is about 5 pixels wide and tall, so my idea is to give each one a 16x16 area around it. That's why iterating through the entire frame buffer won't work: it would pointlessly walk through millions of values before reaching the ones I asked for.

I would also need to be able to take pictures outside of the screen (as in, the part outside the window's boundaries), if that is even possible.

How can I achieve this? I'm aiming for performance, but I do not mind distributing the load between multiple frames or even multithreading.

Ciro García

1 Answer


The problem is that you can't query individual pixels of a framebuffer directly.

You can, however, capture a framebuffer into a texture, and you can convert a texture to a pixmap (see libgdx TextureRegion to Pixmap).
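As a sketch of that conversion in libGDX, going through the texture's `TextureData` (the helper name `textureToPixmap` is my own; note this read-back is slow and allocates, so it shouldn't run per creature per frame):

```java
import com.badlogic.gdx.graphics.Pixmap;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.TextureData;

public class TexturePixels {
    // Pull the CPU-side pixels out of a texture via its TextureData.
    public static Pixmap textureToPixmap(Texture texture) {
        TextureData data = texture.getTextureData();
        if (!data.isPrepared()) data.prepare();
        return data.consumePixmap(); // caller is responsible for dispose()
    }
}
```
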

You can then call getPixel(int x, int y) on the pixmap.
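Cherry-picking a 16x16 window this way touches only 256 pixels per creature, regardless of world size. A self-contained sketch against a plain RGBA8888 array (the `WindowSampler` class is hypothetical; with a real Pixmap you would call `pixmap.getPixel(x, y)` instead of indexing the array):

```java
// Sample a size x size window of RGBA8888 pixels centred on (cx, cy).
// Coordinates outside the image are clamped to the nearest edge pixel,
// which is one reasonable policy for creatures near the border.
public class WindowSampler {
    public static int[] sample(int[] pixels, int width, int height,
                               int cx, int cy, int size) {
        int[] out = new int[size * size];
        int half = size / 2;
        for (int dy = 0; dy < size; dy++) {
            for (int dx = 0; dx < size; dx++) {
                int x = Math.min(Math.max(cx - half + dx, 0), width - 1);
                int y = Math.min(Math.max(cy - half + dy, 0), height - 1);
                out[dy * size + dx] = pixels[y * width + x];
            }
        }
        return out;
    }
}
```

The clamping also answers the "outside the screen" part: as long as the world buffer covers the full 2000x2000 area (not just the visible window), sampling near its edges is the only boundary case left.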

However, maybe going the other way would be better.

Start with a pixmap, do all the work against the pixmap, and each frame convert the pixmap to a texture and render that texture fullscreen. This also removes the need for the creatures' environment to match the screen resolution (although you could still set it up that way).
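The pixmap-first approach can be sketched with a plain CPU-side buffer. The `World` class below is hypothetical, standing in for a libGDX Pixmap (which offers equivalent `fillRectangle`/`getPixel` operations), so the logic runs without the library:

```java
// Hypothetical CPU-side world buffer: creatures are drawn into it and
// sample it directly, so no framebuffer read-back is ever needed.
public class World {
    public final int width, height;
    public final int[] pixels; // RGBA8888, same layout a Pixmap uses

    public World(int width, int height) {
        this.width = width;
        this.height = height;
        this.pixels = new int[width * height];
    }

    // Draw a filled square (e.g. a creature body) into the buffer.
    public void fillRect(int x, int y, int w, int h, int rgba) {
        for (int yy = y; yy < y + h; yy++)
            for (int xx = x; xx < x + w; xx++)
                if (xx >= 0 && xx < width && yy >= 0 && yy < height)
                    pixels[yy * width + xx] = rgba;
    }

    public int getPixel(int x, int y) {
        return pixels[y * width + x];
    }
}
```

Each frame: clear the buffer, draw every creature into it, let every creature read its 16x16 window, and only once per frame copy the buffer into a texture for display (with a real Pixmap, `texture.draw(pixmap, 0, 0)`).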

londonBadger