
I have a system that generates chunks of 2d game map tiles. Chunks are 16x16 tiles, and each tile is 25x25 pixels.

The chunks are given their own coordinates, like 0,0, 0,1, etc. The tiles determine their coordinates in the world based on which chunk they're in. I've verified that the chunks/tiles are all showing the proper x/y coordinates.

My problem is translating those into screen coordinates. In a previous question someone recommended using:

(worldX * tileWidth) % viewport_width

Each tile's x/y are run through this calculation and a screen x/y coordinate is returned.

This works for tiles that fit within the viewport, but it resets the screen x/y position calculation for anything off-screen.

In my map, I load chunks of tiles within a radius around the player, so some of the loaded tiles will be off-screen (tile positions on the screen only change as the player moves around).

I tried a test with a tile that would be off screen:

Tile's x coord: 41
41 * 25 = 1025
Game window: 1024
1025 % 1024 = 1

This means that the tile (which, if screen 0,0 coincides with map 0,0, should be at x:1025, just off the right-hand edge of the screen) is instead drawn at x:1, appearing in the top-left.

I can't think of how to properly handle this - it seems to me like I need to take tileX * tileWidth to determine its "initial screen position" and then somehow use an offset to make it appear on screen. But what offset?

Update: I already store an x/y offset value when the player moves, so I know how to move the map. I can use these values as the current offset, and if someone saves the game I can simply store those and re-use them. There's no equation necessary; I would just have to store the cumulative offsets.
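The cumulative-offset idea described in the update can be sketched like this (the `Camera` class and `cameraX`/`cameraY` names are illustrative, not from the original code):

```java
// Sketch of a cumulative pixel offset ("camera") applied to world coordinates.
class Camera {
    int cameraX = 0, cameraY = 0; // cumulative pixel offsets, updated as the player moves

    void move(int dx, int dy) {
        cameraX += dx;
        cameraY += dy;
    }

    // World tile coordinates -> screen pixel coordinates. No modulo, so
    // off-screen tiles simply get coordinates outside the viewport instead
    // of wrapping back to the left edge.
    int screenX(int worldX, int tileWidth)  { return worldX * tileWidth  - cameraX; }
    int screenY(int worldY, int tileHeight) { return worldY * tileHeight - cameraY; }
}
```

With the numbers from the question, tile x:41 starts at screen x:1025 while the camera is at the origin, and moves left as the camera scrolls right.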

helion3

2 Answers


The modulo (worldX * tileWidth % screenWidth) is what's causing the reset. Modulo (%) gives you the remainder of an integer division, so if worldX * tileWidth is greater than screenWidth, you get the remainder of (worldX * tileWidth) / screenWidth. If worldX * tileWidth is screenWidth + 1, the remainder is 1: the tile starts over at the beginning of the row.

If you eliminate the modulo, it will continue to draw tiles past the edge of the screen. If your drawing buffer is the same size as the screen, you'll need to add a check for tiles at the edge of the screen to make sure you only draw the tile portion that will be visible.
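That edge check can be as simple as a rectangle-overlap test (a minimal sketch; the `tileVisible` helper and its parameters are illustrative, assuming screenX/screenY are the tile's top-left corner in screen pixels):

```java
// Hypothetical culling helper: true if any part of the tile overlaps the viewport.
class TileCulling {
    static boolean tileVisible(int screenX, int screenY,
                               int tileWidth, int tileHeight,
                               int screenWidth, int screenHeight) {
        // The tile occupies [screenX, screenX + tileWidth) x [screenY, screenY + tileHeight);
        // it is visible if that rectangle overlaps [0, screenWidth) x [0, screenHeight).
        return screenX + tileWidth  > 0 && screenX < screenWidth
            && screenY + tileHeight > 0 && screenY < screenHeight;
    }
}
```

Tiles straddling the edge still pass this test, so they get drawn and the renderer (or a clip rectangle) takes care of the部分 outside the window.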

If you're trying to keep the player centered on the screen, you need to offset each tile by the player's offset from tile 0,0 in pixels, minus half the screen width:

offsetX = (playerWorldX * tileWidth) - (screenWidth / 2);
screenX = (worldX * tileWidth) - offsetX;
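A quick check of this formula with the numbers from the question (tileWidth 25, screenWidth 1024, player standing on world tile 41); the class and method names here are just for illustration:

```java
// Worked example of the player-centered camera formula above.
class CenteredCameraExample {
    static int screenXFor(int worldX, int playerWorldX, int tileWidth, int screenWidth) {
        // Player's offset from tile 0,0 in pixels, minus half the screen width:
        int offsetX = (playerWorldX * tileWidth) - (screenWidth / 2); // e.g. 1025 - 512 = 513
        return (worldX * tileWidth) - offsetX;
    }
}
```

The tile the player stands on lands exactly at the horizontal center of the screen, and every other tile is positioned relative to it.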
Adrian
x = ((worldX*tileWidth) > screenWidth) ? worldX*tileWidth : (worldX*tileWidth)%screenWidth;

That should work. Though I'd recommend implementing something like an interface and letting each tile decide where it wants to be rendered. Something like this:

interface Renderable {

    void render(Graphics2D g);
}

class Tile implements Renderable {
    int x, y;
    // other stuff

    @Override
    public void render(Graphics2D g) {
        if (!inScreen()) {
            return;
        }
        // ...
        // render
    }

    boolean inScreen() {
        // If the map moves with the player, you need to define the boundaries
        // of your current screen block in terms of the global map coordinates.
        // You can store this globally in a singleton somewhere, or pass it to
        // the constructor of each tile.
        // currentBlock.x is then player.x - screenWidth / 2
        // currentBlock.width is then player.x + screenWidth / 2
        // (similar for y)

        if (this.x < currentBlock.x || this.x > currentBlock.width)
            return false;
        if (this.y < currentBlock.y || this.y > currentBlock.height)
            return false;
        return true;

        // If the map is in blocks (think Zelda on the SNES, where you go from
        // one screen block to another) you still need to define the boundaries:
        // currentBlock.x = (player.x / screenWidth) * screenWidth;  (integer division)
        // currentBlock.width = (player.x / screenWidth) * screenWidth + screenWidth;
        // (same for y)
        // Then perform the tests above.
    }
}
arynaq
  • Your solution doesn't make much sense to me: if it's offscreen, let it draw offscreen, if it's on screen, do a modulo operation that you've already proven won't do anything? If worldX*tileWidth isn't > screenWidth, then worldX*tileWidth%screenWidth == worldX*tileWidth. The ternary is pointless, as with either path, you'll get worldX*tileWidth. – Adrian Jun 02 '13 at 19:41
  • That is true, but if you scroll down two lines you'll see the alternate solutions; his question didn't really say whether his view was player-centric or map-centric, so I provided both alternatives. – arynaq Jun 02 '13 at 20:06