
The following command:

game.setScreen(new GameScreen());

launches you into a new screen, similar to Android's startActivity().

But then how do you leave the screen and return to the screen that called you (similar to Android's finish())?

Plus, is there a graphic showing the screen lifecycle for LibGDX, similar to Android's activity lifecycle diagram?


1 Answer


The screen lifecycle is actually pretty much the same as Android's lifecycle, because that's what LibGDX had to cover when it was designed. Basically, the Android lifecycle callback events are just forwarded to LibGDX's ApplicationListener, which in turn forwards them to your Game, which in turn forwards them to your Screen.
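
For illustration, here is a minimal sketch of that forwarding chain (MyGame is a hypothetical name, and GameScreen is the screen from the question):

import com.badlogic.gdx.Game;

public class MyGame extends Game {
    @Override
    public void create() {
        // Game implements ApplicationListener and forwards render(), resize(),
        // pause() and resume() to whatever Screen is currently set.
        // setScreen() also calls hide() on the old screen and show() on the new one.
        setScreen(new GameScreen());
    }
}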

The lifecycle usually looks like this (using Screen terminology):

           __________________________________
           |         ____       ____        |
           V         V   |      V  |        |
show --> resume --> resize <-- render --> pause --> hide --> dispose
           |          |          ^          ^
           |__________|__________|__________|

You can see that show and hide are usually only called once. show() will be called at the beginning, when your Screen is set as the current one, and hide() will be called when you change the screen. Note that dispose() is not called automatically, so you should make sure to call it when switching the screen, or call it in your hide() method.
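
One common way to handle that, sketched here with LibGDX's ScreenAdapter (the resource comments are just examples):

import com.badlogic.gdx.ScreenAdapter;

public class GameScreen extends ScreenAdapter {
    @Override
    public void hide() {
        // hide() is called by setScreen() when another screen takes over,
        // but dispose() is not, so release this screen's resources here
        dispose();
    }

    @Override
    public void dispose() {
        // dispose textures, SpriteBatches, etc. owned by this screen
    }
}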

resume() and pause() can be called multiple times, but at least once. Switching to another app or the homescreen will cause one more pause -> resume cycle.

render() and resize() are usually called a lot, but not necessarily in any particular order. Resizing the window on desktop can cause many calls to resize() in a row, without any render() call in between. But of course resize() could also be skipped completely.

If you want to switch back to a screen which was already visible before, then you need to give the second screen a reference to the first one, so it can be set as the current screen again. But that will also run through the whole lifecycle from the beginning, starting with show().
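
For example, a minimal sketch of passing such a reference (class names are just placeholders):

import com.badlogic.gdx.Game;
import com.badlogic.gdx.Screen;
import com.badlogic.gdx.ScreenAdapter;

public class SecondScreen extends ScreenAdapter {
    private final Game game;
    private final Screen previous;   // the screen that opened this one

    public SecondScreen(Game game, Screen previous) {
        this.game = game;
        this.previous = previous;
    }

    private void goBack() {
        // setting the old screen again restarts its lifecycle with show()
        game.setScreen(previous);
    }
}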

Another option would be to keep the second screen as a property of the first screen and "emulate" the screen switch yourself by calling screen2.show(); screen2.resume();, and then forwarding all events to the second screen from the first one.
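
A rough sketch of that forwarding approach (again with placeholder names; only render() is shown, the other events would be forwarded the same way):

import com.badlogic.gdx.Screen;
import com.badlogic.gdx.ScreenAdapter;

public class FirstScreen extends ScreenAdapter {
    private final Screen screen2;      // kept as a property, never passed to setScreen()
    private boolean secondActive;

    public FirstScreen(Screen screen2) {
        this.screen2 = screen2;
    }

    public void openSecond() {
        // emulate the lifecycle calls that setScreen() would have triggered
        screen2.show();
        screen2.resume();
        secondActive = true;
    }

    @Override
    public void render(float delta) {
        if (secondActive) {
            screen2.render(delta);     // forward the event to the second screen
        }
        // otherwise draw the first screen as usual
    }
}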

– noone
  • Thanks. Just to be clear, show() is called when you do the setScreen(), right? – NewDev Jan 11 '14 at 17:47
  • @NewDev Yes, exactly. – noone Jan 11 '14 at 19:58
  • How do you call the dispose()? I'm getting a crash when I call it. I setScreen() back to the splash screen, then detect if I have just returned from a game screen, then call the game screen's dispose() and I trigger a seg fault. Thanks again – NewDev Jan 11 '14 at 21:27
  • @NewDev Sorry, I did not understand that. Please open a new question and add the necessary code to it. – noone Jan 11 '14 at 21:31
  • done, http://stackoverflow.com/questions/21068345/exit-screen-causing-java-native-memory-to-grow – NewDev Jan 11 '14 at 22:14
  • @noone What if I want to hide all screens and go back to game lifecycle? – Arda Kara Aug 09 '15 at 15:36
  • resume() doesn't seem to be called after show(). Only after resuming...! – JohnyTex Jan 24 '16 at 20:18