I have a game object in Unity with scale.x = 5 and scale.y = 5 (since I'm working in 2D, the z component is irrelevant). What would be the best way to decrease the scale from 5 to 1 over exactly 600 milliseconds? I've seen answers that use Time.deltaTime to change values per unit of time (e.g. every 0.1 seconds), but this would be too imprecise: it would effectively mean decreasing the scale by 2/3 every 100 ms, which would look like stuttering. Is there a way to make the game object's scale go smoothly from 5 to 1 in 600 ms?
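To be clear about the behaviour I'm after: the scale should be a linear interpolation against elapsed time, evaluated every frame, so the step size doesn't matter. Here is that math as a minimal plain-Python sketch (not Unity code; the function names and default values are just my own illustration of the 5-to-1-over-600 ms case):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b, with t clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

def scale_at(elapsed_ms, duration_ms=600.0, start=5.0, end=1.0):
    """Scale the object should have after elapsed_ms milliseconds."""
    return lerp(start, end, elapsed_ms / duration_ms)

# Sampled once per frame, this ramps smoothly no matter how uneven
# the frame times are:
print(scale_at(0))    # 5.0
print(scale_at(300))  # 3.0 (halfway through the 600 ms)
print(scale_at(600))  # 1.0
```

The point is that each frame recomputes the scale from total elapsed time rather than applying a fixed decrement, so irregular frame intervals can't accumulate error.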
Edit: I used exactly the code from the linked duplicate, like this:
bool isRunning;

void Start()
{
    isRunning = false;
    StartCoroutine(ScaleIn(5, 4, 0.6f));
}

IEnumerator ScaleIn(float currentScale, float scaleLoss, float duration)
{
    if (isRunning == true)
    {
        yield break;
    }
    isRunning = true;

    float counter = 0f;
    float startScale = currentScale;
    float endScale = currentScale - scaleLoss;
    float newScale = currentScale;

    while (counter < duration)
    {
        counter += Time.deltaTime;
        if (counter < duration)
        {
            Debug.Log("condition met");
        }
        Debug.Log("Counter: " + counter);
        newScale = Mathf.Lerp(startScale, endScale, counter / duration);
        Debug.Log("Current Scale: " + newScale);
        yield return null;
    }
    gameObject.transform.localScale = new Vector3(newScale, newScale, 0f);
}
condition met
prints in the console every time I run it, but exactly once. The log tells me that newScale is 4.866666... and it only prints once. Even without the final line, newScale does not seem to change past the first iteration, when it becomes 4.866666...