I am developing an iOS application that contains a scalable and pannable UIView. Since the user is allowed to pan and scale the view, I'd like to keep the UIView within the boundaries of the screen.
I searched a lot on the internet, but there don't seem to be any examples that really fit my need.
Below is the panning code I wrote:
private void HandlePan(UIPanGestureRecognizer recognizer)
{
    if (recognizer.State != UIGestureRecognizerState.Began &&
        recognizer.State != UIGestureRecognizerState.Changed)
        return;

    // Accumulate the pan translation into the tracked centre position.
    var translation = recognizer.TranslationInView(this);
    _posX += translation.X;
    _posY += translation.Y;

    // Clamp the centre so the view stays within the screen.
    var maxX = (Bounds.Size.Width / 2) * _currentScale;
    var maxY = (Bounds.Size.Height / 2) * _currentScale;
    // TODO: The min values are wrong
    var minX = (Bounds.Size.Width / 2) / _currentScale;
    var minY = (Bounds.Size.Height / 2) / _currentScale;

    if (_posX > maxX)
        _posX = maxX;
    else if (_posX < minX)
        _posX = minX;

    if (_posY > maxY)
        _posY = maxY;
    else if (_posY < minY)
        _posY = minY;

    // Move the view and reset the recognizer's translation for the next callback.
    var translatedCenter = new CGPoint(_posX, _posY);
    Center = translatedCenter;
    recognizer.SetTranslation(CGPoint.Empty, this);
}
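For reference, here is a minimal sketch of how the handler and the fields fit together. The class name PannableView, the constructor, and the initialisation of _posX and _posY from Center are assumptions for illustration only, not my exact code:

using System;
using CoreGraphics;
using UIKit;

public class PannableView : UIView
{
    private nfloat _posX;
    private nfloat _posY;
    private nfloat _currentScale = 1f;

    public PannableView(CGRect frame) : base(frame)
    {
        // Start the tracked position at the view's current centre
        // (assumption: the real code may initialise these differently).
        _posX = Center.X;
        _posY = Center.Y;

        // Attach the pan handler shown above.
        AddGestureRecognizer(new UIPanGestureRecognizer(HandlePan));
    }

    // HandlePan as shown above; _currentScale is updated by the (not shown) scale handling.
}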
I managed to get the boundaries working for only two sides of the screen: maxX and maxY are correct. I just can't figure out how to calculate the correct minX and minY values.
Below I added a screen recording to show what is going wrong:
You can see that the panning gets blocked too early (and even more so when the scale goes above 1) when I try to drag towards the right/bottom side of the image, which is where the minX and minY values apply.
What is wrong in this calculation, and what should the correct calculation be? Note that maxX and maxY work perfectly.

var minX = (Bounds.Size.Width / 2) / _currentScale;
var minY = (Bounds.Size.Height / 2) / _currentScale;