I've recently inherited an iPad project that is essentially an image/presentation viewer. After selecting a presentation, the user is shown the images one at a time, advancing when they swipe up or down or tap one of the arrows in the corners of the screen. Some of the images, however, include pagination dots with arrows on either side, all baked into the static image. The images are in order, and as you swipe from one to the next, the highlighted dot moves along with you. What our client would like is for those dots and arrows, even though they are just part of the static image, to work for navigation: tapping a dot should take them to the corresponding page.
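To make that concrete, here is the rough shape of what I was imagining for the hit-testing side. This is only a sketch; the `Hotspot` struct, the normalized coordinates, and the `onNavigate` callback are placeholders I made up, not anything that exists in the project:

```swift
import UIKit

// Hypothetical model: a tappable region in normalized (0...1) image
// coordinates and the page index it should jump to.
struct Hotspot {
    let rect: CGRect      // normalized to the image view's bounds
    let targetPage: Int
}

final class PresentationPageView: UIImageView {
    var hotspots: [Hotspot] = []
    var onNavigate: ((Int) -> Void)?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isUserInteractionEnabled = true
        addGestureRecognizer(UITapGestureRecognizer(target: self,
                                                    action: #selector(handleTap(_:))))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handleTap(_ tap: UITapGestureRecognizer) {
        let point = tap.location(in: self)
        // Convert the touch to normalized coordinates so the hotspot data
        // is independent of the current view size.
        let normalized = CGPoint(x: point.x / bounds.width,
                                 y: point.y / bounds.height)
        if let hit = hotspots.first(where: { $0.rect.contains(normalized) }) {
            onNavigate?(hit.targetPage)
        }
    }
}
```

The hit-testing itself isn't the hard part; the hard part is where the hotspot data comes from.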
The very first version of this application had a huge plist describing every image in the app, including whether it contained one of these carousel views and where each dot or arrow should navigate. That plist is long gone, and the app now contains far more images and presentations than it did back then.
Is there a better way to determine where to listen for touches? Or should I resign myself to writing a text file by hand, knowing I'll have to edit it whenever the content changes? I've been told the content shouldn't change very often.
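For reference, the hand-maintained file I'm trying to avoid would presumably be something like a plist keyed by image name, listing normalized rects and target pages, loaded roughly like this. The file name "Hotspots.plist" and the key names are just my guess at a format, nothing that exists today; it reuses the made-up `Hotspot` type from the sketch above:

```swift
import UIKit

// Hypothetical loader for a hand-maintained "Hotspots.plist" keyed by image name.
// The file name and the "rect"/"target" keys are placeholders, not part of the
// existing project.
func loadHotspots(forImageNamed name: String) -> [Hotspot] {
    guard let url = Bundle.main.url(forResource: "Hotspots", withExtension: "plist"),
          let data = try? Data(contentsOf: url),
          let plist = try? PropertyListSerialization.propertyList(from: data, format: nil),
          let root = plist as? [String: [[String: Any]]],
          let entries = root[name] else {
        return []   // images without an entry simply have no tappable dots
    }
    return entries.compactMap { entry in
        guard let rectString = entry["rect"] as? String,
              let target = entry["target"] as? Int else { return nil }
        // Rect strings use the "{{x, y}, {w, h}}" form that NSCoder.cgRect(for:)
        // parses, expressed in normalized 0...1 coordinates.
        return Hotspot(rect: NSCoder.cgRect(for: rectString), targetPage: target)
    }
}
```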