I have an OpenGL ES2-based app which runs smoothly at 30fps on an iPad 2, but on an iPad 1 it's a bit jerky. I want to modify my app to use a default frame rate of 20fps on the iPad 1, which I've already verified makes it feel much smoother on that model.
What's a good way to detect the iPad 1's lower performance? Should I just check for more than one CPU core (and how would I detect that), or maybe the processor speed or total system memory? I know it's bad practice to key off device model strings, so I'm avoiding that. I've also considered having my drawing code detect when it isn't keeping up with the frame rate and throttle itself back, but that has complications I'd rather avoid (i.e., falling back on an iPad 2 just because of a transient load spike, and then needing even more code to retry the higher frame rate in case that happens).
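For what it's worth, here's a minimal sketch of the core-count approach, assuming logical core count is an acceptable proxy (the iPad 1 is single-core, the iPad 2 dual-core). It uses POSIX `sysconf(_SC_NPROCESSORS_ONLN)`, which is available on iOS; `sysctlbyname("hw.ncpu", ...)` or `-[NSProcessInfo processorCount]` are Darwin-specific alternatives that should report the same value. The `preferred_frame_rate` helper is a hypothetical name for illustration:

```c
#include <unistd.h>

/* Number of logical CPU cores currently online.
 * Falls back to 1 if the query fails. */
long cpu_core_count(void)
{
    long n = sysconf(_SC_NPROCESSORS_ONLN);
    return (n > 0) ? n : 1;
}

/* Hypothetical policy: single-core devices (iPad 1 class)
 * get the lower default frame rate. */
int preferred_frame_rate(void)
{
    return (cpu_core_count() >= 2) ? 30 : 20;
}
```

If you drive rendering with `CADisplayLink`, 20fps would correspond to a `frameInterval` of 3 on a 60Hz display (and 30fps to 2), though whether core count is the most robust signal across future devices is an open question.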