If you don't want to set up OpenCV in your iOS project, my open source GPUImage framework includes two threshold filters for binarizing images: a simple luminance threshold and an adaptive one based on the local luminance around each pixel.
You can apply a simple threshold to an image and then extract a resulting binarized UIImage using code like the following:
UIImage *inputImage = [UIImage imageNamed:@"inputimage.png"];
GPUImageLuminanceThresholdFilter *thresholdFilter = [[GPUImageLuminanceThresholdFilter alloc] init];
thresholdFilter.threshold = 0.5;
UIImage *thresholdedImage = [thresholdFilter imageByFilteringImage:inputImage];
(Release the above filter afterward if you are not using ARC in your application.)
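For unevenly lit input, the adaptive threshold filter mentioned above can be swapped in the same way. A minimal sketch, assuming it exposes the same imageByFilteringImage: convenience method as the other filters (tuning properties are omitted here):

UIImage *inputImage = [UIImage imageNamed:@"inputimage.png"];
// Adaptive thresholding compares each pixel against the average luminance of its neighborhood
GPUImageAdaptiveThresholdFilter *adaptiveFilter = [[GPUImageAdaptiveThresholdFilter alloc] init];
UIImage *binarizedImage = [adaptiveFilter imageByFilteringImage:inputImage];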
If you instead want to display the result onscreen, you can send the thresholded output to a GPUImageView. Because these filters run entirely on the GPU, you can also use them to process live video.
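As a rough sketch of that live-video path (the session preset shown and the filterView outlet are illustrative assumptions; you would add the GPUImageView to your interface and wire it up in your own view controller):

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageLuminanceThresholdFilter *thresholdFilter = [[GPUImageLuminanceThresholdFilter alloc] init];
thresholdFilter.threshold = 0.5;
// filterView is a GPUImageView placed in your interface (assumed outlet)
[videoCamera addTarget:thresholdFilter];
[thresholdFilter addTarget:filterView];
[videoCamera startCameraCapture];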