Is it possible in Objective-C to take a screenshot of the screen and store that image in a UIImage?
-
possible duplicate of [How do I take a screen shot of a UIView?](http://stackoverflow.com/questions/2214957/how-do-i-take-a-screen-shot-of-a-uiview) – AndersK Aug 31 '10 at 15:35
-
possible duplicate of [How to take a screenshot programmatically](http://stackoverflow.com/questions/2200736/how-to-take-a-screenshot-programmatically) – Brad Larson Sep 01 '10 at 16:03
11 Answers
The previous code assumes that the view to be captured lives on the main screen... it might not.
Would this always capture the content of the main window? (Warning: written directly in Stack Overflow, not compiled.)
- (UIImage *)captureScreen {
    // Render the key window's layer into an image context the size of the window.
    UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
    CGRect rect = [keyWindow bounds];
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [keyWindow.layer renderInContext:context];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}

-
I animate graphics in the view, but this code ignores the animations (even those that are not removed after completion) – Tibidabo Jan 05 '12 at 05:30
-
That doesn't work: the status bar is not saved, and one of my views is an OpenGL view whose content isn't saved either. – Jean-Denis Muys May 30 '13 at 06:08
You need to create a bitmap context the size of your screen and use
[self.view.layer renderInContext:c]
to copy your view into it. Once this is done, you can use
CGBitmapContextCreateImage(c)
to create a CGImage from your context.
Elaboration:
CGSize screenSize = [[UIScreen mainScreen] applicationFrame].size;
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(nil, screenSize.width, screenSize.height, 8, 4 * (int)screenSize.width, colorSpaceRef, kCGImageAlphaPremultipliedLast);

// Flip the context so the layer renders right side up.
CGContextTranslateCTM(ctx, 0.0, screenSize.height);
CGContextScaleCTM(ctx, 1.0, -1.0);

[(CALayer *)self.view.layer renderInContext:ctx];

CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGContextRelease(ctx);

[UIImageJPEGRepresentation(image, 1.0) writeToFile:@"screen.jpg" atomically:NO];
Note that if you run this code in response to a tap on a UIButton, the captured image will show that button in its pressed state.
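A possible workaround, which is my own suggestion rather than part of this answer, is to defer the capture until the touch handling has finished, for example with dispatch_after; captureScreen below is just a placeholder for whichever capture routine you use:
// Defer the capture slightly so the button's highlighted state has cleared.
// `captureScreen` stands in for any of the capture methods on this page.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    UIImage *shot = [self captureScreen];
    // ... use the image here
});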

-
Just wanted to mention that you missed releasing the colorSpaceRef – Patrick Hernandez May 21 '12 at 20:07
-
Also, what is `self`? What about window transforms? Do you know what will happen when a keyboard or alert is shown? – Sulthan Jul 12 '13 at 11:34
See Technical Q&A QA1703, "Screen Capture in UIKit Applications":
http://developer.apple.com/iphone/library/qa/qa2010/qa1703.html
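A rough sketch of the approach that technical note describes, rendering every window of the application into a single image context (simplified; the note's actual code also accounts for each window's transform and anchor point):
- (UIImage *)screenshotOfAllWindows {
    CGSize screenSize = [UIScreen mainScreen].bounds.size;
    UIGraphicsBeginImageContextWithOptions(screenSize, NO, 0); // 0 = use the device's scale
    CGContextRef context = UIGraphicsGetCurrentContext();
    for (UIWindow *window in [UIApplication sharedApplication].windows) {
        CGContextSaveGState(context);
        // Offset the context so each window draws at its on-screen position.
        CGContextTranslateCTM(context, window.frame.origin.x, window.frame.origin.y);
        [window.layer renderInContext:context];
        CGContextRestoreGState(context);
    }
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}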

Try this:
- (UIImage *)captureView:(UIView *)view
{
    // Render the given view's layer into an image context the size of the screen.
    CGRect rect = [[UIScreen mainScreen] bounds];
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}

- (void)saveScreenshotToPhotosAlbum:(UIView *)view
{
    UIImageWriteToSavedPhotosAlbum([self captureView:self.view], nil, nil, nil);
}

UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImagePNGRepresentation(image);
[data writeToFile:@"foo.png" atomically:YES];
Update (April 2011): for Retina displays, change the first line to this:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
{
    UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
}
else
{
    UIGraphicsBeginImageContext(self.window.bounds.size);
}

// 100% working
- (UIImage *)screenshot
{
    // Note: the scale is hard-coded to 2.0 here; [UIScreen mainScreen].scale is more general.
    UIGraphicsBeginImageContextWithOptions(self.main_uiview.bounds.size, NO, 2.0f);
    [self.main_uiview drawViewHierarchyInRect:_main_uiview.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
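For example, the result could be persisted to the Documents directory; the file name here is just an illustration:
// Example usage: write the captured image to the app's Documents directory as a PNG.
UIImage *snapshot = [self screenshot];
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docDir stringByAppendingPathComponent:@"screenshot.png"];
[UIImagePNGRepresentation(snapshot) writeToFile:path atomically:YES];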

- (void)SnapShot {
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    } else {
        UIGraphicsBeginImageContext(self.view.bounds.size);
    }
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *data = UIImagePNGRepresentation(image);
    [data writeToFile:@"snapshot.png" options:NSDataWritingWithoutOverwriting error:nil];
    [data writeToFile:@"snapshot.png" atomically:YES];
    UIImageWriteToSavedPhotosAlbum([UIImage imageWithData:data], nil, nil, nil);
}

-
1. You have a double write to file. 2. Why do you need `[UIImage imageWithData:data]` if you already have the image object above? 3. Camel case. – Alex Nazarov Nov 21 '18 at 19:29
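A cleaned-up sketch reflecting those points might look like the following; writing into the Documents directory is my assumption, since the original snippet used a bare file name:
- (void)snapShot {
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    } else {
        UIGraphicsBeginImageContext(self.view.bounds.size);
    }
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Write once, to a real path inside the app sandbox.
    NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [docDir stringByAppendingPathComponent:@"snapshot.png"];
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];

    // Reuse the UIImage directly instead of round-tripping through NSData.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}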
Use this code to take the screenshot:
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    UIGraphicsBeginImageContext(self.webView.bounds.size);
    [self.webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenshotImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(screenshotImage, nil, nil, nil);

    NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *str1 = [NSString stringWithFormat:@"%d", myInt]; // myInt is assumed to be defined elsewhere (e.g. a page counter)
    NSString *pngFilePath = [NSString stringWithFormat:@"%@/page_%@.png", docDir, str1]; // any name you want for the image
    NSLog(@"%@", pngFilePath);
    NSData *data1 = [NSData dataWithData:UIImagePNGRepresentation(screenshotImage)];
    [data1 writeToFile:pngFilePath atomically:YES];
}

-
Why do you need `data1`? `UIImagePNGRepresentation` already returns NSData. – Alex Nazarov Nov 21 '18 at 19:32
A more modern way:
Obj-C
@interface UIView (Snapshot)
- (UIImage * _Nullable)snapshot;
@end

@implementation UIView (Snapshot)
- (UIImage * _Nullable)snapshot {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, UIScreen.mainScreen.scale);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
@end
Swift
extension UIView {
    func snapshot() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.main.scale)
        drawHierarchy(in: bounds, afterScreenUpdates: true)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
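Example usage of the Objective-C category (assuming the category header is imported where it is used):
// Capture any view and, for example, save it to the photo library.
UIImage *image = [self.view snapshot];
if (image) {
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}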

Yes, here's a link to the Apple Developer Forums https://devforums.apple.com/message/149553
