
I have searched a lot and tried every method I could find on Stack Overflow, and none of them has worked. Let's say I have a UIView containing a UIImageView whose UIImage is 1800x2300. That is obviously bigger than an iPhone 7/8 screen, so even using the screen scale won't render the view at the image's full size. And yes, I have tried https://stackoverflow.com/questions/4334233/how-to-capture-uiview-to-uiimage-without-loss-of-quality-on-retina-display too; I want a render bigger than my retina screen size, and it doesn't do that for me. These are the ways I tried; none of them renders anything bigger than the size of my imageRect * the scale of my screen:

    // Approach 1
    @autoreleasepool {
        // imageRect is the CGRect I defined for my UIImageView
        // to be aspect-fitted in.
        // _viewWithLoadedImages is the UIView containing two transparent
        // UIImageViews on top of each other, each containing images
        // bigger than the screen size of iPhones.

        UIGraphicsBeginImageContextWithOptions(imageRect.size, NO, 0);
        CGContextRef c = UIGraphicsGetCurrentContext();
        CGContextConcatCTM(c, CGAffineTransformMakeTranslation(-imageRect.origin.x, -imageRect.origin.y));
        [_viewWithLoadedImages.layer renderInContext:c];
        UIImage *renderedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return renderedImage;
    }

    // Approach 2
    @autoreleasepool {
        UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithBounds:imageRect];
        self.tempImage = [renderer imageWithActions:^(UIGraphicsImageRendererContext *_Nonnull context) {
            [_viewWithLoadedImages.layer renderInContext:context.CGContext];
        }];
        return self.tempImage;
    }

    // Approach 3
    UIGraphicsBeginImageContextWithOptions(imageRect.size, NO, 0.0f);
    [_viewWithLoadedImages drawViewHierarchyInRect:imageRect afterScreenUpdates:YES];
    UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshotImage;

They all produce an output whose size equals the size of the imageRect * the screen scale, but I want a bigger output image from my UIView (_viewWithLoadedImages). What should I do? I have tried almost everything; any help is much appreciated.

Reza.Ab

1 Answer


Assuming you have two UIImageViews and can use one of them to size the final image, you can do it like this:

// Desired dimensions in pixels
CGRect frame = CGRectMake(0.0, 0.0,
                          _imageView1.image.size.width * _imageView1.image.scale,
                          _imageView1.image.size.height * _imageView1.image.scale);

// Use scale 1.0 for pixel size
UIGraphicsBeginImageContextWithOptions(frame.size, NO, 1.0);

// Draw the images on top of each other
[_imageView1.image drawInRect:frame];
[_imageView2.image drawInRect:frame];

UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

return image;
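If you want a more generic version of the same idea, a sketch could look like the following. It is untested against your exact view hierarchy; the method name is illustrative, and it assumes the relevant subviews are direct UIImageView children of the parent view:

```objectivec
// Sketch: render every UIImageView inside a parent view at a given
// pixel size. Each subview's frame (in points) is scaled into the
// pixel-sized context before its image is drawn.
- (UIImage *)fullResolutionImageOfView:(UIView *)parentView
                             pixelSize:(CGSize)pixelSize
{
    CGFloat sx = pixelSize.width / parentView.bounds.size.width;
    CGFloat sy = pixelSize.height / parentView.bounds.size.height;

    // Scale 1.0 so the context is sized in pixels, not points.
    UIGraphicsBeginImageContextWithOptions(pixelSize, NO, 1.0);
    for (UIView *sub in parentView.subviews) {
        if (![sub isKindOfClass:[UIImageView class]]) continue;
        UIImageView *iv = (UIImageView *)sub;
        // Map the image view's point frame into pixel coordinates.
        CGRect f = CGRectMake(iv.frame.origin.x * sx,
                              iv.frame.origin.y * sy,
                              iv.frame.size.width * sx,
                              iv.frame.size.height * sy);
        [iv.image drawInRect:f];
    }
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```

Note that this does not account for content modes such as aspect fit; each image is drawn to fill its scaled frame.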
LGP
  • The images already exist in the UIView I want to render, so there shouldn't be any need for drawInRect..., right? But I tried it and it doesn't work. – Reza.Ab Jan 18 '18 at 13:10
  • What exactly does not work? I tried it myself, and it will render the images at full resolution. – LGP Jan 18 '18 at 13:37
  • I'm not rendering it in a drawRect method; I think that's why `[_imageView1.image drawInRect:frame]; [_imageView2.image drawInRect:frame];` isn't working here. – Reza.Ab Jan 18 '18 at 14:02
  • It does work. `UIGraphicsBeginImageContextWithOptions` creates a new current context, and `drawInRect` will draw in that context. I tested it from a button push event, not in a drawRect. – LGP Jan 18 '18 at 14:25
  • Is there any way to do it without drawing the images like that? I mean, imagine that for each UIView I need to re-draw all its elements at the bigger size. Is there a way to, say, render the UIView as it is but at a higher resolution? The image views already contain high-quality pictures. I'd truly appreciate it if it could be done that way. – Reza.Ab Jan 18 '18 at 16:13
  • The problem I have now: should I define another frame for every UIImageView that isn't full screen in my UIView? For example, if I have a small UIImageView in the lower-right corner that I want rendered too, what should I do with its frame? – Reza.Ab Jan 18 '18 at 17:17
  • Not sure I see the real problem. If I understand you correctly, you have various views that may contain composite images scaled down, and you want a full-resolution image representation of such a view to use somewhere else? If so, then this is a way to do it. For a more generic solution, you could put it in a method that takes the parent view as a parameter and loops through all the image views in it to create the resulting image. – LGP Jan 18 '18 at 19:51
  • I fixed it by manually calculating frames for each UIImageView in my UIView, relative to the size of the full-screen image view I had, so they were all drawn at their new specific frames. It worked, thank you so much. @LGP – Reza.Ab Jan 19 '18 at 16:57
  • What if I want to render _bgImageView.layer.mask the same way, using the same frame? I couldn't do `[_bgImageView.layer.mask drawInRect:frame];` since it doesn't have a drawInRect method. @LGP – Reza.Ab Jan 23 '18 at 17:22
  • You can always use `[_bgImageView.layer.mask drawInContext:context];`. Get the context by using `CGContextRef context = UIGraphicsGetCurrentContext();`. If that doesn't work, you can apply the mask to each of the sub-views before rendering them. – LGP Jan 23 '18 at 20:19
  • Right, but using drawInContext:context... I want them drawn on top of each other, then masked, and rendered at the full resolution of the image. I do it this way but it's low quality: `_bgImageView = [[UIImageView alloc] initWithImage:_fgImgWithEffect]; [_bgImageView setFrame:BGFrame]; _fgImageView = [[UIImageView alloc] initWithImage:_selectionCropAndBlurredImage]; [_fgImageView setFrame:FGFrame]; _bgImageView.layer.mask = _fgImageView.layer;` and to render, I do this, which won't give a bigger resolution than the device scale: @LGP – Reza.Ab Jan 24 '18 at 02:02
  • I'm trying the same approach you suggested above, using `drawInRect:frame` for the UIImages, but for this masking situation (`_bgImageView.layer.mask = _fgImageView.layer`) at a big resolution. I'd really appreciate it. @LGP – Reza.Ab Jan 24 '18 at 02:05
  • You need to apply the mask to your context. I think this blend mode should do it. Don't forget to change it back if you do any drawing after that. `CGContextSetBlendMode(context, kCGBlendModeDestinationIn);` `[_bgImageView.layer.mask drawInContext:context];` – LGP Jan 24 '18 at 09:01
  • One quick question: image.size.width actually returns the pixel size of the image, so why would we ever need to multiply it by the scale? I think that's what we had to do in the early days, right? @LGP – Reza.Ab Feb 19 '18 at 15:50
  • Well, not quite. The documentation for `UIImage:size` says `This value reflects the logical size of the image and takes the image’s current orientation into account. Multiply the size values by the value in the scale property to get the pixel dimensions of the image.` Also, as you see, I set the last parameter (scale) of `UIGraphicsBeginImageContextWithOptions` to 1.0 to get a non-scaled bitmap. – LGP Feb 19 '18 at 16:08
  • You are right, but I have an image whose size I already checked; it's 4300x2800, and `NSLog(@"image.size %@", NSStringFromCGSize(image.size));` shows 4300x2800! How is that in points? And I'm using an iPhone 6, so it should show half that size, right? But it doesn't. – Reza.Ab Feb 19 '18 at 16:10
  • If the scale of the image is 1.0 it works out. Note that the `scale` of a `UIImage` is independent of the device scale. Although your iPhone 6 has a retina display with a screen scale of 2.0, you can have images of other scales. Print out `image.scale` also and see what it says. – LGP Feb 19 '18 at 16:15
  • Oh my god, thank you! So that's why some images show incorrect sizes in NSLog! So I have to check the scale of the image: if it is 1, I don't need to multiply by the device scale in NSLog, and if it's not, I do have to multiply? Is there any library you know of that handles this? Because I have so much rendering going on based on image sizes. – Reza.Ab Feb 19 '18 at 16:18
  • I'm going to create a new question based on this issue. – Reza.Ab Feb 19 '18 at 16:24
  • In fear of giving bad advice, I'd say it depends on what you are doing. If the `UIImage` is correctly constructed, you will get the *pixel dimensions* by multiplying its size by its scale. This is _independent_ of your device scale. Likewise, as long as the `UIImage` is correct, UIKit will take this into account when doing its things. – LGP Feb 19 '18 at 16:25
  • I see. I'm going to ask it comprehensively in my new question, since there are three issues: 1. Showing the true (pixel) size of an image, considering each image's scale. 2. After rendering, making sure its scale hasn't changed, by using `[UIImage imageWithCGImage:renderedimage.CGImage scale:scale orientation:renderedimage.imageOrientation];`. 3. Loading it at its correct scale so it won't show multiple times bigger with its small resolution, and vice versa. – Reza.Ab Feb 19 '18 at 16:29
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/165433/discussion-between-reza-ab-and-lgp). – Reza.Ab Feb 19 '18 at 16:59
  • Still work hours here, give me some time to get back! – LGP Feb 19 '18 at 17:23
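
For later readers, the masking technique discussed in the comments can be pulled together into one snippet. This is a sketch under the assumptions in the thread (`_bgImageView` holds the image and already has its `layer.mask` set, a `pixelSize` variable names the desired pixel dimensions, and the mask layer's geometry matches the pixel-sized context):

```objectivec
// Sketch based on the comments above: draw the background image at full
// pixel size, then apply the mask layer with the destination-in blend
// mode so only the masked area survives, mimicking layer.mask.
UIGraphicsBeginImageContextWithOptions(pixelSize, NO, 1.0);
CGContextRef context = UIGraphicsGetCurrentContext();

[_bgImageView.image drawInRect:CGRectMake(0, 0, pixelSize.width, pixelSize.height)];

// kCGBlendModeDestinationIn keeps destination pixels only where the
// source (the mask) is opaque.
CGContextSetBlendMode(context, kCGBlendModeDestinationIn);
[_bgImageView.layer.mask renderInContext:context];

// Restore normal blending before any further drawing.
CGContextSetBlendMode(context, kCGBlendModeNormal);

UIImage *masked = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```

If the mask layer was laid out in points, you would first scale the context (or the layer's frame) the same way the image view frames are scaled in the answer above.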