How do I get the pixel data of a UIImage after scaling and moving it? I want to get the CGImage pixel data, and then create a UIImage back from that pixel data. How can I do this?

user1362790

2 Answers


You can get the raw pixel data by calling:

CFDataRef rawData = CGDataProviderCopyData(CGImageGetDataProvider(aCGImageRef));

You can then step through the pixels like this (this assumes 4 bytes per pixel with no row padding; the actual byte order and row stride depend on the image's CGBitmapInfo and CGImageGetBytesPerRow):

UInt8 *buf = (UInt8 *)CFDataGetBytePtr(rawData);
CFIndex length = CFDataGetLength(rawData);

for (CFIndex i = 0; i < length; i += 4)
{
    int r = buf[i];
    int g = buf[i + 1];
    int b = buf[i + 2];
}
CFRelease(rawData);
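One of the comments below asks how to turn this buffer back into a UIImage. A minimal sketch follows; it is not part of the original answer, and it assumes the bytes are 8-bit RGBA with premultiplied alpha and no row padding. `buf`, `width`, and `height` are hypothetical names standing in for your own data:

```objc
// Wrap the raw bytes in a bitmap context, then snapshot it as a CGImage.
// Assumes buf holds width * height * 4 bytes of premultiplied RGBA data.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(buf, width, height,
                                         8,          // bits per component
                                         width * 4,  // bytes per row
                                         colorSpace,
                                         kCGImageAlphaPremultipliedLast);
CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);
```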
Jonas Schnelli
  • How do I get a UIImage back from this buffer? – user1362790 May 02 '12 at 11:17
  • I have code like this: CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(collisionImage.CGImage)); const UInt32 *pixels = (const UInt32 *)CFDataGetBytePtr(imageData); How can I use pixels to create an image again? – user1362790 May 02 '12 at 11:24
  • I want to explain my question; if anybody understands it, please reply. I have m_cAppDelegatePtr.myFinalImageData = (NSData *)CGDataProviderCopyData(CGImageGetDataProvider(image)); UInt8 *buf = (UInt8 *)CFDataGetBytePtr(data); int length = CFDataGetLength(data); and then I use imageEditPtr = [[UIImage alloc] initWithData:m_cAppDelegatePtr.myFinalImageData]; but I am not getting any image into imageEditPtr. – user1362790 May 02 '12 at 11:48

Try this. It will give you the offset, alpha, and RGB color of the selected pixel of an image.

Source code:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];

    if ([touch view] == YourImageView)
    {
        CGPoint location = [touch locationInView:YourImageView];
        [self getPixelColorAtLocation:location];
        // alpha is an instance variable set by getPixelColorAtLocation:
        if ((alpha == 255) || (alpha == 0))
        {
            // perform your task
        }
    }
}


- (UIColor *)getPixelColorAtLocation:(CGPoint)point
{
    UIColor *color = nil;
    CGImageRef inImage = YourImageView.image.CGImage;

    // Create an off-screen bitmap context to draw the image into.
    // Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0, 0}, {w, h}};

    // Draw the image into the bitmap context. Once we draw, the memory
    // allocated for the context will contain the raw image data in the
    // specified color space.
    CGContextDrawImage(cgctx, rect, inImage);

    // Now we can get a pointer to the image data associated with the
    // bitmap context.
    unsigned char *data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // offset locates the pixel in the data from x,y.
        // 4 for 4 bytes of data per pixel; w is the width of one row.
        int offset = 4 * ((w * round(point.y)) + round(point.x));
        alpha = data[offset];          // alpha is an instance variable
        int red = data[offset + 1];
        int green = data[offset + 2];
        int blue = data[offset + 3];
        NSLog(@"offset: %i colors: RGB A %i %i %i %i", offset, red, green, blue, alpha);
        color = [UIColor colorWithRed:(red / 255.0f)
                                green:(green / 255.0f)
                                 blue:(blue / 255.0f)
                                alpha:(alpha / 255.0f)];
    }

    // When finished, release the context, then free the image data
    // memory that was malloc'd for it.
    CGContextRelease(cgctx);
    if (data) { free(data); }

    return color;
}

- (CGContextRef)createARGBBitmapContextFromImage:(CGImageRef)inImage
{
    CGContextRef context = NULL;

    // Get the image width and height. We'll use the entire image.
    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);

    // Declare the number of bytes per row. Each pixel in the bitmap is
    // represented by 4 bytes: 8 bits each of alpha, red, green, and blue.
    size_t bitmapBytesPerRow = pixelsWide * 4;
    size_t bitmapByteCount   = bitmapBytesPerRow * pixelsHigh;

    // Use the generic RGB color space.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
    {
        fprintf(stderr, "Error allocating color space\n");
        return NULL;
    }

    // Allocate memory for the image data. This is the destination in
    // memory where any drawing into the bitmap context will be rendered.
    void *bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL)
    {
        fprintf(stderr, "Memory not allocated!\n");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }

    // Create the bitmap context. We want premultiplied ARGB, 8 bits per
    // component. Regardless of the source image format (CMYK, grayscale,
    // and so on), it will be converted to the format specified here by
    // CGBitmapContextCreate.
    context = CGBitmapContextCreate(bitmapData,
                                    pixelsWide,
                                    pixelsHigh,
                                    8,      // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
    {
        free(bitmapData);
        fprintf(stderr, "Context not created!\n");
    }

    // Make sure to release the color space before returning.
    CGColorSpaceRelease(colorSpace);

    return context;
}

Hope it helps. Thanks :)

Nikhil Bansal
  • Actually I want all the pixels of the image, and I need to save the image into NSData or a CFDataRef; then I will get the image back from that data in another controller. If you know how, please help me. – user1362790 May 03 '12 at 10:05
  • You can simply get the image from NSData like this: NSData *imageData = UIImagePNGRepresentation(yourUIImage); uiImage = [UIImage imageWithData:imageData]; – Nikhil Bansal May 03 '12 at 10:14
  • But we cannot save the scaled image into NSData or get the scaled image back from NSData; even when I store it, the original data comes back, and I want the scaled data. – user1362790 May 03 '12 at 10:17
  • Why not? We can save a scaled image into NSData! – Nikhil Bansal May 03 '12 at 10:28
  • I tried it but did not get it. Can you explain how to get the data and store it for later reference? – user1362790 May 03 '12 at 10:58
  • First tell me: are you getting your scaled image? If yes, then just do this: NSData *scaledImageData = UIImagePNGRepresentation(scaledUIImage); uiImage = [UIImage imageWithData:scaledImageData]; Try it! – Nikhil Bansal May 03 '12 at 11:42
  • I am not getting the scaled image. How do I get the scaled image? – user1362790 May 03 '12 at 12:01
  • I am scaling using a transformation, but how do I get the scaled image itself? – user1362790 May 03 '12 at 12:01
  • Look, you are scaling an image, and then you must be saving that scaled image somewhere so you can use it elsewhere in your project. Get the scaled image from that path. Do you get my point? – Nikhil Bansal May 03 '12 at 12:07
  • Actually I am using the screen-capture method provided by Apple to capture the scaled image; I did not save the image data. If I had saved the image, it would be easy to get it back from the data. – user1362790 May 03 '12 at 12:13
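The last several comments circle around how to capture the scaled (transformed) image in the first place. One common approach, sketched here as an assumption rather than taken from either answer, is to render the view's layer into an off-screen image context and snapshot it; `yourImageView` is a hypothetical view carrying the transform:

```objc
// Render the transformed view into an off-screen context and snapshot it.
UIGraphicsBeginImageContextWithOptions(yourImageView.bounds.size, NO, 0.0);
[yourImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// The snapshot can now be persisted and restored like any other image.
NSData *scaledImageData = UIImagePNGRepresentation(scaledImage);
```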