
How can I achieve the output in iOS using Objective-C .. Image-1 + Image-2 = Image-3 ?

Image-1

Image-2

Image-3


emraz
  • Try this http://stackoverflow.com/questions/37495730/how-to-blend-two-images – Codobux May 15 '17 at 13:08
  • http://stackoverflow.com/questions/18273271/merge-two-image-on-to-one-image-programmatically-in-iphone?rq=1 check this , but it combined 2 images vertically , You can update rect. That might help you ! – MOHAMMAD ISHAQ May 15 '17 at 13:08
  • Well, it seems someone downvoted the question. Can you please explain the reason for the downvote? I'm really interested to know. – emraz May 15 '17 at 16:45

1 Answer

//: Playground - noun: a place where people can play

import UIKit
import CoreImage

func aspectFill(from: CGRect, to: CGRect) -> CGAffineTransform {
    let horizontalRatio = to.width / from.width
    let verticalRatio = to.height / from.height
    let scale = max(horizontalRatio, verticalRatio)
    let translationX = horizontalRatio < verticalRatio ? (to.width - from.width * scale) * 0.5 : 0
    let translationY = horizontalRatio > verticalRatio ? (to.height - from.height * scale) * 0.5 : 0
    return CGAffineTransform(scaleX: scale, y: scale).translatedBy(x: translationX, y: translationY)
}

func filter(image: UIImage, texture: UIImage) -> UIImage? {
    guard let imageCI = CIImage(image: image),
        let textureCI = CIImage(image: texture)
        else {
            return nil
    }

    let scaleFillTextureCI = textureCI.applying(aspectFill(from: textureCI.extent, to: imageCI.extent))
    let crop = CIFilter(name: "CICrop")!
    crop.setValue(scaleFillTextureCI, forKey: "inputImage")
    // CICrop expects its rectangle as a CIVector, not a CGRect
    crop.setValue(CIVector(cgRect: imageCI.extent), forKey: "inputRectangle")

    let alpha = CIFilter(name: "CIConstantColorGenerator")!
    alpha.setValue(CIColor.init(red: 0, green: 0, blue: 0, alpha: 0.7), forKey: "inputColor")

    let mix = CIFilter(name: "CIBlendWithAlphaMask")!
    mix.setValue(imageCI, forKey: "inputImage")
    mix.setValue(crop.outputImage, forKey: "inputBackgroundImage")
    mix.setValue(alpha.outputImage, forKey: "inputMaskImage")

    let blend = CIFilter(name: "CIBlendWithMask")!
    blend.setValue(imageCI, forKey: "inputImage")
    blend.setValue(mix.outputImage, forKey: "inputBackgroundImage")
    blend.setValue(imageCI, forKey: "inputMaskImage")

    let context = CIContext(options: nil)
    guard let ciImage = blend.outputImage,
        let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
    }

    return UIImage(cgImage: cgImage)
}

let image = #imageLiteral(resourceName: "image.jpg")
let texture = #imageLiteral(resourceName: "texture.jpg")
let output = filter(image: image, texture: texture)

I have a solution in Swift since I am not familiar with Objective-C syntax, but I think you can translate it to Objective-C easily. You can achieve the effect using Core Image. Here is the result from an Xcode Playground: Screenshot
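Since the question asks for Objective-C, here is an untested sketch of a direct translation of the Swift pipeline above. It uses the same Core Image filter names (`CICrop`, `CIConstantColorGenerator`, `CIBlendWithAlphaMask`, `CIBlendWithMask`); the function names `AspectFillTransform` and `FilterImage` are my own, and the whole thing is an assumption-laden translation rather than verified code:

    // Sketch: Objective-C translation of the Swift answer above (untested).
    #import <UIKit/UIKit.h>
    #import <CoreImage/CoreImage.h>

    // Scale-to-fill transform mapping rect `from` onto rect `to`.
    static CGAffineTransform AspectFillTransform(CGRect from, CGRect to) {
        CGFloat hRatio = to.size.width / from.size.width;
        CGFloat vRatio = to.size.height / from.size.height;
        CGFloat scale = MAX(hRatio, vRatio);
        CGFloat tx = hRatio < vRatio ? (to.size.width - from.size.width * scale) * 0.5 : 0;
        CGFloat ty = hRatio > vRatio ? (to.size.height - from.size.height * scale) * 0.5 : 0;
        // Matches Swift's CGAffineTransform(scaleX:y:).translatedBy(x:y:)
        return CGAffineTransformTranslate(CGAffineTransformMakeScale(scale, scale), tx, ty);
    }

    static UIImage *FilterImage(UIImage *image, UIImage *texture) {
        CIImage *imageCI = [[CIImage alloc] initWithImage:image];
        CIImage *textureCI = [[CIImage alloc] initWithImage:texture];
        if (!imageCI || !textureCI) return nil;

        // Scale the texture to cover the base image, then crop to its extent.
        CIImage *scaled = [textureCI imageByApplyingTransform:
                           AspectFillTransform(textureCI.extent, imageCI.extent)];
        CIFilter *crop = [CIFilter filterWithName:@"CICrop"];
        [crop setValue:scaled forKey:@"inputImage"];
        [crop setValue:[CIVector vectorWithCGRect:imageCI.extent]
                forKey:@"inputRectangle"];

        // Constant 70%-alpha mask used to mix the two layers.
        CIFilter *alpha = [CIFilter filterWithName:@"CIConstantColorGenerator"];
        [alpha setValue:[CIColor colorWithRed:0 green:0 blue:0 alpha:0.7]
                 forKey:@"inputColor"];

        CIFilter *mix = [CIFilter filterWithName:@"CIBlendWithAlphaMask"];
        [mix setValue:imageCI forKey:@"inputImage"];
        [mix setValue:crop.outputImage forKey:@"inputBackgroundImage"];
        [mix setValue:alpha.outputImage forKey:@"inputMaskImage"];

        CIFilter *blend = [CIFilter filterWithName:@"CIBlendWithMask"];
        [blend setValue:imageCI forKey:@"inputImage"];
        [blend setValue:mix.outputImage forKey:@"inputBackgroundImage"];
        [blend setValue:imageCI forKey:@"inputMaskImage"];

        CIImage *output = blend.outputImage;
        if (!output) return nil;
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
        if (!cgImage) return nil;
        UIImage *result = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        return result;
    }

Note the `CGImageRelease` call: `createCGImage:fromRect:` returns a Core Graphics object you own, so it must be released even under ARC.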

Morty Choi