
I am trying to place the Pikachu image on the hoarding board using a warpPerspective transformation. The output doesn't have smooth edges; instead, it has dotted points along them.

import cv2
import numpy as np

# base (hoarding board) image
image = cv2.imread("base_img.jpg")
h_base, w_base = image.shape[0], image.shape[1]

# white rectangle used to build the mask
white_subject = np.ones((480, 640, 3), dtype="uint8") * 255
h_white, w_white = white_subject.shape[:2]

# overlay (Pikachu) image
subject = cv2.imread('subject.jpg')
h_sub, w_sub = subject.shape[:2]

# destination corners on the hoarding board
pts2 = np.float32([[109,186],[455,67],[480,248],[90,349]])
pts3 = np.float32([[0, 0], [w_white, 0], [w_white, h_white], [0, h_white]])

# warp the white rectangle to get a mask, then black out that region
transformation_matrix_white = cv2.getPerspectiveTransform(pts3, pts2)
mask = cv2.warpPerspective(white_subject, transformation_matrix_white, (w_base, h_base))
image[mask == 255] = 0

# warp the subject into the same region
pts3 = np.float32([[0, 0], [w_sub, 0], [w_sub, h_sub], [0, h_sub]])
transformation_matrix = cv2.getPerspectiveTransform(pts3, pts2)
warped_image = cv2.warpPerspective(subject, transformation_matrix, (w_base, h_base))

# overlay: the warped subject fills the blacked-out region
result = cv2.add(image, warped_image)

Hoarding board image

Pikachu image

Output image

Pattern image

Output image (pattern)

Please help me get an output without the dotted points at the edges.

Monk247uk
  • I don't get how you combine the warped subject with the background. Are you just copying with the mask? – Micka Dec 14 '21 at 17:10
  • I am using the transformation_matrix_white to create the mask image. With the help of the mask image I have overlaid the Pikachu image on the hoarding board. – Monk247uk Dec 14 '21 at 18:46
  • You need to anti-alias your mask by blurring the edges. See for example how I do that at https://stackoverflow.com/questions/63001988/how-to-remove-background-of-images-in-python/63003020#63003020 or https://stackoverflow.com/questions/64208431/how-to-remove-visible-background-boundary-around-object-after-saliency-detection/64216970#64216970 using Gaussianblur and skimage.exposure.rescale_intensity() – fmw42 Dec 14 '21 at 18:46
  • What I mean by smooth edges is that the output should have edges like the original image's edges. – Monk247uk Dec 14 '21 at 18:49
  • You are seeing the dotted black pixels because your background color from the warpPerspective defaults to 0. After your warpPerspective, change all black pixels to the same yellow background color of the Pikachu image – fmw42 Dec 14 '21 at 18:50
  • I am saying to blur the edges of the mask and use that to composite the warped Pikachu image over your background. Also change any black in the warped Pikachu image to the same yellow background color before masking. – fmw42 Dec 14 '21 at 18:51
  • Changing the black to yellow may work, but what if the subject image has multiple colors? Can we get a more generalized solution? – Monk247uk Dec 14 '21 at 18:58
  • That is why one blurs the mask to anti-alias the outline when compositing – fmw42 Dec 14 '21 at 19:47
  • Use flags=cv2.INTER_NEAREST for both the mask and subject warping so you won't have pixels blended between subject and black. This will remove the wrong colors. You won't get a smooth edge then, though. For a smooth edge you will have to use anti-aliasing and interpolation between your subject and the background – Micka Dec 14 '21 at 20:16
  • 1
    If you do not have a constant color, then use cv.BORDER_REFLECT for the borderConstant in warpPerspective. That may help alleviate the black dotted outline. – fmw42 Dec 14 '21 at 21:38
  • I used both cv2.BORDER_TRANSPARENT and cv2.BORDER_REFLECT, and the border still has a black dotted outline for the second pattern. – Monk247uk Jan 11 '22 at 04:37

1 Answer


Here is one way to do the anti-aliased composite in Python/OpenCV. Note that I use the overlay image's background color as the borderValue constant in warpPerspective, since it is a constant color. I also blur the mask before doing the composite.

Background Image:


Overlay Image:



import cv2
import numpy as np
import skimage.exposure

image = cv2.imread("base_img.jpg")
h_base, w_base = image.shape[0], image.shape[1]

white_subject =  np.ones((480,640,3),dtype="uint8")*255
h_white, w_white = white_subject.shape[:2]

subject = cv2.imread('subject.jpg')
h_sub, w_sub = subject.shape[:2]

# get background color from the top-left pixel and its BGR components
yellow = subject[0, 0]
blue, green, red = yellow[0], yellow[1], yellow[2]
print(yellow)
print(blue, green, red)

pts2 = np.float32([[109,186],[455,67],[480,248],[90,349]])
pts3 = np.float32([[0, 0], [w_white, 0], [w_white, h_white], [0, h_white]])

transformation_matrix_white = cv2.getPerspectiveTransform(pts3, pts2)
mask = cv2.warpPerspective(white_subject, transformation_matrix_white, (w_base, h_base)) 

pts3 = np.float32([[0, 0], [w_sub, 0], [w_sub, h_sub], [0, h_sub]])
transformation_matrix = cv2.getPerspectiveTransform(pts3, pts2)
# do warping with borderVal = background color
warped_image = cv2.warpPerspective(subject, transformation_matrix, (w_base, h_base), borderMode = cv2.BORDER_CONSTANT, borderValue=(int(blue),int(green),int(red))) 

# anti-alias mask
mask = cv2.GaussianBlur(mask, (0,0), sigmaX=2, sigmaY=2, borderType = cv2.BORDER_DEFAULT)
mask = skimage.exposure.rescale_intensity(mask, in_range=(0,128), out_range=(0,255))

# convert mask to float in range 0 to 1
mask = mask.astype(np.float64)/255

# composite warped image over base and convert back to uint8
result =  (warped_image * mask + image * (1 - mask))
result = result.clip(0,255).astype(np.uint8)

# save results
cv2.imwrite('warped_mask.png',(255*mask).clip(0,255).astype(np.uint8))
cv2.imwrite('warped_image.png',warped_image)
cv2.imwrite('warped_image_over_background.png',result)

cv2.imshow("mask", mask)
cv2.imshow("warped_image", warped_image)
cv2.imshow("result", result)
cv2.waitKey(0)

Anti-aliased Warped Mask:


Warped Image:


Resulting Composite:


fmw42