
I am trying to create a PiP (picture-in-picture) effect camera in iOS using Swift 4. For that I have taken the following controls: View, FrontCameraView (ImageView), BackCameraView (ImageView), MaskedCameraView (ImageView)

FrontCameraView takes the blurred image, the frame image view takes the PiP frames, and MaskedCameraView takes the masked images.

Now, when I capture the image:

This is the code that applies the live effects on the custom camera:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        connection.videoOrientation = orientation
        // Note: the AVCaptureVideoDataOutput and its sample-buffer delegate are
        // configured once during session setup; re-creating them here on every
        // frame was a bug and has been removed.

        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let cameraImage = CIImage(cvImageBuffer: pixelBuffer)
        guard let cgImage = context.createCGImage(cameraImage, from: cameraImage.extent) else { return }
        globalImage = UIImage(cgImage: cgImage)
        let croppedImage = HelperClass.shared().imageByScalingAndCropping(forSize: globalImage, CGSize(width: 200, height: 200))
        DispatchQueue.main.async {
            self.backCameraView?.image = HelperClass.shared().blur(withCoreImage: croppedImage, andView: self.view)
            self.frontCameraView?.image = self.frameArray[self.tagValue]
            let maskedImage = self.maskImage(image: croppedImage!, mask: self.maskedArray[self.tagValue])
            let maskedCroppedImage = HelperClass.shared().imageByScalingAndCropping(forSize: maskedImage, CGSize(width: 200, height: 200))
            self.maskedCameraView.image = maskedCroppedImage
        }
    }
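For context, the data output and its sample-buffer delegate belong in one-time session setup rather than in the per-frame callback. A minimal sketch of what that setup might look like, assuming the `captureSession` and delegate conformance from the code above (the queue label and method name are illustrative):

```swift
// Hypothetical one-time setup; configureSession() and the queue label
// are illustrative names, not from the original code.
func configureSession() {
    let videoOutput = AVCaptureVideoDataOutput()
    // Deliver frames on a background queue; UI updates are dispatched
    // back to the main queue inside captureOutput.
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
    if captureSession.canAddOutput(videoOutput) {
        captureSession.addOutput(videoOutput)
    }
    captureSession.startRunning()
}
```

Using a dedicated serial queue instead of `DispatchQueue.main` keeps frame processing from blocking the UI.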

// Code for capturing the image
@IBAction func takePhoto(_ sender: UIButton) {

        if captureSession.canAddOutput(videoOutput) {
            captureSession.addOutput(videoOutput)
        }
        // The still-image connection has to come from photoOutput; asking
        // videoOutput for it leaves photoOutput with no connection to capture from.
        if let videoConnection = photoOutput?.connection(with: .video) {
            photoOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) in
                guard let sampleBuffer = sampleBuffer,
                      let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer),
                      let dataProvider = CGDataProvider(data: imageData as CFData),
                      let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .relativeColorimetric) else { return }
                let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)

                let sourceImage = HelperClass.shared().imageByScalingAndCropping(forSize: image, CGSize(width: 200, height: 200))

                let storyBoard = UIStoryboard(name: "Main", bundle: nil)
                let framePreviewVC = storyBoard.instantiateViewController(withIdentifier: "FramePreviewController") as! FramePreviewController
                framePreviewVC.frameImage = sourceImage!
                self.navigationController?.pushViewController(framePreviewVC, animated: true)
            })
        }
    }

Issue: I am able to implement the PiP effect using this code, but when I try to capture the image using videoOutput (AVCaptureVideoDataOutput), it does not let me capture the image. And if I do the same thing using photoOutput (AVCaptureStillImageOutput), it does not let me apply the PiP effect on the live camera. Kindly help me resolve this issue; I have been stuck on this point for a week. Any guidance in this direction would be highly appreciated. Thanks in advance!
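For reference, one common workaround for this exact conflict is to keep only the AVCaptureVideoDataOutput for the live composited preview and, when the shutter is tapped, snapshot the most recent composited frame instead of asking a second still-image output for a capture. A minimal sketch, assuming the session, helper names, and view controller from the question; `latestPipImage` is a hypothetical property, not part of the original code:

```swift
// Sketch only: cache the newest composited frame from the
// AVCaptureVideoDataOutput callback, then reuse it on tap.
// `latestPipImage` is a hypothetical property introduced here.

var latestPipImage: UIImage?   // updated once per delivered frame

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let cameraImage = CIImage(cvImageBuffer: pixelBuffer)
    guard let cgImage = context.createCGImage(cameraImage, from: cameraImage.extent) else { return }
    let frame = UIImage(cgImage: cgImage)
    // ... apply the blur/mask compositing exactly as in the question ...
    latestPipImage = frame   // keep the newest frame around for the shutter
}

@IBAction func takePhoto(_ sender: UIButton) {
    // No second capture output needed: the "photo" is just the last live frame.
    guard let photo = latestPipImage else { return }
    let storyBoard = UIStoryboard(name: "Main", bundle: nil)
    let framePreviewVC = storyBoard
        .instantiateViewController(withIdentifier: "FramePreviewController") as! FramePreviewController
    framePreviewVC.frameImage = photo
    navigationController?.pushViewController(framePreviewVC, animated: true)
}
```

The trade-off is that the captured image has the preview's resolution rather than a full still-image capture. If full resolution is required, both outputs can generally be added to the same AVCaptureSession once, during setup, so the data output keeps feeding the live effect while the still-image output handles capture.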

  • The question is not very clear. Are you trying to use the front camera and the back camera at the same time? because you cannot do that – Scriptable Apr 30 '18 at 13:11
  • I am just working on the back camera. Maybe the names of the image views are confusing you; I have mentioned the details regarding each name before use. – Pratz_iOS Apr 30 '18 at 13:33
