
This is in relation to this post: Save AcquireCameraImageBytes() from Unity ARCore to storage as an image.

I tried the steps mentioned by @JordanRobinson, and I am having a similar issue of seeing just a gray square. I keep re-reading his update, but I am not clear how step 2 (creating a texture reader) ties into step 3. I added an Update function that calls Frame.CameraImage.AcquireCameraImageBytes, but I think I am missing something.

I feel I am close, since it is saving an image (just a gray, empty one :-). Any help you can offer will be greatly appreciated.

Here is my code:

private Texture2D m_TextureRender;
private TextureReader m_CachedTextureReader;
private byte[] m_EdgeDetectionResultImage;



void Start()
{
    m_CachedTextureReader = GetComponent<TextureReader>();
    m_CachedTextureReader.OnImageAvailableCallback += OnImageAvailable;
    QuitOnConnectionErrors();
}

void Update()
{
    Screen.sleepTimeout = SleepTimeout.NeverSleep;

    using (var image = Frame.CameraImage.AcquireCameraImageBytes())
    {
        if (!image.IsAvailable)
        {
            return;
        }

        // Hand the raw frame to the same callback the TextureReader would normally invoke.
        OnImageAvailable(TextureReaderApi.ImageFormatType.ImageFormatColor,
            image.Width, image.Height, image.Y, 0);
    }
}



private void OnImageAvailable(TextureReaderApi.ImageFormatType format, int width, int height, System.IntPtr pixelBuffer, int bufferSize)
{

    if (format != TextureReaderApi.ImageFormatType.ImageFormatColor)
    {
        Debug.Log("No edge detected due to incorrect image format.");
        return;
    }

    // (Re)create the texture and byte buffer whenever the incoming frame size changes.
    if (m_TextureRender == null || m_EdgeDetectionResultImage == null || m_TextureRender.width != width || m_TextureRender.height != height)
    {
        m_TextureRender = new Texture2D(width, height, TextureFormat.RGBA32, false, false);
        m_EdgeDetectionResultImage = new byte[width * height * 4];
        m_TextureRender.width = width;
        m_TextureRender.height = height;
    }

    System.Runtime.InteropServices.Marshal.Copy(pixelBuffer, m_EdgeDetectionResultImage, 0, bufferSize);

    // Update the rendering texture with the sampled image.
    m_TextureRender.LoadRawTextureData(m_EdgeDetectionResultImage);
    m_TextureRender.Apply();

    var encodedJpg = m_TextureRender.EncodeToJPG();
    var path = Application.persistentDataPath;

    File.WriteAllBytes(path + "/test2.jpg", encodedJpg);
}
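
In case it clarifies what I am aiming for, below is a stripped-down sketch of how I understand saving just the Y (luminance) plane would work. This is only my reading of the API, assuming the Y plane is tightly packed (YRowStride equal to Width); the class and method names are placeholders I made up.

using System.IO;
using System.Runtime.InteropServices;
using GoogleARCore;
using UnityEngine;

public class YPlaneSaver : MonoBehaviour
{
    private Texture2D m_GrayTexture;
    private byte[] m_YBuffer;

    // Grab the current camera frame and write only its Y (luminance) plane out as a PNG.
    private void SaveYPlaneAsGrayscale()
    {
        using (var image = Frame.CameraImage.AcquireCameraImageBytes())
        {
            if (!image.IsAvailable)
            {
                return;
            }

            // One byte per pixel in the Y plane (assumes no row padding).
            int bufferSize = image.Width * image.Height;

            if (m_GrayTexture == null || m_GrayTexture.width != image.Width || m_GrayTexture.height != image.Height)
            {
                m_GrayTexture = new Texture2D(image.Width, image.Height, TextureFormat.R8, false, false);
                m_YBuffer = new byte[bufferSize];
            }

            // Copy the native Y plane into managed memory and upload it to the texture.
            Marshal.Copy(image.Y, m_YBuffer, 0, bufferSize);
            m_GrayTexture.LoadRawTextureData(m_YBuffer);
            m_GrayTexture.Apply();

            File.WriteAllBytes(Path.Combine(Application.persistentDataPath, "gray.png"),
                m_GrayTexture.EncodeToPNG());
        }
    }
}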
  • If you want to save the camera viewport to an image, there is a very simple way in Unity 2017 & 2018: just use ScreenCapture.CaptureScreenshot (see the short sketch after these comments). https://docs.unity3d.com/ScriptReference/ScreenCapture.CaptureScreenshot.html – Magrones Apr 05 '18 at 21:26
  • Have you tried the answer of the post you quoted? https://stackoverflow.com/a/49580338/2506883 – Magrones Apr 05 '18 at 21:35
  • @Fenixrw That was the post i was referring to in the above ask. I think I might have got it going but need to test it to make sure. If so i will post the source. – Subha Andy Apr 06 '18 at 13:52
  • @Subha_Andy I know it is the same post, but you linked the Question and I linked the Answer of that post. Your code looks like the Question's code a lot more than the Answer's code. Just wanted to be sure of which one you were using. – Magrones Apr 06 '18 at 16:50
  • @Fenixrw, I gotcha. Thanks! I actually did it quite similarly to my original code; I just got rid of the Update call (as that was updating the image incorrectly) and figured out how step 2 gets the texture reader going. Thanks! – Subha Andy Apr 06 '18 at 17:19
  • Good to know you worked things out on your own ;) Congratz – Magrones Apr 06 '18 at 17:29
  • If I save a 1920x1080 image, the ARCore feature of drawing a 3D object on a plane renders very slowly – Subha Andy Apr 06 '18 at 23:35
  • Can you post images of the results? – Magrones Apr 06 '18 at 23:51
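
Following up on the ScreenCapture.CaptureScreenshot suggestion in the first comment, that approach boils down to a single call. A minimal sketch, with the trigger and file name chosen arbitrarily (on Android the file ends up under Application.persistentDataPath):

using UnityEngine;

public class ScreenshotOnTap : MonoBehaviour
{
    void Update()
    {
        // Capture whatever the cameras render (camera feed plus AR content) on the first touch.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            ScreenCapture.CaptureScreenshot("arcore_view.png");
        }
    }
}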

0 Answers