
I'm supposed to write an app for iOS and Android that sometimes shows a customized video player on a part of the screen. I have to be able to control it (seek, play, pause, set speed, choose video...). I know that such media is not yet supported in Gluon.

But would it be possible to write such a thing in Xcode and Android Studio and somehow embed it in a Gluon app?

Thorvaldur
  • I'm not sure about this. As far as I know, you could write JavaFX-Fragments on Android, so mixing up a native viewer Fragment with a JavaFX one would be possible. But from the performance point of view... I don't know. And iOS... No idea at all. Perhaps you should ask, what the MediaPlayer API of JavaFX can do on these platforms. – dzim Nov 21 '16 at 12:44
  • Thanks dzim. The reason I asked is that I believe that JavaFX media API can do nothing on Android or iOS yet. – Thorvaldur Nov 23 '16 at 09:37
  • That seems to be correct. According to this question here ( http://stackoverflow.com/questions/38419634/javafxports-how-to-call-android-native-media-player ), you could add at least audio capabilities by providing your own interface for the native player. But video... I'm honestly not sure. Maybe you should update your question accordingly (the "video" part is missing in the title) and perhaps the Gluon guys will give you a hand or tell you what's possible and what's not. But I still fear that embedding a video player will be next to impossible... – dzim Nov 23 '16 at 14:29
  • I needed to show videos as well and we settled with the `Intent` to the installed video player. You can find a How-to in my answer on this topic: http://stackoverflow.com/questions/40671626/android-share-audio-file-from-assets-folder/40932643#40932643 **#edit**: please up-vote an answer when a StackOverflow post I provided actually helped... – dzim Dec 02 '16 at 12:58
  • And another idea: you could do it like Facebook with the chatheads: http://stackoverflow.com/questions/15975988/what-apis-in-android-is-facebook-using-to-create-chat-heads - where I assume the chathead could be anything like a regular, but **native** view. I guess there are SurfaceView-extensions for video-playing... – dzim Dec 02 '16 at 13:00
  • Thanks again. The project was postponed for a few weeks but I will definitely take a closer look at your solution when I'm reassigned to it. – Thorvaldur Dec 05 '16 at 08:08
  • 1
    @dzim, we make possible what seemed impossible... Have a look at my answer below. And it can be combined with your `VolumeService`:) – José Pereda Dec 17 '16 at 20:15

2 Answers


Following the design patterns in the Gluon Charm Down library, this could be a basic Android implementation of a VideoService.

It is based on this tutorial, and adapted to work with the SurfaceView that JavaFX currently uses. It creates a TextureView that is placed in the center of the screen, on top of the current view, taking 95% of its width.

With the Gluon plugin for your IDE, create a Single View Project.

  1. Place these two classes under Source Packages, package com.gluonhq.charm.down.plugins:

VideoService interface

package com.gluonhq.charm.down.plugins;

public interface VideoService {
    void play(String videoName);
    void stop();
    void pause();
    void resume();
}

VideoServiceFactory class

package com.gluonhq.charm.down.plugins;

import com.gluonhq.charm.down.DefaultServiceFactory;

public class VideoServiceFactory extends DefaultServiceFactory<VideoService> {

    public VideoServiceFactory() {
        super(VideoService.class);
    }

}
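Charm Down resolves the platform implementation at runtime; on a platform without one (for example, the desktop during development), `Services.get(VideoService.class)` simply returns an empty `Optional`, so guarded calls are safe no-ops. A minimal, self-contained sketch of that lookup pattern (the interface is repeated here only so the snippet compiles standalone, and `lookup` is a stand-in for Charm Down's `Services.get`, not the real API):

```java
import java.util.Optional;

// The service interface, repeated so this sketch compiles standalone.
interface VideoService {
    void play(String videoName);
    void stop();
    void pause();
    void resume();
}

class ServiceLookupDemo {
    // Stand-in for Charm Down's Services.get(VideoService.class): empty when
    // no implementation exists for the current platform.
    static Optional<VideoService> lookup(VideoService platformImpl) {
        return Optional.ofNullable(platformImpl);
    }

    public static void main(String[] args) {
        // On desktop no implementation is registered, so this call does nothing:
        lookup(null).ifPresent(video -> video.play("big_buck_bunny.mp4"));
    }
}
```

The `BasicView` sample further down uses exactly this guarded style via `Services.get(VideoService.class).ifPresent(...)`.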
  2. Android Package: Place this class under Android/Java Packages, package com.gluonhq.charm.down.plugins.android:

AndroidVideoService class

package com.gluonhq.charm.down.plugins.android;

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.graphics.SurfaceTexture;
import android.media.MediaMetadataRetriever;
import android.media.MediaPlayer;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.Surface;
import android.view.TextureView;
import android.view.WindowManager;
import android.widget.RelativeLayout;
import com.gluonhq.charm.down.plugins.VideoService;
import java.io.IOException;
import javafxports.android.FXActivity;

public class AndroidVideoService implements VideoService, TextureView.SurfaceTextureListener {
    private static final String TAG = AndroidVideoService.class.getName();
    private MediaPlayer mMediaPlayer;
    private String videoName;

    private final RelativeLayout relativeLayout;
    private final TextureView textureView;
    private final DisplayMetrics displayMetrics;

    public AndroidVideoService() {
        displayMetrics = new DisplayMetrics();
        WindowManager windowManager = (WindowManager) FXActivity.getInstance().getSystemService(Context.WINDOW_SERVICE);
        windowManager.getDefaultDisplay().getMetrics(displayMetrics);

        relativeLayout = new RelativeLayout(FXActivity.getInstance());

        textureView = new TextureView(FXActivity.getInstance());
        textureView.setSurfaceTextureListener(this);
        relativeLayout.addView(textureView);
    }

    @Override
    public void play(String videoName) {
        this.videoName = videoName;
        stop();
        FXActivity.getInstance().runOnUiThread(() -> {
            FXActivity.getViewGroup().addView(relativeLayout);
        });
    }

    @Override
    public void stop() {
        if (mMediaPlayer != null) {
            mMediaPlayer.stop();
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
        if (relativeLayout != null) {
            FXActivity.getInstance().runOnUiThread(() -> {
                FXActivity.getViewGroup().removeView(relativeLayout);
            });
        }
    }

    @Override
    public void pause() {
        if (mMediaPlayer != null) {
            mMediaPlayer.pause();
        }
    }

    @Override
    public void resume() {
        if (mMediaPlayer != null) { 
            mMediaPlayer.start();
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int i, int i1) {
        Surface surface = new Surface(st);
        try {
            AssetFileDescriptor afd = FXActivity.getInstance().getAssets().openFd(videoName);
            calculateVideoSize(afd);
            mMediaPlayer = new MediaPlayer();
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            mMediaPlayer.setSurface(surface);
            mMediaPlayer.setLooping(true);
            mMediaPlayer.prepareAsync();
            mMediaPlayer.setOnPreparedListener(mediaPlayer -> mediaPlayer.start());

        } catch (IllegalArgumentException | SecurityException | IllegalStateException | IOException e) {
            Log.d(TAG, e.getMessage());
        }
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int i, int i1) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) { }

    private void calculateVideoSize(AssetFileDescriptor afd) {
        try {
            MediaMetadataRetriever metaRetriever = new MediaMetadataRetriever();
            metaRetriever.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            String height = metaRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT);
            String width = metaRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH);
            double factor = Double.parseDouble(width) > 0 ? Double.parseDouble(height) / Double.parseDouble(width) : 1d;
            // 95% screen width
            RelativeLayout.LayoutParams lp = new RelativeLayout.LayoutParams((int) (0.95 * displayMetrics.widthPixels), 
                    (int) (0.95 * displayMetrics.widthPixels * factor));
            lp.addRule(RelativeLayout.CENTER_IN_PARENT);
            textureView.setLayoutParams(lp);
        } catch (NumberFormatException e) {
            Log.d(TAG, e.getMessage());
        }
    }
}
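The arithmetic in `calculateVideoSize` is easy to check in isolation: the view takes 95% of the screen width and derives its height from the video's intrinsic aspect ratio, falling back to a square when the width metadata is missing. A platform-independent sketch of the same computation (class and method names here are illustrative, not part of the service):

```java
class VideoSizeCalc {
    // Returns {width, height} in pixels for a view taking 95% of the screen
    // width while preserving the video's aspect ratio (height / width),
    // mirroring the logic in calculateVideoSize above.
    static int[] layoutSize(int screenWidthPx, double videoWidth, double videoHeight) {
        double factor = videoWidth > 0 ? videoHeight / videoWidth : 1d;
        int w = (int) (0.95 * screenWidthPx);
        int h = (int) (0.95 * screenWidthPx * factor);
        return new int[] { w, h };
    }

    public static void main(String[] args) {
        // 1080 px wide screen, 640x360 (16:9) video:
        int[] size = layoutSize(1080, 640, 360);
        System.out.println(size[0] + "x" + size[1]); // 1026x577
    }
}
```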
  3. Sample

Place a video file, such as big_buck_bunny.mp4, in the android/assets folder (it can be downloaded from here).

BasicView

public class BasicView extends View {

    private boolean paused;

    public BasicView(String name) {
        super(name);
    }

    @Override
    protected void updateAppBar(AppBar appBar) {
        appBar.setNavIcon(MaterialDesignIcon.MENU.button());
        appBar.setTitleText("Video View");
        // big_buck_bunny.mp4 video in src/android/assets:
        Services.get(VideoService.class).ifPresent(video -> {
            appBar.getActionItems().add(MaterialDesignIcon.PLAY_ARROW.button(e -> video.play("big_buck_bunny.mp4")));
            appBar.getActionItems().add(MaterialDesignIcon.PAUSE.button(e -> {
                if (!paused) {
                    video.pause();
                    paused = true;
                } else {
                    video.resume();
                    paused = false;
                }
            }));
            appBar.getActionItems().add(MaterialDesignIcon.STOP.button(e -> video.stop()));
        });
    }

}

Deploy on your Android device and test.

Note that the TextureView will be on top until you remove it by pressing the stop button.

José Pereda
  • Nice!! Will try this out on Monday. First thing in the morning! – dzim Dec 17 '16 at 20:21
  • This looks very promising. Thank you. I will definitely try this out when I'm assigned to this again. – Thorvaldur Dec 19 '16 at 08:25
  • And do you think that an iOS implementation could be done too? What I'm thinking is that I would need something similar to the Android's onSurfaceTexture-methods behind the JNI to put my player on. – Thorvaldur Dec 20 '16 at 09:18
  • Sure, it is possible. I'll try to add it to the answer when I have the time. – José Pereda Dec 20 '16 at 11:35
  • @josé-pereda: Did you find the time to do the impl in IOS as well? Would be interessting to know... – dzim Jan 30 '17 at 07:18
  • @dzim yes, I have an iOS implementation [working](https://www.dropbox.com/s/ju0wschuygne80t/IMG_0308.jpg?dl=0). I had to use the `AVKit` framework that is not currently included in the fxmobile plugin though. – José Pereda Feb 05 '17 at 13:35
0

The native video player (or, in this case, a native way of "previewing" a video) was used in the following example:

https://gist.github.com/bgmf/d87a2bac0a5623f359637a3da334f980

Aside from some prerequisites, the code looks like this:

package my.application;

import org.robovm.apple.foundation.*;
import org.robovm.apple.uikit.UIApplication;
import org.robovm.apple.uikit.UIDocumentInteractionController;
import org.robovm.apple.uikit.UIDocumentInteractionControllerDelegateAdapter;
import org.robovm.apple.uikit.UIViewController;

import java.io.File;
import java.util.logging.Logger;

// PathHelperIOS and NativeVideoService are defined in the gist linked above;
// PathHelperIOS provides the base path (pathBase) used below.
public class NativeVideoServiceIOS extends PathHelperIOS implements NativeVideoService {
    private static final Logger LOG = Logger.getLogger(NativeVideoServiceIOS.class.getName());

    public NativeVideoServiceIOS() {
        LOG.warning("Initialized Native Video Service with path: " + this.pathBase);
    }

    @Override
    public void triggerPlatformApp(String filename) {
        String fullfile = pathBase.getAbsolutePath() + filename;
        NSURL url = new NSURL(NSURLScheme.File, "", fullfile);
        UIDocumentInteractionController popup = new UIDocumentInteractionController(url);
        popup.setDelegate(new UIDocumentInteractionControllerDelegateAdapter() {
            @Override
            public UIViewController getViewControllerForPreview(UIDocumentInteractionController controller) {
                return UIApplication.getSharedApplication()
                        .getWindows().first().getRootViewController();
            }
        });
        popup.presentPreview(true);
    }

}
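Note that `NativeVideoService` here is an application-defined interface from the linked gist, not a Charm Down service, so the application has to pick the matching implementation itself. One common convention is the `javafx.platform` system property, which javafxports sets to "android" or "ios" on device; the sketch below assumes that convention, and the implementation class names are illustrative:

```java
import java.util.Optional;

// Illustrative resolver: maps the javafx.platform system property to an
// implementation class name; unset (plain desktop JVM) means no native player.
class NativeVideoServices {
    static Optional<String> platformImplementationName() {
        String platform = System.getProperty("javafx.platform", "desktop");
        switch (platform) {
            case "ios":     return Optional.of("my.application.NativeVideoServiceIOS");
            case "android": return Optional.of("my.application.NativeVideoServiceAndroid");
            default:        return Optional.empty();
        }
    }

    public static void main(String[] args) {
        // On a plain desktop JVM the property is unset, so this prints Optional.empty
        System.out.println(platformImplementationName());
    }
}
```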
dzim