
I am developing an Android application using the libstreaming library. The app publishes an upstream to Wowza (mobile to Wowza). I created a SurfaceView that shows the camera preview. It is working fine, but I want to add three features: zoom in/out, autofocus, and flash.

I don't know whether this is possible with libstreaming.

The SurfaceView I use is net.majorkernelpanic.streaming.gl.SurfaceView.

Below is my Activity code:

public class LiveStreamingActivity extends Activity implements RtspClient.Callback, Session.Callback, SurfaceHolder.Callback {
private static SurfaceView mSurfaceView;
private SurfaceHolder mHolder;
private Session mSession; // RTSP session
private static RtspClient mClient;

protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
    getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
    requestWindowFeature(Window.FEATURE_NO_TITLE);

    setContentView(R.layout.activity_main);

    if (!LibsChecker.checkVitamioLibs(this))
        return;
    mSurfaceView = (SurfaceView) findViewById(R.id.surface_view);

    mHolder = mSurfaceView.getHolder();
    mHolder.addCallback(this);
    mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

}

@SuppressWarnings("deprecation")
private void initRtspClient() {
    // Configures the SessionBuilder
    mSession = SessionBuilder
            .getInstance()
            .setContext(getApplicationContext())
            .setAudioEncoder(SessionBuilder.AUDIO_AAC)
            .setAudioQuality(new AudioQuality(8000, 16000))
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            //.setVideoQuality(new VideoQuality(352, 288, 30, 300000))
            .setCamera(CameraInfo.CAMERA_FACING_BACK)
            .setSurfaceView(mSurfaceView)
            .setPreviewOrientation(0)
            .setCallback(this)
            .build();

    mClient = new RtspClient();
    mClient.setSession(mSession);

    mClient.setCallback(this);
    mClient.setTransportMode(RtspClient.TRANSPORT_TCP);
    mSurfaceView.setAspectRatioMode(SurfaceView.ASPECT_RATIO_PREVIEW);

    String ip, port, path;
    Pattern uri = Pattern.compile("rtsp://(.+):(\\d+)/(.+)");
    Matcher m = uri.matcher("rtsp://219.65.90.226:1935/app2/myStream");
    m.find();
    ip = m.group(1);
    port = m.group(2);
    path = m.group(3);

    mClient.setCredentials(AppConfig.PUBLISHER_USERNAME,
            AppConfig.PUBLISHER_PASSWORD);
    mClient.setServerAddress(ip, Integer.parseInt(port));
    mClient.setStreamPath("/" + path);
}

@Override
protected void onResume() {
    System.out.println("on Resume activity 2");
    super.onResume();
    try{
        if(null != mSurfaceView){
            /* BroadcastReceiver: watch network connectivity changes.
               (receiver, vmPlayer and audioStream are fields declared elsewhere.) */
            IntentFilter intentFilter = new IntentFilter();
            intentFilter.addAction("android.net.conn.CONNECTIVITY_CHANGE");
            registerReceiver(receiver, intentFilter);

            /* Start audio streaming in a background thread (AsyncTask) */
            vmPlayer = new MediaPlayer(this);
            audioStream = new AudioStreamTask(this);
            audioStream.execute("push", "push", "push");
        }
    }catch(Exception ex){
        ex.printStackTrace();
    }
}

@Override
protected void onPause() {
    super.onPause();
    try{
        /* release the surface view */
        if(null != mSurfaceView){
            mClient.release();
            mSession.release();
            mSurfaceView.getHolder().removeCallback(this);
        }
    }catch(Exception ex){
        ex.printStackTrace();
    }
}

@Override
public void onDestroy() {
    try {
        super.onDestroy();
        if (mClient != null) {
            mClient.release();
        }
        if (mSession != null) {
            mSession.release();
        }
        mSurfaceView.getHolder().removeCallback(this);
    } catch (Exception e) {
        System.out.println("Error while destroying activity " + e);
    }
}

private void toggleStreaming() {
    if (!mClient.isStreaming()) {
        // Start camera preview
        mSession.startPreview();
        // mFrontSession.startPreview();
        // Start video stream
        mClient.startStream();
        //startRtmpStream();
    } else {
        // already streaming, stop streaming
        // stop camera preview
        mSession.stopPreview();
        // mFrontSession.stopPreview();
        // stop streaming
        mClient.stopStream();
    }
    }
}

activity_main.xml

<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/surface_layout"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="@android:color/black" >

    <LinearLayout
        android:id="@+id/surface_view_layout"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical" >

        <net.majorkernelpanic.streaming.gl.SurfaceView
            android:id="@+id/surface_view"
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            android:layout_gravity="center" />
    </LinearLayout>
</FrameLayout>

I need a complete description of how to add all three of these camera features.


2 Answers


I did it! :)

Go to VideoStream.java and change protected Camera mCamera to public static Camera mCamera.
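After that change, the field looks roughly like this (a sketch of the one-line edit; in the stock libstreaming sources the class lives in net.majorkernelpanic.streaming.video, and the rest of the class is untouched):

// In net.majorkernelpanic.streaming.video.VideoStream
// was: protected Camera mCamera;
public static Camera mCamera;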

Go to your MainActivity, in your case LiveStreamingActivity, and paste:

private float mDist;

@Override
public boolean onTouchEvent(MotionEvent event) {
    Camera.Parameters params = VideoStream.mCamera.getParameters();
    int action = event.getAction();

    if (event.getPointerCount() > 1) {
        // handle multi-touch events
        if (action == MotionEvent.ACTION_POINTER_DOWN) {
            mDist = getFingerSpacing(event);
        } else if (action == MotionEvent.ACTION_MOVE && params.isZoomSupported()) {
            VideoStream.mCamera.cancelAutoFocus();
            handleZoom(event, params);
        }
    } else {
        // handle single touch events
        if (action == MotionEvent.ACTION_UP) {
            handleFocus(event, params);
        }
    }
    return true;
}

private void handleZoom(MotionEvent event, Camera.Parameters params) {
    int maxZoom = params.getMaxZoom();
    int zoom = params.getZoom();
    float newDist = getFingerSpacing(event);
    if (newDist > mDist) {
        //zoom in
        if (zoom < maxZoom)
            zoom++;
    } else if (newDist < mDist) {
        //zoom out
        if (zoom > 0)
            zoom--;
    }
    mDist = newDist;
    params.setZoom(zoom);
    VideoStream.mCamera.setParameters(params);
}

public void handleFocus(MotionEvent event, Camera.Parameters params) {
    int pointerId = event.getPointerId(0);
    int pointerIndex = event.findPointerIndex(pointerId);
    // The pointer's position; unused for now, but could feed
    // Camera.Parameters#setFocusAreas() for true touch-to-focus
    float x = event.getX(pointerIndex);
    float y = event.getY(pointerIndex);

    List<String> supportedFocusModes = params.getSupportedFocusModes();
    if (supportedFocusModes != null && supportedFocusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
        VideoStream.mCamera.autoFocus(new Camera.AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean b, Camera camera) {
                // currently set to auto-focus on single touch
            }
        });
    }
}

/**
 * Determine the space between the first two fingers
 */
private float getFingerSpacing(MotionEvent event) {
    float x = event.getX(0) - event.getX(1);
    float y = event.getY(0) - event.getY(1);
    return FloatMath.sqrt(x * x + y * y);
}

Based here.
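The question also asks about flash. That isn't covered above, but a minimal sketch using the same public static VideoStream.mCamera could look like this (toggleFlash and mFlashOn are names I made up; wire the method to any button, and note that not every device or camera supports torch mode):

private boolean mFlashOn = false;

private void toggleFlash() {
    Camera.Parameters params = VideoStream.mCamera.getParameters();
    List<String> flashModes = params.getSupportedFlashModes();
    // Bail out on devices/cameras without a torch (e.g. most front cameras)
    if (flashModes == null || !flashModes.contains(Camera.Parameters.FLASH_MODE_TORCH)) {
        return;
    }
    mFlashOn = !mFlashOn;
    params.setFlashMode(mFlashOn ? Camera.Parameters.FLASH_MODE_TORCH
            : Camera.Parameters.FLASH_MODE_OFF);
    VideoStream.mCamera.setParameters(params);
}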

Let me know if it helped!

  • Hey Jose, thanks for your response. I'll let you know once I am done testing your code. – Ram Dec 21 '15 at 06:59
  • One more thing: do you have any idea about HD-quality camera streaming? I modified libstreaming's VideoQuality.java, setVideoQuality(640, 480, 20, 400000). It streams in HD quality, but it shows a green, blocky screen in the Wowza server's JWPlayer. Please help if you have any idea. – Ram Dec 21 '15 at 07:05
  • Hm, I have no idea, Ram Sharan. Do you have Wowza on your localhost or on a web server? – jcunhafonte Dec 22 '15 at 10:36
  • As per my updated question, I have solved zooming (with your help) and flash (did it myself), but touch-to-focus is still left. Any idea about the focus? I mean, on touch it should display a focus circle. – Ram Dec 22 '15 at 12:11
  • The code you provided for autofocus is not making any difference on single touch, but zooming is working flawlessly. – Ram Dec 22 '15 at 12:41
  • Now focus is working as well. Check this out: https://github.com/fyhertz/libstreaming/issues/164#issuecomment-168209463 – Ram Jan 03 '16 at 10:08

Thank you @José Cunha Fonte, your code is great!

For me (working with the Marshmallow SDK), return FloatMath.sqrt(x * x + y * y); is deprecated and gone, so I just changed it to return (float) Math.sqrt(x * x + y * y);
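The helper from the first answer then becomes:

private float getFingerSpacing(MotionEvent event) {
    float x = event.getX(0) - event.getX(1);
    float y = event.getY(0) - event.getY(1);
    return (float) Math.sqrt(x * x + y * y);
}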

Hope it will help someone :)
