
I am developing an app whose functionality includes sharing the screen with another device.

I used the MediaProjection API for this, and MediaMuxer to combine the audio and video outputs of the screen capture.

I know that the MediaProjection API is meant for screen recording, but all I want is to share the screen while recording it.

For this, I wrapped MediaMuxer's writeSampleData method so that the sample bytes are also sent over a socket to the other device on the network.

Below is the code for that:

private OutputStream outStream;

// ...
outStream = ScreenRecordingActivity.getInstance().socket.getOutputStream();

void writeSampleData(final int trackIndex, final ByteBuffer byteBuf,
                     final MediaCodec.BufferInfo bufferInfo) {
    if (mStartedCount > 0) {
        // Keep writing to the local recording as before.
        mMediaMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);

        if (bufferInfo.size != 0) {

            byteBuf.position(bufferInfo.offset);
            byteBuf.limit(bufferInfo.offset + bufferInfo.size);

            if (outStream != null) {

                try {
                    byte[] bytes = new byte[byteBuf.remaining()];
                    byteBuf.get(bytes);

                    // Send the same sample data over the socket.
                    outStream.write(bytes);
                    outStream.flush();

                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}

The bytes are transferred successfully over the socket, and I am able to receive them at the receiver's end.

Below is the code for receiving the bytes at the receiver's end:

private class SocketThread implements Runnable {
    @Override
    public void run() {

        Socket socket;
        try {
            serverSocket = new ServerSocket(SERVER_PORT);
        } catch (IOException e) {
            e.printStackTrace();
        }

        if (null != serverSocket) {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    // Hand each incoming connection to its own thread.
                    socket = serverSocket.accept();
                    CommunicationThread commThread = new CommunicationThread(socket);
                    new Thread(commThread).start();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    class CommunicationThread implements Runnable {

        InputStream in;
        DataInputStream dis;
        FileOutputStream fos;

        public CommunicationThread(Socket clientSocket) {
            try {
                in = clientSocket.getInputStream();
                dis = new DataInputStream(in);
                // Placeholder path; for now I just dump the raw received bytes to a file.
                fos = new FileOutputStream("/sdcard/screen_stream_dump");
            } catch (IOException e) {
                e.printStackTrace();
            }
            updateMessage("Server Started...");
        }

        public void run() {

            while (!Thread.currentThread().isInterrupted()) {

                try {
                    byte[] data = new byte[512];
                    int count = dis.read(data);
                    if (count == -1) {
                        break; // sender closed the connection
                    }
                    fos.write(data, 0, count);

                } catch (Exception e) {
                    e.printStackTrace();

                    try {
                        fos.close();
                    } catch (Exception e1) {
                        e1.printStackTrace();
                    }
                }
            }
        }
    }
}

I followed these links for screen sharing:

Screen capture

screenrecorder

Screen recording with mediaProjection

I used some code from the above examples to build the app.

All I want to know is how to handle the bytes at the receiver's end. How should these bytes be formatted so that the receiver can play a live stream of the sender's screen?

Am I following the correct approach for sending and receiving byte data?

Does MediaProjection allow streaming the screen between applications while recording?

Any help will be deeply appreciated.

1 Answer


Generally for streaming, including screen sharing, the audio and video tracks are not muxed into a container. Instead, each video frame and audio sample is sent using a protocol like RTP/RTSP, in which each data chunk is wrapped with extra information such as timestamps.
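As a rough illustration of that wrapping (this is a made-up framing for the sake of the example, not real RTP), each encoded chunk could be length-prefixed and tagged with its track and presentation timestamp before going out on the socket:

import android.media.MediaCodec;

import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

class FramedSender {

    private final DataOutputStream out;

    FramedSender(OutputStream rawOut) {
        this.out = new DataOutputStream(rawOut);
    }

    // Call this with the encoder output instead of MediaMuxer.writeSampleData().
    synchronized void sendChunk(int trackIndex, ByteBuffer byteBuf,
                                MediaCodec.BufferInfo bufferInfo) throws IOException {
        byteBuf.position(bufferInfo.offset);
        byteBuf.limit(bufferInfo.offset + bufferInfo.size);

        byte[] payload = new byte[bufferInfo.size];
        byteBuf.get(payload);

        out.writeInt(trackIndex);                     // audio or video track
        out.writeLong(bufferInfo.presentationTimeUs); // timestamp for receiver-side sync
        out.writeInt(bufferInfo.flags);               // e.g. BUFFER_FLAG_KEY_FRAME
        out.writeInt(payload.length);                 // chunk size = frame boundary
        out.write(payload);
        out.flush();
    }
}

With a header like this, the receiver can read exactly payload.length bytes per chunk and recover frame boundaries and timestamps, which the raw byte stream in your current code does not preserve.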

You can take a look at spyadroid, which is a good starting point for streaming audio and video over RTSP to a browser or VLC. It streams the camera and microphone, but you can adapt it to your own use case.

If you want to go with sockets for the moment, you have to get rid of the MediaMuxer and send frames/samples directly from the encoder output, tagged at least with timestamps so playback can be synchronized on the receiver side. Before any frame data, send the codec-specific data to the receiver; assuming you encode in h.264, that is the SPS and PPS (a.k.a. csd-0 and csd-1), which you can grab when the encoder's output format changes. The receiver uses them to configure its decoder, which you can give an output surface to render your stream; a sketch of both sides follows.
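Here is a minimal sketch of both sides, assuming h.264 video; the class and method names are mine, and how the SPS/PPS bytes get across the wire is up to you:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;

class StreamSetup {

    // Sender side: call when dequeueOutputBuffer() returns
    // MediaCodec.INFO_OUTPUT_FORMAT_CHANGED. Send both buffers to the
    // receiver before any frame data.
    static ByteBuffer[] extractCsd(MediaCodec encoder) {
        MediaFormat format = encoder.getOutputFormat();
        ByteBuffer sps = format.getByteBuffer("csd-0"); // SPS for h.264
        ByteBuffer pps = format.getByteBuffer("csd-1"); // PPS for h.264
        return new ByteBuffer[] { sps, pps };
    }

    // Receiver side: configure a decoder with the received SPS/PPS and an
    // output Surface (e.g. from a SurfaceView) so decoded frames are
    // rendered directly to the screen.
    static MediaCodec buildDecoder(byte[] spsBytes, byte[] ppsBytes,
                                   int width, int height,
                                   Surface surface) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(spsBytes));
        format.setByteBuffer("csd-1", ByteBuffer.wrap(ppsBytes));

        MediaCodec decoder = MediaCodec.createDecoderByType(
                MediaFormat.MIMETYPE_VIDEO_AVC);
        decoder.configure(format, surface, null, 0);
        decoder.start();
        return decoder;
    }
}

From there, feed each received chunk into the decoder with queueInputBuffer(), passing along the presentation timestamp you sent with it, and release output buffers with releaseOutputBuffer(index, true) so the frames are rendered to the surface.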

Some extra links:

android-h264-stream-demo

RTMP Java Muxer for Android

RTSP

RTP

WebRTC
