I'm currently receiving images from an external source as a byte array, and I would like to send them as raw video via FFmpeg to a stream URL, where I have an RTSP server that receives RTSP streams (a similar unanswered question). However, I haven't worked with FFmpeg in Java, so I can't find an example of how to do it. I have a callback that copies the image bytes to a byte array, as follows:
public class MainActivity extends Activity {

    final String rtmp_url = "rtmp://192.168.0.12:1935/live/test";
    private int PREVIEW_WIDTH = 384;
    private int PREVIEW_HEIGHT = 292;
    private String TAG = "MainActivity";
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);

    final String[] command = {ffmpeg,
            "-y",  // Add "-re" for simulated real-time streaming.
            "-f", "rawvideo",
            "-vcodec", "rawvideo",
            "-pix_fmt", "bgr24",
            "-s", (PREVIEW_WIDTH + "x" + PREVIEW_HEIGHT),
            "-r", "10",
            "-i", "pipe:",
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",
            "-preset", "ultrafast",
            "-f", "flv",
            rtmp_url};

    private UVCCamera mUVCCamera;
    public void handleStartPreview(Object surface) throws InterruptedException, IOException {
        Log.e(TAG, "handleStartPreview: mUVCCamera" + mUVCCamera + " mIsPreviewing:");
        if (mUVCCamera == null) return;
        Log.e(TAG, "handleStartPreview2");
        try {
            mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, 0, UVCCamera.DEFAULT_BANDWIDTH, 0);
            Log.e(TAG, "handleStartPreview3 mWidth: " + mWidth + " mHeight:" + mHeight);
        } catch (IllegalArgumentException e) {
            try {
                // Fall back to YUV mode.
                mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, UVCCamera.DEFAULT_PREVIEW_MODE, UVCCamera.DEFAULT_BANDWIDTH, 0);
                Log.e(TAG, "handleStartPreview4");
            } catch (IllegalArgumentException e1) {
                callOnError(e1);
                return;
            }
        }
        Log.e(TAG, "handleStartPreview: startPreview1");
        int result = mUVCCamera.startPreview();
        mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_RGBX);
        mUVCCamera.startCapture();
        Toast.makeText(MainActivity.this, "Camera Started", Toast.LENGTH_SHORT).show();

        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);
        Process process = pb.start();
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        OutputStream writer = process.getOutputStream();

        // Write ten synthetic BGR frames to ffmpeg's stdin as a smoke test.
        // NOTE: this test pattern is 192x108, while the command above declares
        // 384x292 via "-s"; for rawvideo input the two sizes must match.
        byte[] img = new byte[192 * 108 * 3];
        for (int i = 0; i < 10; i++) {
            for (int y = 0; y < 108; y++) {
                for (int x = 0; x < 192; x++) {
                    byte r = (byte) ((x * y + i) % 255);
                    byte g = (byte) ((x * y + i * 10) % 255);
                    byte b = (byte) ((x * y + i * 20) % 255);
                    img[(y * 192 + x) * 3] = b;
                    img[(y * 192 + x) * 3 + 1] = g;
                    img[(y * 192 + x) * 3 + 2] = r;
                }
            }
            writer.write(img);
        }
        writer.close();

        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        process.waitFor();
    }
    public static void buildRawFrame(Mat img, int i) {
        int p = img.cols() / 60;
        img.setTo(new Scalar(60, 60, 60));
        String text = Integer.toString(i + 1);
        int font = Imgproc.FONT_HERSHEY_SIMPLEX;
        Point pos = new Point(img.cols() / 2 - p * 10 * text.length(), img.rows() / 2 + p * 10);
        Imgproc.putText(img, text, pos, font, p, new Scalar(255, 30, 30), p * 2);  // blue frame number
    }
}
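Since the post does not show the body of mIFrameCallback, here is a minimal sketch (my assumption, not the original code) of how such a callback could copy each frame out of the library's ByteBuffer and forward it to the ffmpeg process. It relies on the UVCCamera library's IFrameCallback interface, assumes the process's OutputStream is stored in a field, and assumes the command's "-pix_fmt" is changed to match PIXEL_FORMAT_RGBX (e.g. "rgba"):

    // Sketch only: ffmpegStdin would be assigned process.getOutputStream()
    // right after pb.start(), before frames start arriving.
    private OutputStream ffmpegStdin;

    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(ByteBuffer frame) {
            // Copy the pixels out of the direct buffer into a plain byte array.
            byte[] frameData = new byte[frame.remaining()];
            frame.get(frameData);
            try {
                if (ffmpegStdin != null) {
                    ffmpegStdin.write(frameData);  // one raw frame per callback
                    ffmpegStdin.flush();
                }
            } catch (IOException e) {
                Log.e(TAG, "Failed to write frame to ffmpeg", e);
            }
        }
    };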
Additionally, Android Camera Capture using FFmpeg uses ffmpeg to capture the native Android camera frame by frame, but instead of pushing it via RTMP it generates a video file as output. However, it does not explain how the image data was passed to ffmpeg.
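If writing to a file is enough for testing, presumably only the output part of the command changes; a sketch under that assumption (the output path is illustrative):

    final String[] fileCommand = {ffmpeg,
            "-y",
            "-f", "rawvideo",
            "-vcodec", "rawvideo",
            "-pix_fmt", "bgr24",
            "-s", (PREVIEW_WIDTH + "x" + PREVIEW_HEIGHT),
            "-r", "10",
            "-i", "pipe:",
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",
            "-preset", "ultrafast",
            "/sdcard/Download/test.mp4"};  // hypothetical output path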
frameData is my byte array, and I'd like to know how to write the necessary ffmpeg command using ProcessBuilder to send images to a given RTSP URL.
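For RTSP specifically, ffmpeg has an rtsp output muxer, so my understanding is that only the tail of the command needs to change; a sketch, with an illustrative URL (the -rtsp_transport option is optional, but TCP is often more robust than the default UDP):

    final String rtsp_url = "rtsp://192.168.0.12:8554/live/test";  // illustrative

    final String[] rtspCommand = {ffmpeg,
            "-y",
            "-f", "rawvideo",
            "-vcodec", "rawvideo",
            "-pix_fmt", "bgr24",
            "-s", (PREVIEW_WIDTH + "x" + PREVIEW_HEIGHT),
            "-r", "10",
            "-i", "pipe:",
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",
            "-preset", "ultrafast",
            "-rtsp_transport", "tcp",  // output option; must precede the URL
            "-f", "rtsp",
            rtsp_url};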
As an example of what I am trying to do: in Python 3 I can do it easily like this:
import subprocess

import numpy as np

fps = 25
width = 224
height = 224
rtmp_url = 'rtmp://192.168.0.13:1935/live/test'

command = ['ffmpeg',
           '-y',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           '-f', 'flv',
           rtmp_url]

p = subprocess.Popen(command, stdin=subprocess.PIPE)

while True:
    # Feed one random BGR frame per iteration to ffmpeg's stdin.
    frame = np.random.randint(0, 255, size=(height, width, 3), dtype=np.uint8)
    p.stdin.write(frame.tobytes())
I would like to do the same thing in Android.
Update: I can reproduce @Rotem's answer in NetBeans, although in Android I am getting a NullPointerException when trying to execute pb.start():
Process: com.infiRay.XthermMini, PID: 32089
java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
at com.infiRay.XthermMini.MainActivity.handleStartPreview(MainActivity.java:512)
at com.infiRay.XthermMini.MainActivity.startPreview(MainActivity.java:563)
at com.infiRay.XthermMini.MainActivity.access$1000(MainActivity.java:49)
at com.infiRay.XthermMini.MainActivity$3.onConnect(MainActivity.java:316)
at com.serenegiant.usb.USBMonitor$3.run(USBMonitor.java:620)
at android.os.Handler.handleCallback(Handler.java:938)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loopOnce(Looper.java:226)
at android.os.Looper.loop(Looper.java:313)
at android.os.HandlerThread.run(HandlerThread.java:67)
2022-06-02 11:47:20.300 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.308 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.312 32089-32089/com.infiRay.XthermMini E/MainActivity: onPause:
2022-06-02 11:47:20.314 32089-32581/com.infiRay.XthermMini I/Process: Sending signal. PID: 32089 SIG: 9
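For what it's worth, ProcessBuilder.start() documents that it throws NullPointerException if an element of the command list is null, so my guess (unverified) is that Loader.load() did not resolve an ffmpeg executable on the device. A defensive check along these lines would confirm it:

    // Sketch: fail fast on a null command element before calling pb.start().
    // Whether Loader.load() can return null on Android (rather than throwing)
    // is an assumption to verify in the debugger. Needs java.util.Arrays.
    for (String arg : command) {
        if (arg == null) {
            Log.e(TAG, "null element in ffmpeg command: " + Arrays.toString(command));
            return;
        }
    }
    Process process = new ProcessBuilder(command).redirectErrorStream(true).start();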