
I'm having problems with ffmpeg, probably due to my inexperience with this software.

My basic need is the following: I have a series of videos whose content I want to protect from plagiarism. To do this, I want to add a watermark so that when a user views a video they also see some personal data, which deters them from downloading and sharing it without permission.

What I would like is to create a small Angular + Java application that does this task (invoking ffmpeg via Runtime#exec).

I have seen that ffmpeg can stream to a server, such as ffserver, but I wonder if there is a somewhat simpler way: something like launching the ffmpeg command from my Java application with the necessary configuration, and having ffmpeg serve the watermarked video over some port/protocol.
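For illustration, here is a minimal sketch of what that launch could look like, assuming the watermark is burned in with ffmpeg's drawtext filter and the stream is served with ffmpeg's own HTTP listen mode; the input file, port, and watermark text are placeholders:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.List;

public class WatermarkStreamer {

    // Launches ffmpeg so it serves the watermarked video over HTTP on port 8080.
    // "-listen 1" makes ffmpeg's HTTP output act as a (single-client) server,
    // so no ffserver is needed. Note that viewerInfo would need escaping if it
    // contains characters that are special to the drawtext filter (: ' \).
    public static Process start(String inputFile, String viewerInfo) throws IOException {
        List<String> cmd = List.of(
                "ffmpeg",
                "-re",                        // pace reading at the native frame rate (live-style)
                "-i", inputFile,
                "-vf", "drawtext=text='" + viewerInfo + "':x=10:y=10:fontsize=24:fontcolor=white",
                "-c:v", "libx264",
                "-f", "mpegts",
                "-listen", "1",               // ffmpeg itself listens for the HTTP client
                "http://0.0.0.0:8080"
        );
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);         // merge ffmpeg's log (stderr) into stdout
        Process process = pb.start();
        // Drain ffmpeg's output in the background so the pipe buffer never fills up.
        new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
                r.lines().forEach(System.out::println);
            } catch (IOException ignored) {
            }
        }).start();
        return process;
    }
}
```

A viewer could then open http://host:8080 in a player such as VLC. Note that -listen 1 serves one client at a time, so a real deployment would still need something in front of it (or one ffmpeg process per viewer, which would also let each viewer get their own watermark).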

EDIT

I have continued to investigate and I have seen that ffmpeg can broadcast to WebRTC, but you need an adapter. What I would like, and I don't know if it is possible, is to launch ffmpeg so that it acts as a server that can be consumed directly from the web.

  • _"...Invoking FFmpeg one way or another"_ That's called running an external process. Java has a **Process** API for that. Not sure what operating system you're using but try to read tutorials for it and run a simple app. Try running FFmpeg with some arguments (options) as a test like convert input JPG to output PNG or MP4. If working okay, then finally test telling FFmpeg to output video to a server location/port. Those are the steps to try and ask a question about... – VC.One Jan 25 '23 at 19:41
  • My problem is not how to call ffmpeg from java, my problem is what arguments to pass to it to act as a server – Jose A. Matarán Jan 26 '23 at 05:36
  • FFmpeg cannot be used as a server. Your options are: **(Option 1)** make your own computer (IP) open to the internet; FFmpeg can then output to that same IP/port and outside listeners can see the video stream. You open it by configuring your router and forwarding one of its ports (the same port as the FFmpeg output). **(Option 2)** just get webspace (that can run PHP) and put there a script that receives input data (video chunks) and stores it on the server. FFmpeg can output to the URL of the server script, and listeners can connect to the stream via a second script that manages multiple listener connections. – VC.One Feb 03 '23 at 08:51
  • PS: **(Option 3)** is to use FFserver. I think it does temporary port forwarding silently in the background... Further options include not using FFmpeg and instead using an encryption tool like Widevine, **...or...** Chrome/Edge browsers can decode/encode video frames (using their WebCodecs API), so you can modify the pixels of input frames before outputting to the connected user (via sockets or a web server script), **...or...** preparing the videos in such a way that frames can be replaced dynamically with a frame that has "user information" (_e.g._ every X mins your video cuts to a blue screen for 4 secs). – VC.One Feb 03 '23 at 09:47
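Picking up the first step suggested in the comments above, a minimal smoke test of running FFmpeg through Java's Process API could look like this (file names are placeholders):

```java
import java.io.IOException;

public class FfmpegSmokeTest {
    // Step 1 from the comments: verify Java can run FFmpeg as an external
    // process by converting an input JPG to an output PNG.
    public static void main(String[] args) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("ffmpeg", "-y", "-i", "input.jpg", "output.png")
                .inheritIO()   // show ffmpeg's console output directly
                .start();
        int exitCode = p.waitFor();
        System.out.println("ffmpeg exited with code " + exitCode);
    }
}
```

If this exits with code 0 and produces output.png, the next step is swapping the argument list for one that points FFmpeg's output at a server location/port.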

1 Answer


I don't have a Java example, but we do something similar in our WebRTC .NET application. The code should be fairly straightforward to port to Java.

The RTP packets received in the reader can be streamed over most Java WebRTC libraries.
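Since the .NET code itself isn't included here, the following is a hypothetical Java sketch of the reader side, assuming ffmpeg pushes RTP to local port 5004 (for example with `ffmpeg -re -i input.mp4 -an -c:v libvpx -f rtp rtp://127.0.0.1:5004`); the packets received here are what you would hand to your WebRTC library:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class RtpReader {
    // Receives the raw RTP packets that ffmpeg sends to UDP port 5004.
    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(5004)) {
            byte[] buf = new byte[2048];   // RTP packets stay well under the MTU
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {
                socket.receive(packet);
                // The first 12 bytes are the RTP header; the codec payload follows.
                // Forward packet.getData() (length packet.getLength()) to the
                // WebRTC library's video track here.
                System.out.println("RTP packet: " + packet.getLength() + " bytes");
            }
        }
    }
}
```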