
I have a bare-metal application running on a tiny 16-bit microcontroller (ST10) with 10BASE-T Ethernet (CS8900) and a TCP/IP implementation based on the EasyWeb project.

The application's main job is to control an LED matrix display for public traffic passenger information. It generates display content at about 41 fps with a configurable display size of e.g. 160 × 32 pixels and 1-bit color depth (each LED is simply either on or off).


There is a tiny web server implemented, which provides the respective frame buffer content (equal to the LED matrix display content) as PNG or BMP for download (both uncompressed, because of CPU load and the 1-bit color depth). So I can receive snapshots with e.g.:

wget http://$IP/content.png

or

wget http://$IP/content.bmp

or put appropriate HTML code into the controller's index.html to view it in a web browser. I could also write HTML/JavaScript code to update the picture periodically, e.g. every second, so that the user can see changes of the display content.

Now, as the next step, I want to provide the display content as some kind of video stream, and then either add appropriate HTML code to my index.html or just open that "streaming URI" with e.g. vlc.

As my framebuffer bitmaps are built uncompressed, I expect a constant bitrate.

I'm not sure what's the best way to start with this.

(1) Which video format is the easiest to generate if I already have a PNG for each frame (given that each PNG exists only for a couple of milliseconds and cannot be buffered for longer)?

Note that my target system is very resource restricted in both memory and computing power.

(2) Which way of distribution over IP should I use?

I already have some TCP sockets open, listening on port 80. I could stream the video over HTTP using chunked transfer encoding, with each frame as its own chunk. (Maybe HTTP Live Streaming works like this?)
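For illustration, the per-frame chunk framing could look like this in plain C. This is only a sketch against a memory buffer; the real server would write the bytes to the open TCP socket, and `make_http_chunk` is a hypothetical helper, not part of my existing code:

```c
#include <stdio.h>
#include <string.h>

/* Sketch: frame one buffer as a single HTTP/1.1 chunk for chunked
 * transfer encoding.  Layout: <size in hex> CRLF <payload> CRLF.
 * Returns the total number of bytes written, or 0 on overflow. */
static size_t make_http_chunk(const unsigned char *frame, size_t frame_len,
                              unsigned char *out, size_t out_size)
{
    int n = snprintf((char *)out, out_size, "%zx\r\n", frame_len);
    if (n < 0 || (size_t)n + frame_len + 2 > out_size)
        return 0;
    memcpy(out + n, frame, frame_len);          /* raw frame bytes   */
    memcpy(out + n + frame_len, "\r\n", 2);     /* chunk terminator  */
    return (size_t)n + frame_len + 2;
}
```

Ending the stream (if ever) would just mean sending the final `0\r\n\r\n` chunk.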

I've also read about things like SCTP, RTP and RTSP, but they look like more work to implement on my target. And since there is also the potential firewall drawback, I think I prefer HTTP for transport.

Please note that the application is coded in plain C, without an operating system or powerful libraries. Everything is coded from scratch, even the web server and the PNG generation.

Edit 2017-09-14, tryout with APNG

As suggested by Nominal Animal, I gave APNG a try.

I extended my code to produce appropriate fcTL and fdAT chunks for each frame and to serve the resulting bla.apng with HTTP Content-Type image/apng.
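For reference, the fcTL payload is 26 big-endian bytes per frame. A minimal sketch of filling it (field layout per the APNG specification; the surrounding PNG chunk framing with length, "fcTL" type and CRC-32 is omitted, and `fill_fctl` is an illustrative helper, not my actual code):

```c
#include <stdint.h>

/* Store a 32-bit value big-endian, as PNG requires. */
static void be32(uint8_t *p, uint32_t v)
{
    p[0] = v >> 24; p[1] = v >> 16; p[2] = v >> 8; p[3] = v;
}

/* Fill the 26-byte fcTL chunk payload for one animation frame. */
static void fill_fctl(uint8_t out[26], uint32_t seq,
                      uint32_t width, uint32_t height,
                      uint16_t delay_num, uint16_t delay_den)
{
    be32(out + 0,  seq);     /* sequence number (shared counter with fdAT) */
    be32(out + 4,  width);   /* frame width  in pixels */
    be32(out + 8,  height);  /* frame height in pixels */
    be32(out + 12, 0);       /* x offset (full-frame updates) */
    be32(out + 16, 0);       /* y offset */
    out[20] = delay_num >> 8; out[21] = delay_num & 0xFF; /* delay numerator */
    out[22] = delay_den >> 8; out[23] = delay_den & 0xFF; /* delay denominator */
    out[24] = 0;             /* dispose_op: APNG_DISPOSE_OP_NONE */
    out[25] = 0;             /* blend_op:   APNG_BLEND_OP_SOURCE */
}
```

Each fdAT chunk then carries the next sequence number in its first four bytes, followed by the frame's IDAT-style data.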

After downloading, such a bla.apng looks fine when opened in e.g. Firefox or Chrome (but not in Konqueror, vlc, Dragon Player or Gwenview).

Streaming that APNG works nicely, but only with Firefox; Chrome first wants to download the file completely.

So APNG might be a solution, but with the disadvantage that it currently only works with Firefox. In further testing I found that 32-bit versions of Firefox (55.0.2) crash after about 1 h of APNG playback, by which time about 100 MiB of data has been transferred. It looks like they don't discard old/obsolete frames.

Further restrictions: as APNG needs a 32-bit sequence number in each animation chunk (two per frame, one for fcTL and one for fdAT), there is a limit on the maximum playback duration. But at my frame interval of 24 ms that limit is 2^31 frames × 24 ms ≈ 596 days, so about 600 days, which I can live with.

Note that the APNG MIME type was specified by mozilla.org as image/apng. But in my tests I found that it is a bit better supported when my HTTP server delivers APNG with Content-Type image/png instead. E.g. Chromium and Safari on iOS will then play my APNG files after download (but still not while streaming). Even the Wikipedia server delivers e.g. this beach ball APNG with Content-Type image/png.

Edit 2017-09-17, tryout with animated GIF

As also suggested by Nominal Animal, I now tried animated GIF.

It looks OK in some browsers and viewers after a complete download (of e.g. 100 or 1000 frames).

Live streaming looks OK in Firefox, Chrome, Opera, Rekonq and Safari (on macOS Sierra). It does not work in Safari (on OS X El Capitan and iOS 10.3.1), Konqueror, vlc, Dragon Player or Gwenview. E.g. Safari (tested on iOS 10.3.3 and OS X El Capitan) first wants to download the GIF completely before display / playback.

Drawback of using GIF: for CPU-load reasons I don't want to implement data compression for the generated frame pictures. For PNG I use uncompressed data in the IDAT chunk, and for a 160×32 PNG with 1-bit color depth I get about 740 bytes per frame. But GIF without compression, especially for 1-bit black/white bitmaps, blows up the pixel data by a factor of 3-4.
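For comparison, the uncompressed PNG route costs only a fixed 11 bytes of zlib overhead per IDAT payload, because a deflate "stored" block needs no compression at all. A sketch of that wrapping (assumed helper names, not my exact code; payloads over 65535 bytes would need several stored blocks):

```c
#include <stdint.h>
#include <string.h>

/* Adler-32 checksum as required by the zlib stream trailer. */
static uint32_t adler32(const uint8_t *p, size_t len)
{
    uint32_t a = 1, b = 0;
    while (len--) { a = (a + *p++) % 65521; b = (b + a) % 65521; }
    return (b << 16) | a;
}

/* Wrap raw PNG scanline bytes in a zlib stream using a single
 * "stored" (BTYPE=00) deflate block: 2-byte header, 5-byte block
 * header, raw data, 4-byte Adler-32 -> len + 11 bytes total. */
static size_t zlib_store(const uint8_t *raw, uint16_t len, uint8_t *out)
{
    size_t n = 0;
    uint32_t ad = adler32(raw, len);
    out[n++] = 0x78; out[n++] = 0x01;   /* zlib header: deflate, 32 KiB window */
    out[n++] = 0x01;                    /* BFINAL = 1, BTYPE = 00 (stored)     */
    out[n++] = len & 0xFF;              /* LEN, little-endian                  */
    out[n++] = len >> 8;
    out[n++] = (len ^ 0xFFFF) & 0xFF;   /* NLEN = one's complement of LEN      */
    out[n++] = (len ^ 0xFFFF) >> 8;
    memcpy(out + n, raw, len);
    n += len;
    out[n++] = ad >> 24;                /* Adler-32 trailer, big-endian        */
    out[n++] = (ad >> 16) & 0xFF;
    out[n++] = (ad >> 8) & 0xFF;
    out[n++] = ad & 0xFF;
    return n;
}
```

For a 160×32 1-bit frame that is 32 × (1 filter byte + 20 data bytes) = 672 payload bytes, plus 11 bytes zlib and 12 bytes IDAT chunk framing, which matches the ~740 bytes per frame mentioned above.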

Joe
  • Have you considered MJPEG (https://en.wikipedia.org/wiki/Motion_JPEG) over HTTP? This is very simple and supported by browsers. Unfortunately, there's no MPNG. But it may be simpler to add a JPG encoder to your system than to add complex streaming code. I believe many webcams work like that. – Simon Mourier Sep 13 '17 at 13:01
  • Are you running linux on your embedded device? You prefer to stream in a raw format or do you like to use an encoder? – Maarten Arits Sep 13 '17 at 16:02
  • No OS at all, it's bare-metal. So cannot use ffmpeg, avconv et cetera. – Joe Sep 13 '17 at 16:43
  • @Simon: This pushes me to need to create a JPEG encoder first. I checked the JPEG format description, but it doesn't seem easy in plain C without any libraries. – Joe Sep 13 '17 at 16:47
  • Well, I guess it's JPG (encoding only) vs HLS vs RTSP... JPG has many open source C implementations. This one is super light, monochrome only https://github.com/Moodstocks/jpec but could be enough in your context. There's also this: http://www.jonolick.com/code.html – Simon Mourier Sep 13 '17 at 17:26
  • @Joe: Emit the animation in [APNG](https://en.wikipedia.org/wiki/Animated_Portable_Network_Graphics) format. This should require minimal effort from your microcontroller, but allow most browsers to view the animation directly. – Nominal Animal Sep 13 '17 at 21:23
  • @Nominal Animal: I tried out your APNG suggestion and made it work, but only with Firefox. I added further details about this to my question above. Will try JPEG → MJPEG next … needs some hours of research, coding and testing ;-) – Joe Sep 14 '17 at 18:37
  • @Joe: If you wanted maximal compatibility, GIF animation would be your best bet, but unfortunately the compressed data is different to PNG. In fact, if I were you, I'd consider switching to GIF entirely. (Fortunately, GIF is no longer encumbered by patents, and is actually even easier to implement than PNG support.) – Nominal Animal Sep 15 '17 at 01:00
  • Now tried out animated GIF, and it also doesn't look sufficiently good; added details to my question above. – Joe Sep 17 '17 at 19:35
  • Do you need a single consumer, or multiple consumers (displays)? What is the max latency you can forgive? – Alex Cohn Sep 17 '17 at 19:50
  • Multiple consumers would be a nice feature, but a single client is OK for my actual requirements (I even think multiple-consumer delivery would mean giving up HTTP for transport). Latency should be at most about 1 s. Clients should not cache or buffer. In case of low connection speed it's better to lose or skip frames instead of growing latency. I currently also do frame skipping in my APNG and animated GIF tryouts to keep latency low. – Joe Sep 18 '17 at 06:51
  • How about providing the raw framebuffer data as a binary (or base64) blob via AJAX, with the HTML (Canvas) doing the graphics work? It might work best if you used two HTTP connections, one for the streaming frame blobs, and the other streaming the other way, acknowledging each received full frame. The server would have one full XML blob buffer, and can update the frame data in-place. The server would only update the frame once per every received byte (from the client-to-server HTTP stream), to avoid the case where too many frames in flight cause the client to lag behind. – Nominal Animal Sep 18 '17 at 12:45
  • Websockets would be another option, but the handshake part needs calculation of a SHA-1 hash. You could then send the entire framebuffer as-is, in a single binary chunk (with a fixed 8-byte header), without other processing, with just 8 bytes of overhead per frame. The client-to-server acks would be 8 bytes long. For the HTML5 Canvas side, you can use a 256-entry table to expand each framebuffer byte to 24 bytes of [ImageData data](https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API/Tutorial/Pixel_manipulation_with_canvas). I think this is the most promising approach myself. – Nominal Animal Sep 18 '17 at 13:17
  • If I understand right, both of those approaches need matching JavaScript code on the HTML page, don't they? So they would not result in a URI which could be used for streaming via e.g. vlc. However, it could be a solution if I drop the vlc streaming requirement. – Joe Sep 18 '17 at 15:46
  • Stumbled into this conversation.... At first glance I would drop TCP/IP and drop the graphics rendering from the constraints on the embedded device. Have you considered just letting it blast out a UDP packet with the live data (just variables, no graphics?) You will pick up about 2 to 3 orders of magnitude faster execution in the embedded device and likely drop your lag significantly. It will require the listener(s) to render the data however they see fit. At this frame rate you can count any dropped packets as a sort of connection health metric. – grambo Sep 18 '17 at 16:58
  • @Joe: You could use an intermediate cache server, that obtains the websocket streams from one or more devices, and provides the data in various formats, transcoded. Then again, there is then no need to use websockets, as the cache server could just receive the raw frames as UDP datagrams (for 160x32 1bpp, 640 bytes per datagram); and no need to integrate a web server to the embedded device. – Nominal Animal Sep 19 '17 at 10:22
  • 160x32 1bpp is just an example configuration. Display size can be larger so that the frame buffer content not longer fits into a ~1.5k ethernet / IP frame. – Joe Sep 19 '17 at 15:57
  • I found an explanation for why GIF size blows up for 1 bit images [here](http://manual.freeshell.org/libungif/UNCOMPRESSED_GIF), in the message from Dr. Tom Lane. – Evidlo May 05 '20 at 20:26

2 Answers


First of all, low-level embedded devices and very complex modern web browsers are not a good match; it is generally a bad idea to "connect" two such different worlds. But if your requirements really demand it...

MJPEG is well known for streaming video, but in your case it is a very bad fit, as it requires a lot of CPU resources and produces a poor compression ratio with a high impact on graphics quality. That is the nature of JPEG compression: it is best with photographs (images with many gradients) but bad with pixel art (images with sharp lines).

Looks that they don't discard old / obsolete frames.

And this is correct behavior, since this is not a video format but an animation format, which can be repeated! It will be exactly the same with the GIF format. MJPEG may fare better, as it is established as a video stream format.

If I were doing this project, I would do something like this:

  1. No browser at all. Write a very simple native player with WinAPI or some low-level library that just creates a window, receives UDP packets and displays the binary data. On the controller side, you just fill UDP packets and send them to the client. UDP is the better protocol for real-time streaming: it drops packets (frames) in case of latency and is very simple to handle.

  2. Stream over TCP, but as raw data (1 bit per pixel). TCP will always introduce some latency and buffering; you can't avoid that. Same as before, but you don't need a handshake mechanism to start the video stream. You could also write your client in good old technologies like Flash or Java applets, read a raw socket, and embed the app in a web page.

  3. You can try to stream AVI files with raw data over TCP (HTTP). Without indexes it will be unplayable almost everywhere except VLC. A strange solution, but if you can't write client code and want VLC, it will work.

  4. You can run a transcoder on an intermediate server. For example, your controller sends UDP packets to this server, the server transcodes them to H.264 and streams via RTMP to YouTube... Your clients can then play it with browsers or VLC, and the stream will be in good quality at up to a few Mbit/s. But you need a server.

  5. And finally, what I think is the best solution: send the client only text, coordinates, animations and so on, i.e. everything your controller renders. With Emscripten you can convert your sources to JS and run exactly the same renderer in the browser. As transport you can use websockets, or some tricks with a long-lived HTML page containing multiple <script> elements, like we did in the old days.
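The UDP idea in option 1 mainly needs a tiny framing header so the receiver can detect dropped or reordered frames. A sketch of the controller side (the header layout and helper name are assumptions for illustration, not from the answer; sending would go through the UDP layer of the device's TCP/IP stack):

```c
#include <stdint.h>
#include <string.h>

/* Build one UDP payload for a frame: an 8-byte header followed by the
 * raw 1 bpp framebuffer.  Header layout (all multi-byte fields
 * big-endian): "FB" magic (2), width in pixels (2), height in
 * pixels (2), wrapping frame counter (2).  Gaps in the counter on the
 * receiver side indicate dropped frames. */
static size_t build_frame_packet(uint8_t *out, uint16_t w, uint16_t h,
                                 uint16_t seq, const uint8_t *fb, size_t fb_len)
{
    out[0] = 'F';  out[1] = 'B';
    out[2] = w   >> 8;  out[3] = w   & 0xFF;
    out[4] = h   >> 8;  out[5] = h   & 0xFF;
    out[6] = seq >> 8;  out[7] = seq & 0xFF;
    memcpy(out + 8, fb, fb_len);
    return 8 + fb_len;
}
```

As noted in the comments, a 160×32 1 bpp frame is only 640 bytes and fits one Ethernet frame, while larger display sizes would additionally need fragmenting across several datagrams.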

Please tell me which country/city has this public traffic passenger information display. It looks very cool. In my city every bus already has an LED panel, but it only shows static text; it's awful that the huge potential of these devices goes unused.

bukkojot
  • We already have some special client software. The video streaming capability is just an additional way to view the display content on devices where our client software is not installed, e.g. smartphones et cetera. This should work without an additional intermediate server. We have lots of installations around Europe. As our development and manufacturing is in Berlin, Germany, most installations are in Germany (Deutsche Bahn, Berlin U-Bahn et cetera), but also for car traffic information, car parking and so on (see [www.eeo-gmbh.de](http://www.eeo-gmbh.de)). What's your city? – Joe Sep 19 '17 at 15:52
  • About "Looks that they don't discard old / obsolete frames": it was one of my apprehensions about APNG and GIF that clients might behave like this, even though I mark the animations as not repeating / looping the frames. – Joe Sep 20 '17 at 09:04

Have you tried just piping this through a websocket and handling the binary data in JavaScript?

Every websocket frame sent would match a frame of your animation.

You would then take this data and draw it onto an HTML canvas. This would work on every browser with websocket support, which is quite a lot of them, and would give you all the flexibility you need (and the player could be more high-end than the "encoder" in the embedded device).
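On the embedded side, the per-frame overhead of this approach is tiny: after the HTTP upgrade handshake, each binary WebSocket frame needs only a short header (per RFC 6455; server-to-client frames are unmasked). A sketch of building that header (illustrative helper, not from the answer):

```c
#include <stdint.h>
#include <stddef.h>

/* Build the server-to-client WebSocket header for one binary payload.
 * For payloads under 126 bytes the header is 2 bytes; for 126..65535
 * bytes it is 4 bytes: FIN+opcode, the length marker 126, then the
 * 16-bit payload length in network byte order.  Returns the header
 * size; the raw framebuffer follows immediately after it. */
static size_t ws_binary_header(uint8_t out[4], uint16_t payload_len)
{
    out[0] = 0x82;                 /* FIN = 1, opcode 0x2 = binary frame */
    if (payload_len < 126) {
        out[1] = (uint8_t)payload_len;
        return 2;
    }
    out[1] = 126;                  /* 16-bit extended payload length follows */
    out[2] = payload_len >> 8;
    out[3] = payload_len & 0xFF;
    return 4;
}
```

So a 160×32 1 bpp frame (640 bytes) costs just 4 header bytes per animation frame, plus the one-time SHA-1 computation for the opening handshake.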

light_303
  • This approach means giving up a standard protocol, so I could not e.g. connect vlc to stream the live picture from my microcontroller. – Joe Sep 19 '17 at 15:39
  • I think the cross-section between efficient video formats that do not require compression and that play in all browsers plus VLC is not very large. So probably having two specialized HTTP endpoints would be easier to implement than a "one-fits-all" solution – light_303 Sep 19 '17 at 15:44