
I am about to grab the video output of my Raspberry Pi to feed it into an Adalight-style ambient lighting system. XBMC's video player for the Pi, omxplayer, uses the OpenMAX API for decoding and other functions.

Looking into the code gives the following:
m_omx_tunnel_sched.Initialize(&m_omx_sched, m_omx_sched.GetOutputPort(), &m_omx_render, m_omx_render.GetInputPort());

As far as I understand, this sets up a pipeline (an OpenMAX tunnel) between the video scheduler and the renderer: [S]-->[R].
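
As I understand it, the Initialize() wrapper presumably boils down to the standard OMX_SetupTunnel call from the IL core API; a minimal sketch (the port numbers 11 and 90 are the Broadcom video_scheduler output and video_render input as I understand them, and error handling is omitted):

    #include <IL/OMX_Core.h>

    // Sketch: tunnel the scheduler's output port into the renderer's
    // input port, so decoded frames flow GPU-side and never touch
    // the ARM. Port numbers are assumptions for the Pi firmware.
    OMX_ERRORTYPE tunnel_sched_to_render(OMX_HANDLETYPE sched,
                                         OMX_HANDLETYPE render)
    {
        return OMX_SetupTunnel(sched, 11,   // video_scheduler output
                               render, 90); // video_render input
    }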

Now my idea is to write a grabber component and plug it into the pipeline: [S]-->[G]-->[R]. The grabber would extract the pixels from the frame buffer and pass them to a daemon that drives the LEDs.
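
In IL terms the splice would presumably be two tunnels instead of one; a purely hypothetical sketch, assuming a grabber component handle already exists (its port numbers 0/1 are made up here):

    #include <IL/OMX_Core.h>

    // Hypothetical: splice a grabber component G between scheduler
    // and renderer, i.e. [S]-->[G]-->[R] as two tunnels. The grabber
    // and its port numbering do not exist; they are placeholders.
    OMX_ERRORTYPE splice_grabber(OMX_HANDLETYPE sched,
                                 OMX_HANDLETYPE grabber,
                                 OMX_HANDLETYPE render)
    {
        OMX_ERRORTYPE err = OMX_SetupTunnel(sched, 11, grabber, 0);
        if (err != OMX_ErrorNone)
            return err;
        return OMX_SetupTunnel(grabber, 1, render, 90);
    }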

Now I am about to dig into the OpenMAX API, which seems to be pretty weird. Where should I start? Is this a feasible approach?

Best Regards

Stasik

2 Answers


If you want the decoded data, then just don't send it to the renderer: instead of rendering, take the buffers and do whatever you want with them. The decoded frames should be taken from the output port of the video_decode OpenMAX IL component. You'll probably also need to configure that output port with the pixel format you need, so that the conversion is done by the GPU (YUV or RGB565 are available).
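
A minimal sketch of what that port configuration might look like at the IL level, assuming port 131 is the video_decode output on the Pi firmware (whether the decoder honors a given format is something to verify; error handling is trimmed):

    #include <IL/OMX_Core.h>
    #include <IL/OMX_Component.h>
    #include <string.h>

    // Sketch: ask video_decode to emit RGB565 on its output port
    // instead of the default YUV420. 'decoder' is a handle obtained
    // earlier via OMX_GetHandle().
    static OMX_ERRORTYPE set_output_rgb565(OMX_HANDLETYPE decoder)
    {
        OMX_PARAM_PORTDEFINITIONTYPE portdef;
        memset(&portdef, 0, sizeof(portdef));
        portdef.nSize = sizeof(portdef);
        portdef.nVersion.nVersion = OMX_VERSION;
        portdef.nPortIndex = 131; // video_decode output (assumed)

        OMX_ERRORTYPE err = OMX_GetParameter(decoder,
                OMX_IndexParamPortDefinition, &portdef);
        if (err != OMX_ErrorNone)
            return err;

        portdef.format.video.eColorFormat = OMX_COLOR_Format16bitRGB565;
        return OMX_SetParameter(decoder,
                OMX_IndexParamPortDefinition, &portdef);
    }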

Luca Carlon
  • Is there some event handler that I can get called when frame decoding is finished? – Stasik Dec 21 '12 at 15:31
  • How do I actually read the data out of the decoder? Just by calling GetOutputBuffer()? – Stasik Dec 21 '12 at 15:37
  • The FillBufferDone callback will receive the buffers when the output port is not tunneled. Read the OpenMAX documentation for more information (a minimal callback sketch follows these comments). – Luca Carlon Dec 21 '12 at 16:32
  • So I could also do it with a video_splitter. A second approach would be to add a write_still (image capture) component and write the single images to a RAM drive. Do you think that is much more resource-wasting than reading the image data inside the player? – Stasik Dec 21 '12 at 16:42
  • Why would you do that? Do you prefer a file instead of a buffer? It would at least add a buffer copy. And then it depends on the settings. – Luca Carlon Dec 21 '12 at 17:08
  • Well, writing to /dev/shm would allow me to decouple the image extraction from the image analysis, which can poll that file at a lower rate/priority. – Stasik Dec 21 '12 at 20:27
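
To illustrate the FillBufferDone point from the comments above, a minimal callback sketch; grab_frame() is a hypothetical hook, not part of any API:

    #include <stddef.h>
    #include <IL/OMX_Core.h>

    // Hypothetical hook: forward the frame to the LED daemon, e.g.
    // over a socket or via /dev/shm. Placeholder only.
    static void grab_frame(const OMX_U8 *pixels, OMX_U32 len)
    {
        (void)pixels; (void)len;
    }

    // When the decoder's output port is NOT tunneled, the IL client
    // owns the output buffers and gets each filled buffer back here.
    static OMX_ERRORTYPE fill_buffer_done(OMX_HANDLETYPE component,
                                          OMX_PTR app_data,
                                          OMX_BUFFERHEADERTYPE *buf)
    {
        // The decoded frame sits in buf->pBuffer at buf->nOffset.
        grab_frame(buf->pBuffer + buf->nOffset, buf->nFilledLen);

        // Hand the buffer straight back so decoding keeps flowing.
        return OMX_FillThisBuffer(component, buf);
    }

    static OMX_CALLBACKTYPE callbacks = {
        NULL,             // EventHandler (omitted in this sketch)
        NULL,             // EmptyBufferDone
        fill_buffer_done, // FillBufferDone
    };
    // 'callbacks' would be passed to OMX_GetHandle() when creating
    // the video_decode component.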

At first, I think you should attach a buffer to the output of the camera component, do everything you want with that frame on the CPU, and then send the frame through a buffer attached to the input port of the renderer. It's not going to be a trivial task, since there is little documentation about OpenMAX on the Raspberry Pi.
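
A rough sketch of the second half of that idea, assuming port 90 is the video_render input on the Pi; note that in a real client OMX_UseBuffer must be called while the port is being enabled (during the Loaded-to-Idle transition), so this compresses the sequence:

    #include <IL/OMX_Core.h>

    // Sketch: wrap a CPU-side frame in an OMX buffer header and push
    // it into video_render's input port. 'render' comes from
    // OMX_GetHandle(); state transitions and error paths are omitted.
    static OMX_ERRORTYPE push_frame(OMX_HANDLETYPE render,
                                    OMX_U8 *frame, OMX_U32 frame_size)
    {
        OMX_BUFFERHEADERTYPE *hdr = NULL;

        // Attach our own memory to the port instead of letting the
        // component allocate it (OMX_AllocateBuffer is the alternative).
        OMX_ERRORTYPE err = OMX_UseBuffer(render, &hdr, 90, NULL,
                                          frame_size, frame);
        if (err != OMX_ErrorNone)
            return err;

        hdr->nFilledLen = frame_size;
        hdr->nOffset = 0;

        // Hand the frame to the renderer; EmptyBufferDone fires when
        // the component is finished with it.
        return OMX_EmptyThisBuffer(render, hdr);
    }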

Best place to start: https://jan.newmarch.name/RPi/

Best place to have at hand: http://home.nouwen.name/RaspberryPi/documentation/ilcomponents/index.html

Next best place: source code distributed across the internet.

Good luck.