
I have a raw video file (testvideo_1000f.raw) that I am trying to stream in grayscale using ffmpeg and output the grayscale video to output.swf. The command I am using to do this is:

ffmpeg/ffmpeg -qmin 2 -qmax 31 -s 320x240 -f rawvideo -flags gray -pix_fmt:output gray -an -i testvideo_1000f.raw output.swf

However, the result of this command is a video stream that is in grayscale but still contains some of the chrominance data. The output from this command is pasted below:

    3 [volta]/home/student/elliott> ffmpeg/ffmpeg -qmin 2 -qmax 31 -s 320x240 -f rawvideo -flags gray -pix_fmt:output gray -an -i testvideo_1000f.raw output.swf
ffmpeg version N-41632-g2b1fc56 Copyright (c) 2000-2012 the FFmpeg developers
  built on Jul 29 2012 10:27:26 with gcc 4.1.2 20080704 (Red Hat 4.1.2-51)
  configuration: 
  libavutil      51. 58.100 / 51. 58.100
  libavcodec     54. 25.100 / 54. 25.100
  libavformat    54.  6.101 / 54.  6.101
  libavdevice    54.  0.100 / 54.  0.100
  libavfilter     2. 80.100 /  2. 80.100
  libswscale      2.  1.100 /  2.  1.100
  libswresample   0. 15.100 /  0. 15.100
*** CHOOSING 8
[rawvideo @ 0xdda9660] Estimating duration from bitrate, this may be inaccurate
Input #0, rawvideo, from 'testvideo_1000f.raw':
  Duration: N/A, start: 0.000000, bitrate: N/A
   Stream #0:0: Video: rawvideo (Y800 / 0x30303859), gray, 320x240, 25 tbr, 25 tbn, 25 tbc
File 'output.swf' already exists. Overwrite ? [y/N] y
w:320 h:240 pixfmt:gray tb:1/25 fr:25/1 sar:0/1 sws_param:flags=2
[ffmpeg_buffersink @ 0xddb7b40] No opaque field provided
[format @ 0xddb7d40] auto-inserting filter 'auto-inserted scaler 0' between the filter 'Parsed_null_0' and the filter 'format'
[auto-inserted scaler 0 @ 0xddb7920] w:320 h:240 fmt:gray sar:0/1 -> w:320 h:240 fmt:yuv420p sar:0/1 flags:0x4
*** CHOOSING 8
Output #0, swf, to 'output.swf':
  Metadata:
    encoder         : Lavf54.6.101
   Stream #0:0: Video: flv1, yuv420p, 320x240, q=2-31, 200 kb/s, 90k tbn, 25 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo -> flv)
Press [q] to stop, [?] for help
Truncating packet of size 76800 to 1 2875kB time=00:00:40.84 bitrate= 576.7kbits/s    
frame= 1500 fps=1035 q=24.8 Lsize=    4194kB time=00:01:00.00 bitrate= 572.6kbits/s    
video:4166kB audio:0kB global headers:0kB muxing overhead 0.669245%

I am fairly new to FFmpeg, and I am afraid I am using either the wrong syntax or the wrong parameters on my command line. For some reason, the format of the output is yuv420p. I have tried searching for this answer all over but have had no luck. Could anyone please help me and tell me why the output is being formatted as yuv420p when I am giving the command for it to be in 8-bit grayscale? Any help would be greatly appreciated. Thank you.

Marc Elliott

user1657208
  • I should also note that I am trying to get a grayscale video by completely deleting the chrominance information, not by just zeroing it out. I want to transmit only the luminance data. Thanks again. – user1657208 Sep 08 '12 at 23:25
  • See http://stackoverflow.com/q/8349352/220060 for similar information. – nalply Oct 01 '12 at 12:32

3 Answers

ffmpeg -i VTS_05_1.VOB -pix_fmt gray -vcodec rawvideo -f yuv4mpegpipe - | ffmpeg -y -f yuv4mpegpipe -i - -vcodec libtheora out.avi
malat

No ffmpeg flags will allow you to do this.

Video formats are designed for YUV, not for Y only, so you will not be able to do this without modifying your approach. You will have to use MJPEG to get a Y-only stream. MJPEG supports 8-bit grayscale output, but I don't think MJPEG can be put in an SWF. It can go into an MP4 or TS if that suffices for your purpose.
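
A minimal sketch of that approach, assuming the 320x240 gray raw input from the question and assuming the mjpeg encoder in your build accepts the gray pixel format (it may refuse it or silently fall back to yuvj420p; ffmpeg -h encoder=mjpeg lists what a given build supports). Note that -pix_fmt gray is repeated after -i so that it applies to the output rather than the input:

ffmpeg -f rawvideo -pix_fmt gray -s 320x240 -i testvideo_1000f.raw -an -c:v mjpeg -pix_fmt gray output.mp4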

Of course, the other option is that the decode/display side only decodes/displays the luminance and not the chrominance. Again, that is a custom requirement and not supported directly.
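
If the playback-side route is acceptable, a sketch of it, assuming a build new enough to include the extractplanes filter (it postdates the 2012 build shown in the question), is to decode normally and keep only the Y plane at display time:

ffplay -vf extractplanes=y output.swf

This still decodes the full YUV frames and merely discards the chrominance before display, so it does not reduce the amount of data transmitted.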

av501
  • Thanks for your input. If video formats are not designed for luminance only, how are the gray pixel formats that are defined in ffmpeg (GRAY8, GRAY16LE, GRAY16BE, etc.) implemented? It looks like the structure for these formats has only one component. Is this incorrect? – user1657208 Sep 11 '12 at 17:21
  • ffmpeg defines various formats. Not all of them work with all codecs. E.g., JPEG can come in GRAY8, but MPEG-2 cannot. – av501 Sep 11 '12 at 17:52
  • Is there a list of which codecs support which pixel formats anywhere in the ffmpeg documentation? – user1657208 Sep 11 '12 at 18:01
  • @user1657208, no, because it's to do with the codecs and their capabilities rather than ffmpeg as such. In general, images will have most formats supported. Most standard video codecs [MPEG1/2/4, H.264, WMV7/9, RV, H.263] are YUV420 format only. Some support 422/444 as well (see the sketch just after these comments). – av501 Sep 11 '12 at 18:06
  • In your original response, you said that the MJPEG codec could get just the Y component. However, when I use this codec, I get a resulting video that uses the yuvj420p format. An output MP4 file would be fine with me if I could get it to work. Could you be so kind as to specify how I could get the Y using MJPEG? The command I am using is: ffmpeg/ffmpeg -qmin 2 -qmax 31 -s 320x240 -f rawvideo -flags gray -pix_fmt:output gray -an -i testvideo_1000f.raw -vcodec mjpeg output.mp4 Thanks again! – user1657208 Sep 12 '12 at 02:52
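
For anyone who wants to check a particular build: ffmpeg can list the pixel formats it knows about and show what an individual encoder accepts. A quick sketch (the encoder names are just examples):

ffmpeg -pix_fmts
ffmpeg -h encoder=mjpeg
ffmpeg -h encoder=flv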

I know I am way late to this, but I figure others might be interested. Just for clarity: grayscale does have a YUV format, 4:0:0. Given that, you can look for codecs that support that format; HEVC is an example of one that does, under certain profiles:

https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding#Profiles

While FFmpeg does not have a pixel format named 4:0:0, it does have gray. You can use it like this:

ffmpeg -i .\INPUT.mkv -pix_fmt gray -c:v libx265 -preset medium -b:v 7000k .\OUTPUT.mkv

If the encoder supports it, the output will look like this:

Stream #0:0(eng): Video: hevc, gray(pc, progressive), 1480x1080 [SAR 1:1 DAR 37:27], q=2-31, 7000 kb/s, 23.98 fps, 1k tbn

Notice the gray pixel format instead of yuv420p.
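
To double-check the result rather than reading the pixel format off the console banner, ffprobe can report it directly. A sketch, with OUTPUT.mkv standing in for whatever file you produced:

ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1 OUTPUT.mkv

If the chroma planes were really dropped, this should print pix_fmt=gray rather than pix_fmt=yuv420p.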