
Most 1080p Blu-ray rips that I've seen are in the range of 30-45 GB for a bit-for-bit copy.

Downconverted "1080p" videos, on the other hand, are much smaller: maybe as small as 2.5 GB, or around 5 GB or 8 GB.

As I understand it, we have these basic video parameters (see the sketch after this list for one way to inspect them):

  1. frame size (1920*1080)
  2. frame rate
  3. color depth?
  4. codec / compression standard used (H.264, H.265, etc.)
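
A rough sketch of how to check these for any file, assuming FFmpeg's ffprobe is on the PATH (the file name is a placeholder). Note that `pix_fmt` reflects the color depth (e.g. `yuv420p` is 8-bit, `yuv420p10le` is 10-bit), and that some containers don't report a per-stream `bit_rate`:

    import json
    import subprocess

    def probe_video(path):
        """Return basic parameters of the first video stream via ffprobe."""
        cmd = [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries",
            "stream=codec_name,width,height,r_frame_rate,pix_fmt,bit_rate",
            "-of", "json",
            path,
        ]
        out = subprocess.run(cmd, capture_output=True, text=True, check=True)
        return json.loads(out.stdout)["streams"][0]

    # "movie_1080p.mkv" is a hypothetical file name.
    info = probe_video("movie_1080p.mkv")
    print(info)
    # e.g. {'codec_name': 'h264', 'width': 1920, 'height': 1080,
    #       'r_frame_rate': '24000/1001', 'pix_fmt': 'yuv420p', ...}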

Now, sometimes when I open up a "1080p" movie file, the frame size is indeed 1920*1080 and the frame rate is approximately 24 fps, the same as the bit-for-bit Blu-ray rip.

However, the video itself is definitely not of the same quality: blacks are not as deep, and there is some extra grain.

So my question is: what information is lost in a lossy downconversion, even though the frame size and frame rate stay the same? Is the effective frame size actually much smaller, with multiple identical pixels being used as filler? Or is it something else?
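
As a back-of-the-envelope illustration (assuming a 2-hour film and ignoring audio and container overhead), the frame size and frame rate fix the total pixel count, so the only thing that shrinks is the average bit budget per pixel:

    def bits_per_pixel(file_gb, hours, fps=24.0, width=1920, height=1080):
        """Average coded bits available per pixel of video."""
        total_bits = file_gb * 8e9                           # GB -> bits
        total_pixels = hours * 3600 * fps * width * height
        return total_bits / total_pixels

    print(bits_per_pixel(35.0, 2.0))   # full rip:    ~0.78 bits per pixel
    print(bits_per_pixel(3.5, 2.0))    # downconvert: ~0.078 bits per pixel

For reference, uncompressed 8-bit 4:2:0 video is 12 bits per pixel, so even the full rip is already compressed roughly 15:1; the downconvert has to get by on about a tenth of that budget, roughly 150:1.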

  • The H.264 and H.265 codecs have adjustable quality levels, often expressed as a bits-per-second target. That won't affect black levels, but it can certainly affect graininess, just like the JPEG quality setting. It should be clear that if the file is 10 times smaller, you have lost an awful lot of data. – Tim Roberts Sep 04 '22 at 06:27
  • @TimRoberts Agreed, we lose data. But what is that data, exactly? Is it lost unique pixels? So even though the frame size is still 1920*1080, that many unique pixels are not actually represented in the data? Is that it? What parameters make up the "quality", if the FPS is still the same? – Sujay Phadke Sep 04 '22 at 06:28
  • There are many large books on how the MPEG codecs work; it can't all be explained here. Most MPEG frames are actually "motion vectors", which build a new frame by copying tiny sections of previous frames. As quality goes down, the encoder picks looser and looser matches for those vectors. – Tim Roberts Sep 04 '22 at 06:32
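
To see the effect described in the comments in practice, here is a minimal sketch (assuming ffmpeg with libx264 is on the PATH; file names are placeholders) that re-encodes one source at several quality levels. The frame size and frame rate are untouched; only the encoder's bit budget shrinks as CRF rises:

    import os
    import subprocess

    SRC = "bluray_rip.mkv"  # hypothetical source file

    # Same resolution and frame rate throughout; only CRF (quality) changes.
    # Higher CRF = fewer bits = coarser motion-vector matches, blockier detail.
    for crf in (18, 23, 28, 33):
        dst = f"reencode_crf{crf}.mkv"
        subprocess.run([
            "ffmpeg", "-y", "-i", SRC,
            "-c:v", "libx264", "-crf", str(crf),
            "-an",              # drop audio so sizes reflect video alone
            dst,
        ], check=True)
        print(dst, os.path.getsize(dst) / 1e9, "GB")

All four outputs still report 1920*1080 at the source frame rate, but the high-CRF files show the blocking and lost grain detail that a small bit budget forces.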

0 Answers