
For example, C# says that the selected image is 96 ppi, while Photoshop says that same image is 72 ppi.

Why is there a difference?

I’m inclined to trust Photoshop in this case. How can I test the image resolution if C# returns incorrect results?

We need to build some sort of validator control that rejects all images with ppi != 300.

The control should support the following formats: jpg, jpeg, gif, png, bmp.

Code is listed below:

using System.Drawing;

// Load the image and print the resolution values GDI+ reports
using (Image i = Image.FromFile(FileName))
{
    Console.Write(i.VerticalResolution);
    Console.Write(i.HorizontalResolution);
}
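A minimal sketch of such a validator (illustrative names; assuming the resolution GDI+ reports is trustworthy, which the answers below call into question) might look like:

```csharp
using System;
using System.Drawing;

class PpiValidator
{
    // Illustrative sketch: accept only images whose reported resolution is ~300 ppi.
    // The reported value may come from embedded metadata or from a system default,
    // so this check is only as reliable as GDI+'s source for that value.
    public static bool HasRequiredPpi(string fileName, float requiredPpi = 300f, float tolerance = 0.5f)
    {
        using (Image img = Image.FromFile(fileName))
        {
            return Math.Abs(img.HorizontalResolution - requiredPpi) < tolerance
                && Math.Abs(img.VerticalResolution - requiredPpi) < tolerance;
        }
    }
}
```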

2 Answers


DPI means dots (pixels) per inch. The physical size in inches is subjective, based on the current monitor's size and resolution. Unless you're relying on metadata (which gif and bmp don't contain) you cannot reliably calculate this.

Photoshop simply has a prescribed value for DPI, which it uses when translating images for print. This value is stored in the PSD file and may be copied to JPEG metadata, but if you save the image in a format without DPI metadata, the information is not stored.

Update:

The reason your code gets a different value is that C# fetches its VerticalResolution and HorizontalResolution values from the current DPI setting on the computer. Photoshop's DPI is for use with print, so it knows the physical dimensions if you want to send your image to a printer. It has a default value of 72dpi, but you can change this. The value has no meaning on a screen, though, since screens deal in pixels only.

Polynomial
  • That's correct, however DPI != PPI. Even though _dots per inch_ and _pixels per inch_ are used interchangeably by many, they are not the same thing. – Paolo Moretti Dec 14 '11 at 15:04
  • Although screens do have their own DPI as screens have a physical size. So it's possible for an application to let you select the display resolution in DPI such that an image is scaled to appear a particular measured size based upon screen size. – Paul Ruane Dec 14 '11 at 15:06
  • DPI does not correspond directly with PPI because a printer may put down several dots to reproduce one pixel. Basically the term DPI refers to the resolution of the printing device, where PPI refers to the resolution of the image itself. – Paolo Moretti Dec 14 '11 at 15:13
  • @PaulRuane - Monitors don't report their physical size to the OS in any meaningful way. – Polynomial Dec 14 '11 at 15:38
  • @PaoloMoretti - This is correct. However, in Photoshop, DPI translates directly to PPI on a screen and DPI on a printer. – Polynomial Dec 14 '11 at 15:40
  • @Polynomial - Can I conclude that VerticalResolution and HorizontalResolution are useless since they depend on the computer running the program and do not give the actual image resolution (ie, on different computers, will return a different result)? If C# result is not in sync with Photoshop result, then it is useless to me. – Prosinac Decembar Dec 15 '11 at 09:19
  • @ProsinacDecembar - Yes, you can assume it's useless in this case. It's entirely dependent on the user's PPI setting (some people change it for small/large monitors or reading difficulties). There's no guarantee that any image, JPEG or not, will have the metadata available. – Polynomial Dec 15 '11 at 09:24
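One way to inspect only the embedded metadata (rather than whatever GDI+ falls back to) is to read the EXIF resolution tag directly; a sketch, with `ReadXResolution` as a hypothetical helper name:

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;

class ExifDpi
{
    // Hypothetical helper: return the EXIF XResolution rational (tag 0x011A),
    // or null if the image carries no such metadata at all.
    public static double? ReadXResolution(Image img)
    {
        const int XResolutionId = 0x011A;
        foreach (PropertyItem p in img.PropertyItems)
        {
            if (p.Id == XResolutionId && p.Value.Length >= 8)
            {
                // EXIF rationals are two unsigned 32-bit integers: numerator, denominator
                uint num = BitConverter.ToUInt32(p.Value, 0);
                uint den = BitConverter.ToUInt32(p.Value, 4);
                return den == 0 ? (double?)null : (double)num / den;
            }
        }
        return null; // no resolution metadata embedded (e.g. gif, bmp)
    }
}
```

A null result distinguishes "no metadata at all" from a system-default value, which the plain `HorizontalResolution` property cannot do.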

DPI means dots per inch. A bitmap image does not have an inherent DPI, it merely has a size which is the number of pixels in the horizontal and the number of pixels in the vertical (width and height). An image only gains a resolution (in DPI) when you say how many pixels you want to squeeze into each inch.

So if I have an image that is 100 pixels wide and 100 pixels high (100px × 100px), it will be 100 DPI if I print it (or convert it into a format that dictates the print size) such that it fits exactly into one square inch (1" × 1"). It will be 50 DPI if I print it to fit a square that is two inches by two inches, etc.
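The arithmetic above can be sketched directly (illustrative names):

```csharp
using System;

class DpiExample
{
    // DPI is simply pixels divided by the physical span in inches.
    public static double Dpi(int pixels, double inches) => pixels / inches;

    static void Main()
    {
        Console.WriteLine(Dpi(100, 1.0)); // 100 px across 1 inch -> 100 DPI
        Console.WriteLine(Dpi(100, 2.0)); // 100 px across 2 inches -> 50 DPI
    }
}
```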

Paul Ruane