2nd December 2002, 09:50 AM
3rd December 2002, 12:35 PM
1) In computers, dots per inch (dpi) is a measure of the sharpness (that is, the density of illuminated points) on a display screen. The dot pitch determines the absolute limit of the possible dots per inch. However, the displayed resolution of pixels (picture elements) that is set up for the display is usually not as fine as the dot pitch. The dots per inch for a given picture resolution will differ based on the overall screen size since the same number of pixels are being spread out over a different space. Some users prefer the term "pixels per inch (ppi)" as a measure of display image sharpness, reserving dpi for use with the print medium.
2) In printing, dots per inch (dpi) is the usual measure of printed image quality on the paper. The average personal computer printer today provides 300 dpi or 600 dpi. Choosing the higher print quality usually reduces the speed of printing each page.
In computers, pixels per inch (ppi) is a measure of the sharpness (that is, the density of illuminated points) on a display screen. The dot pitch determines the absolute limit of the possible pixels per inch. However, the displayed resolution of pixels (picture elements) that is set up for the display is usually not as fine as the dot pitch. The pixels per inch for a given picture resolution will differ based on the overall screen size since the same number of pixels are being spread out over a different space. The term "dots per inch (dpi)," extended from the print medium, is sometimes used instead of pixels per inch.
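To make the definition above concrete, here is a minimal sketch of how ppi falls out of a display's pixel resolution and physical diagonal (the display sizes and resolutions are made-up illustration values, not from the definition):

```python
import math

def display_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display: pixel diagonal divided by physical diagonal."""
    pixel_diagonal = math.sqrt(width_px ** 2 + height_px ** 2)
    return pixel_diagonal / diagonal_in

# The same pixel count spread over a larger screen gives a lower ppi.
print(round(display_ppi(1024, 768, 17), 1))  # → 75.3
print(round(display_ppi(1024, 768, 21), 1))  # → 61.0
```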
In short, LPI stands for the Linux Professional Institute; you can find more
information here: http://www.lpi.org
Last edited by ninelives; 3rd December 2002 at 12:40 PM.
3rd December 2002, 12:36 PM
Not familiar with LPI.
DPI and PPI are often used interchangeably (but that's not correct).
PPI applies to IMAGES. It specifies image resolution as the number of pixels in an inch. So a 300ppi image has 300 image pixels in 1 inch of the image. PPI makes no sense if the dimension (in inches) is not specified. Therefore, the phrase "make your image 72ppi" does not make sense on its own. A 2048 x 1536 image can be 72ppi, and so can a 640 x 480, or even a 72 x 72 image. It only makes sense when specified like "4 x 6 inches at 300ppi", etc.
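To illustrate the point that a ppi figure only pins down a physical size once the pixel dimensions are also known, here is a quick sketch (the helper name and numbers are mine, chosen to echo the examples above):

```python
def print_size_inches(width_px, height_px, ppi):
    """Physical size implied by an image's pixel dimensions at a given ppi."""
    return width_px / ppi, height_px / ppi

# "72ppi" alone says nothing about size: these are all 72ppi images.
print(print_size_inches(2048, 1536, 72))  # roughly 28.4 x 21.3 inches
print(print_size_inches(640, 480, 72))    # roughly 8.9 x 6.7 inches
print(print_size_inches(72, 72, 72))      # → (1.0, 1.0), i.e. 1 x 1 inch
```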
DPI is used in printing, where your printout is defined by DOTS, not pixels. This is where the confusion comes in. Unlike with images, the term "1440dpi printout" makes sense on its own because the actual output size is known (you are holding and looking at the actual print).
Printer resolution and image resolution are different. It can take more than one dot to make one pixel on paper. That's why a printer can be rated at 1440dpi while the actual optimal image resolution is closer to 240~360ppi: each pixel needs more than one dot to be printed at its particular colour.
For optimal output, you need anywhere from 150 to 300ppi of IMAGE resolution. Higher than that, there won't be much visible difference. This 150-300ppi of image information is then printed at the PRINTER resolution of e.g. 1440dpi.
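A small sketch of the arithmetic in the two paragraphs above (the 1440dpi and 300ppi figures come from the posts; the helper names are mine):

```python
def dots_per_pixel(printer_dpi, image_ppi):
    """Linear number of printer dots available to render each image pixel."""
    return printer_dpi / image_ppi

def pixels_needed(width_in, height_in, ppi):
    """Image pixel dimensions required for a given print size at a given ppi."""
    return round(width_in * ppi), round(height_in * ppi)

# At 1440dpi printer resolution and 360ppi image resolution, each pixel
# gets 4 dots across (16 dots in area) to approximate its colour.
print(dots_per_pixel(1440, 360))  # → 4.0

# A 4 x 6 inch print at 300ppi needs a 1200 x 1800 pixel image.
print(pixels_needed(4, 6, 300))   # → (1200, 1800)
```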
3rd December 2002, 01:13 PM
LPI (lines/inch), not DPI, is used to measure halftone.
If your scanner has a "descreen" or "de-moire" function, notice that it is measured in LPI instead of DPI, because "descreen" is used for removing halftone/moire patterns from some prints (matte prints, newspapers, etc.).
3rd December 2002, 01:23 PM
Halftone, in layman's terms, is the use of the relative density of tiny dots to reproduce photographic images on paper or other media. It is commonly used in newspapers, magazines, etc. Look carefully at the details of a picture in a newspaper: the image isn't very smooth but is rendered using tiny dots of varying densities.
LPI sort of measures this density IIRC.
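As a rough illustration of how lpi relates back to scanning resolution, here is a sketch using the common "quality factor" rule of thumb (scan at roughly 1.5 to 2 times the halftone screen frequency); the rule and the lpi figures are general printing-industry conventions, not from the posts above:

```python
def suggested_scan_ppi(screen_lpi, quality_factor=2.0):
    """Rule-of-thumb scan resolution for a halftoned original:
    roughly 1.5 to 2 times the halftone screen frequency (lpi)."""
    return screen_lpi * quality_factor

# Typical screen frequencies: newspapers around 85 lpi,
# magazines around 133 to 150 lpi.
print(suggested_scan_ppi(85))   # → 170.0
print(suggested_scan_ppi(150))  # → 300.0
```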