Talk:Pixel density

HiDPI

What is HiDPI? Wikipedia redirects searches for HiDPI to this page, but the term appears nowhere in this article. — Preceding unsigned comment added by 67.163.157.221 (talk) 03:51, 12 November 2013 (UTC)

It's defined in Named pixel densities. 71.178.175.63 (talk) 17:15, 4 November 2020 (UTC)

Definition

The article said:

  "For example, a display that is 11 inches wide by 8.5 inches high, capable of a maximum 1024 by 768 pixel resolution, can display about 93 PPI in both the horizontal and vertical directions."

...changed it to 788, as 768/8.5 gives 90.3, not 93... —Preceding unsigned comment added by 132.207.137.202 (talk) 15:57, 12 June 2008 (UTC)

Can anyone explain what this "Pixels per inch" has been used for?

It could also be mentioned that pixels per inch has spawned the myth that graphics for the screen should be in 72 dpi, since this is supposedly the screen's resolution. More about it here: http://www.scantips.com/no72dpi.html Kasper Hviid 12:30, 10 Nov 2004 (UTC)

I'm not sure it's a myth, so much as a misunderstanding of purpose.

The author of the article you linked to appears to have a great deal of misunderstanding about DPI, as well. The article states "There is no concept of dpi on the video screen," which is, strictly speaking, true - DPI in the strictest sense is a metric for printer ink/toner dot size, and nothing more - but the remainder of the article treats DPI as synonymous with pixels per inch. The author states "Dpi means 'pixels per inch'", and then goes on to say "the concept of dpi simply does not exist in the video system", which is a curious contradiction. If pixels-per-inch can't exist in a video system (and pixels certainly don't make any sense outside of a video system), where does that leave us?

Some computer displays actually do display 72 pixels per inch--if you have a monitor that is, say, about 14.2 inches wide (like my 19-inch-diagonal monitor is), and run at a screen resolution of 1024x768, you get about 1024/14.2=72 pixels in a horizontal inch.

I don't know the actual historical reasons for the preponderance of the 72 DPI number. I suspect it was at least partly a guideline intended for graphic designers who were most familiar with the concept of output resolution as a DPI measurement; a designer working for print may have followed the notion "300 DPI looks good on paper," and carried that into the digital realm, where "72 DPI looks good on the computer screen" (again, DPI being synonymous with PPI). It does suffer from a misunderstanding of how DPI works, but let's use a simple example:

Say a photographer has a bunch of photos to publish; he wants to publish them in a book, and on his website. For print, it will be necessary to have a fairly high resolution in order for the images to look good. The photographer says, "I will make sure I use 300 DPI images for the printed photos." This ensures that, say, if the photos need to be printed at 8x8 inches, the image would need to be 2400x2400 pixels. His digital camera would need to be about 5 megapixels, for optimal quality.

Now, the photographer wants to publish these photos on the web. Being new to this internet thing, he says "OK, I'll just publish these 2400x2400 images on my website." But after uploading them to his website, he realizes that's not a very good idea. "What I need," says the photographer, "is a target DPI for these images so I know when they will look good on the computer screen, instead of being way too big." The photographer gets out a ruler and measures 8 inches across his screen, and finds out that somewhere around 72 pixels for each inch would do the trick, and get the images to fit in an 8x8-inch area on the screen. So he scales them to what he perceives as "72 DPI" by going into Photoshop's "scale image" feature, choosing an 8x8-inch size, and "72 DPI". Photoshop scales the image to 576x576 pixels, and voila! The image looks good on the computer screen, without scrolling way off the edge.
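For what it's worth, the arithmetic in the example above is easy to check. A minimal sketch in Python, assuming only the 8x8-inch output size and the 300 DPI print / 72 PPI screen targets from the story (the function name is just illustrative):

  def pixels_for_size(width_in, height_in, density_ppi):
      # Pixel dimensions needed to fill the given physical size at the given density.
      return round(width_in * density_ppi), round(height_in * density_ppi)

  print(pixels_for_size(8, 8, 300))  # (2400, 2400) -> the print target
  print(pixels_for_size(8, 8, 72))   # (576, 576)   -> the "looks good on screen" version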

Again, I've no idea if this is the way it happened. Probably not; it surely had something to do with the "logical inches" that the author talks about. I don't think the 72 DPI concept is totally useless, but it's definitely no substitute for a proper understanding of the way video display works. While video displays today commonly have more pixels per inch, I don't think it's so wrong to conceive of them as being in the ballpark of 72 DPI.

-- Wapcaplet 22:25, 10 Nov 2004 (UTC)

You are right, the author uses dpi in a wrong way. But his message is hard enough to swallow already! He might lose his audience if he said: "Images for the web do not need to be 72 dpi, and the resolution of an image is not called dpi either!"

About the purpose of ppi ... we could say: "The ppi value is useful when the graphics on the monitor need to be shown at exactly the same size as the printout." That is, if this is the intended use of ppi. It could also be a way for monitor manufacturers to boast about their monitor's horsepower.

As I understand it, the reason behind the 72 idea is that some old Macs had a standard screen, which was 72 ppi. A point is 1/72 of an inch. So a screen with 72 ppi would have one point per pixel. This means that a 12-pixel font would be printed at 12 points.

--Kasper Hviid 18:13, 11 Nov 2004 (UTC)

Also, Photoshop still uses 72 ppi as the default resolution of new images. I'm sure many users think of 72 ppi as some sort of ideal resolution because of this. --Stickler 09:13, 29 October 2007 (UTC)
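As a small aside, the point/pixel relationship described above is simple arithmetic. A sketch in Python, assuming nothing beyond the definition of a point as 1/72 inch:

  POINTS_PER_INCH = 72  # by definition: 1 point = 1/72 inch

  def font_height_px(point_size, screen_ppi):
      # On-screen pixel height of text at a given point size and display density.
      return point_size * screen_ppi / POINTS_PER_INCH

  print(font_height_px(12, 72))  # 12.0 -> one pixel per point on the classic 72 ppi Mac screen
  print(font_height_px(12, 96))  # 16.0 -> the same text on a 96 ppi Windows-style display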

Article Issues

1. Mention the typical confusion with DPI -- the term is even used commonly in manufacturer descriptions. In the DPI article, disambiguate: DPI refers to printing but is often used incorrectly for monitors - for monitors, see PPI.

-- Seems to be mentioned now, but rather confusingly. It says "...commonplace to refer to PPI as DPI, which is incorrect because PPI always refers to input resolution". The very first sentence of the article, however, starts a list with "displays" (not for touch input), and the very next sentence after the quoted one mentions photographs ("quality photographs usually require 300 pixels per inch"), so this whole "input/output" idea does not quite seem to be well-established. At least some source would need to be added to see where this idea (or the confusion ;) ) comes from. (Luna Kid (talk) 09:25, 1 December 2013 (UTC))

2. Create "pixels per centimeter"/"PPCM" stubs that refers here for more info, have an inches-centimeters conversion chart for 72, 96, 106 PPI, whatever.

3. This article needs to be the place where the whole issue of how big something is on screen -- vs. resolution -- is clarified, with illustrations.

4. Complex issue of font sizing. The original Apple graphic design standard of 72 PPI, which gave 72-point text on screen an actual physical size of one inch (72 points = 1 inch in computer typography) -- "what you saw was what you got." Then the rise of Windows in graphic design, and the shift to the 96 PPI standard. How Web browsers dealt with this.

5. The huge consumer problem of newer monitors with PPIs higher than 96 PPI, and how buyers think they're getting a "bigger" image -- but in fact they're getting an image too small to read. The notorious 1280x1024-on-17" LCD problem, leading people to use non-native lower resolutions, often with distorted aspect ratios, etc.

6. Coordinate issues for cross-reference with the Resolution (computers) or whatever article(s). —Preceding unsigned comment added by 75.6.255.39 (talkcontribs) 07:25, 14 November 2006

Calculation

I came across the following; is it correct? A 26" monitor with a resolution of 1920x1200 has a PPI of 87: sqrt(1920^2+1200^2)/26 = 87 —Preceding unsigned comment added by Frap (talkcontribs) 10:15, 22 February 2008 (UTC)

Fixed the combined formula; the inverse aspect ratio term under the square root should be squared. PPI = (Rw/d)*sqrt(1+(Ah/Aw)^2) = (1024/12.1)*sqrt(1+(3/4)^2) = 84.628*sqrt(1.5625) = 84.628*1.25 = 105.79 ppi —Preceding unsigned comment added by Dedsmith (talkcontribs) 04:21, 30 May 2008 (UTC)
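Both calculations above check out. A quick sketch in Python (diagonal sizes in inches, square pixels assumed):

  import math

  def ppi(width_px, height_px, diagonal_in):
      # Diagonal pixel count divided by diagonal size, the usual PPI formula.
      return math.hypot(width_px, height_px) / diagonal_in

  print(ppi(1920, 1200, 26))   # ~87.1 for the 26" 1920x1200 monitor
  print(ppi(1024, 768, 12.1))  # ~105.79 for the 12.1" 1024x768 example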

Pixel density and noise in digital cameras

This article repeats the popular and mistaken myth that bigger pixels equal better image quality (less noise). The dpreview article(s) on this issue are not factual either, but only repeat the myth.

In reality, the tiny sensors of pocket cameras with their tightly packed pixels have a better signal-to-noise ratio than DSLR cameras with their large photosites - surprising, but true. The real reason why DSLR cameras produce better images is that their sensors have roughly 10 times the light-sensitive surface area.

The reason why this myth exists is people's habit of studying noise by examining images that are magnified so that each pixel on the computer screen equals one pixel in the image. This is, however, a problematic way of studying noise. For example, if we have two sensors of equal size, sensor A having 5 Mp and sensor B having 10 Mp, on a 100% view it sure seems as if the 10 Mp sensor produces an inferior signal-to-noise ratio, but in reality we are not comparing sensor A and sensor B; we are comparing sensor A versus a 5 Mp crop of sensor B (i.e. comparing 100% of sensor A's area with 50% of sensor B's area). --88.195.100.60 (talk) 12:01, 4 March 2009 (UTC)

Let's get some references, then feel free to edit. Kevin chen2003 (talk) 23:18, 10 September 2009 (UTC)

English please

" For instance, a 100x100-pixel image that is printed in a 1-inch square could be said to have 100 the printer's DPI capability."

Okay, for someone like me (layperson), this makes absolutely no sense. – Kerαunoςcopiagalaxies 14:07, 27 February 2010 (UTC)

Disappointing article

I'm struggling with understanding PPI with digital photography, and this article basically has zero information on it. Are images taken with a DSLR always 300 ppi? How come a vertical image is sometimes labeled as 72 ppi (at least for me; I use a Nikon D80)? What's the relationship between PPI and image size, if any? What's the relationship between PPI and printing quality, if any? The section I jumped to talked about photodiodes and something about cm2. This means nothing to me and isn't in any sort of context that I can understand. Also, run-on sentences, poor punctuation, so I tagged the article "copyedit." Please expand, cheers! – Kerαunoςcopiagalaxies 14:14, 27 February 2010 (UTC)

  • PPI has nothing to do with cameras, you're thinking of DPI. You will be able to print higher-DPI (dots per inch) pictures if your camera's sensor can capture more megapixels. PPI (pixels per inch) is a completely irrelevant metric to measure a camera by; it's like measuring a boat's displacement by what color it is. Nalorcs (talk) 21:05, 2 October 2012 (UTC)

Apple's iPhone 4 (highest display pixel density claim)

The article claims that the iPhone 4 has "arguably the highest display pixel density actually available to the general public in a mass market device to date," but in reality, many modern digital cameras have the same or higher pixel density in their LCD displays. I only know this because I decided to do the math on the supposed ppi of my own camera, a Panasonic Lumix DMC-ZS3, which has a 3" (4:3) display that boasts 460,000 pixels. Doing the math, I came to the conclusion that it has 326 ppi, the same as the iPhone 4. The Panasonic ZS3 has been commercially available since 2009, while its predecessor, the Panasonic Lumix DMC-TZ5, had the same LCD specs and was released in 2008.

The main problem with trying to calculate the ppi of digital camera displays is that manufacturers don't typically tell you what the resolution is, only the final approximate pixel count, so you have to work your way back from there. But if that's not enough to convince you that there have already been "mass market devices" with "higher display pixel density" than the iPhone 4, consider the Sony Cyber-shot DSC-G1, released in 2007: it has a 3.5" display (the same size as the iPhone 4's) with a display count of 921,000 pixels. In comparison, the iPhone 4's display is 960x640, or 614,400 pixels, which is more than 300,000 pixels fewer in the same display area. -Glenn W (talk) 05:37, 23 July 2010 (UTC)

CORRECTION - The above is inaccurate

I've removed the quote in the article relating to the Sony DSC-G1, and the above claim is also inaccurate. However, I am not confirming or denying Apple's claim: I do not have the answer to that, just that the above statement is incorrect.

Camera manufacturers always measure their screens in 'dots' for some reason - these are not pixels. One pixel = 3 dots (red, green and blue). The argument for the Sony specifies 921,000 dots. That is actually a 640x480 screen: 640x480 = 307,200 pixels, and 307,200 * 3 = 921,600 dots, marketed as 921,000. So actually, it only has 228 ppi, not the quoted 395 in the article (now removed). It is irritating that camera manufacturers do this, but there you go. — Preceding unsigned comment added by 94.143.111.35 (talk) 08:44, 19 October 2011 (UTC)

Likewise, the Panasonic Lumix DMC-ZS3 has a resolution of 480x320, with a PPI of 192.3. — Preceding unsigned comment added by 94.143.111.35 (talk) 08:58, 19 October 2011 (UTC)
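A sketch of the dots-to-pixels conversion described above (Python), assuming the usual camera-industry convention of three dots (R, G, B) per pixel and the screen sizes quoted in this thread:

  import math

  def screen_ppi(width_px, height_px, diagonal_in):
      # Standard PPI formula, square pixels assumed.
      return math.hypot(width_px, height_px) / diagonal_in

  # Sony DSC-G1: 921,600 dots / 3 = 307,200 pixels = 640 x 480 on a 3.5" screen
  print(screen_ppi(640, 480, 3.5))  # ~228.6, the ~228 ppi figure above

  # Panasonic ZS3: the ~460,000-dot figure is 480 x 320 x 3 dots on a 3" screen
  print(screen_ppi(480, 320, 3))    # ~192.3 ppi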

Image of the square

I'm wondering, given the accuracy an encyclopedia is supposed to have, how the image on the front can claim to be 200 pixels by 200 pixels - for use in measuring on a display and calculating PPI. If I copy that and display it on my Amstrad CPC, it will be a rectangle unless I am in Mode 1, because in Mode 2 the pixels are twice as tall as they are wide, and in Mode 0 they are twice as wide as they are tall. Assuming that pixels are square is bad, as it is not always the case. talk 17:14, 17 March 2012 (UTC) — Preceding unsigned comment added by 27.32.141.11 (talk)

Still a square, as in X units times X units, using Gaussian coordinates. — Preceding unsigned comment added by 205.232.191.16 (talk) 17:31, 8 May 2012 (UTC)

So a square is a square because of the coordinates, not because of the appearance? Meaning if it's 2x2 and x isn't the same size as y, it's a square? I was led to believe the lengths of x and y had to be equal for it to be a square, obviously with the same unit of distance in standard units for both x and y. If each x is twice as large as each y, then you obviously have a rectangle. ZhuLien 66.249.80.203 (talk) 09:01, 8 March 2014 (UTC)
I know what you mean and don't want to argue about that. For the square example in the page, however, if the "square" doesn't appear as a square on your screen, that gives you the added information that its pixels are non-square, and then you can measure horizontal and vertical PPI independently. That is just a feature. Note that in that case you have two numbers. See my writings in other sections of the talk page. comp.arch (talk) 09:11, 21 March 2014 (UTC)

200 x 200 pixel image

This graphic is meaningless, as it changes scale when you zoom the whole webpage up and down by scrolling the mouse wheel while holding down Ctrl. — Preceding unsigned comment added by Ironbrooo (talkcontribs) 01:42, 20 March 2012 (UTC)

There, I added the relevant qualifiers that detract from the article's readability to point out something that should be taken as obvious. 76.232.78.183 (talk) 03:35, 11 July 2012 (UTC)

PenTile Displays

As PenTile technology has already captured a significant portion of mobile display manufacture, is DPI now a much more useful attribute than it was when relevant portions of this article were written?

If anybody knows how to derive true pixel density (or the traditional RGB-stripe subpixel equivalent) from PenTile manufacturers' claimed PPI, a new section would be a good place for that information. Patronanejo (talk) 08:50, 21 July 2012 (UTC)

  • RGBG PenTile has 33% fewer subpixels, theoretically resulting in 33% less sharpness or "effective DPI". However, in real-world usage this often isn't as visible as it sounds, depending on the colors involved. It could be safely said that the screen's "effective DPI" is up to 33% less than the stated DPI... "Reality" isn't an encyclopedic source though, you'd have to locate a site that says this. Shouldn't be hard since it's simple math. Nalorcs (talk) 20:55, 2 October 2012 (UTC)
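For reference, the subpixel arithmetic behind the "33% fewer" figure is straightforward. A sketch in Python, using an illustrative 1280x720 panel (the resolution is just an example, not a claim about any particular device):

  def subpixels(width_px, height_px, subpixels_per_pixel):
      # Total subpixel count for a panel.
      return width_px * height_px * subpixels_per_pixel

  rgb_stripe = subpixels(1280, 720, 3)  # conventional RGB stripe: 3 subpixels per pixel
  pentile = subpixels(1280, 720, 2)     # RGBG PenTile: 2 subpixels per pixel on average

  print(1 - pentile / rgb_stripe)  # 0.333... -> one third fewer subpixels at the same nominal PPI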

300 DPI myth

In my opinion, the text "It has been observed that the unaided human eye can generally not differentiate detail beyond 300 PPI" needs to be adjusted to be less ambiguous or even removed entirely. Saying that 300 DPI is the limit of vision is ridiculous nonsense, as it's a long-known fact that one can easily tell the difference between something printed at 300 DPI and 600 DPI. Common consumer printers now offer 1200 DPI and higher.

This 300 DPI thing has been popularized by Apple's "Retina" marketing and is just as untrue as the myth that the human eye can't distinguish more than 24 frames a second, something that anyone can disprove just by looking here where the difference between 30 and 60 FPS is demonstrated in realtime.

I think a more accurate statement might be, "Beyond approximately 300 PPI, a typical unaided human eye grows less able to discern increased pixel density." Nalorcs (talk) 20:49, 2 October 2012 (UTC)

Current text reads
Some observations have indicated that the unaided human eye can generally not differentiate detail beyond 300 PPI;[4] however, this figure depends both on the distance between viewer and image, and the viewer’s visual acuity. The human eye also responds differently to bright, evenly lit and interactive display, than prints on paper.
That [4], by the way, is a link to a fairly credible statement that Apple is indeed onto something. More credible than your claims, anyway... (take no offense, but a random user offering unsourced "facts" about "myths" that are not about the subject (hint: we're not discussing printers) carries less weight in my book than a retinal neuroscientist) CapnZapp (talk) 15:39, 12 January 2014 (UTC)
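For what it's worth, the distance dependence mentioned in the quoted text can be sketched with the common 1-arcminute (roughly 20/20) acuity figure; this is only a rule-of-thumb approximation, not a claim about what the article should say:

  import math

  def ppi_at_acuity_limit(viewing_distance_in, arcminutes=1.0):
      # PPI at which one pixel subtends the given visual angle at the given distance.
      pixel_size_in = viewing_distance_in * math.tan(math.radians(arcminutes / 60))
      return 1 / pixel_size_in

  for d in (10, 12, 18):
      print(d, round(ppi_at_acuity_limit(d)))  # ~344 ppi at 10", ~286 at 12", ~191 at 18"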

Another big missing definition, "Quality Factor"

The article mentions the term "quality factor" without definition: "This delivers a quality factor of 2, which delivers optimum quality. The lowest acceptable quality factor is considered to be 1.5 ... " The only "quality factor" I can find in web searches is a scientific one for oscillatory systems. What is an "optimum" quality factor? The acceptable quality "is considered" by whom? Bwagstaff (talk) 18:21, 26 August 2013 (UTC)

I also find this strange. Why isn't there a wiki page on "quality factor", and why should quality factor be mentioned in the intro to the page and nowhere else? Dylan Thurston (talk) 22:04, 22 January 2015 (UTC)

Also, the page the term is linked to appears to be for an unrelated term with the same name. 135.26.63.186 (talk) 17:27, 3 March 2016 (UTC)

FYI: PPI calculation only works for square pixels and gives only horizontal or vertical PPI

Note: All modern displays, as far as I know, use square pixels and have for a long time. My first thinking and input here, which I have since deleted, was wrong, so this is mostly academic. So no need to read further? Old CRTs, at least, could be configured for different resolutions, even non-square ones (they do not have an inherent density).

Example: 1000 x 1000 pixels, 10 inches wide and high. The diagonal is calculated with Pythagoras: 14.14 inches. Horizontal (and vertical) PPI is 1000/10 = 100 PPI. The length in pixels along a diagonal can't be any longer than 1000 (see Manhattan distance). Then the diagonal PPI is really 1000/14.14 = 70 PPI (the article would still say 1414/14.14 = 100 PPI). Both are kind of right.

Same display, 1000 x 1000, squished to be 5 inches wide (double the horizontal density but the same vertical): the diagonal length is now 11.18 inches. The "pixels along the diagonal" figure stays 1000/11.18 = 89.4 PPI, while the article's formula would calculate 1414/11.18 = 126.5 PPI.

If you had instead cut the first screen in half to 1000 x 500, it would now be "widescreen", and the horizontal and vertical densities would still be the same, 100 PPI. Suddenly the actual calculated diagonal density is 1000/11.18 = 89.44 PPI (the article would say 1118/11.18 = 100 PPI). comp.arch (talk) 15:21, 5 November 2013 (UTC)
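To make the point above concrete, here is a small sketch in Python comparing horizontal, vertical, and diagonal-formula PPI for the three examples; it only checks the standard diagonal formula against the horizontal and vertical figures, not the "pixels along the diagonal" count discussed above:

  import math

  def ppi_components(width_px, height_px, width_in, height_in):
      # Horizontal and vertical PPI plus the usual diagonal-based figure.
      horizontal = width_px / width_in
      vertical = height_px / height_in
      diagonal = math.hypot(width_px, height_px) / math.hypot(width_in, height_in)
      return horizontal, vertical, diagonal

  print(ppi_components(1000, 1000, 10, 10))  # (100.0, 100.0, 100.0): square pixels, all agree
  print(ppi_components(1000, 1000, 5, 10))   # (200.0, 100.0, ~126.5): non-square pixels, they differ
  print(ppi_components(1000, 500, 10, 5))    # (100.0, 100.0, 100.0): widescreen but still square pixels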

PPI vs. ppi

"usually written in lowercase ppi" [1] might not be a good WP:RELIABLE source? The Wikipedia article is not consitently using one or the other - should it? Similar to DPI, I've always written in upper case myself (it is an acronym, byt so is ppcm.. that treally chould be lower case - the cm part at least.. PPcm? :) ) comp.arch (talk) 02:49, 11 January 2014 (UTC)Reply

I hesitate to convert the article to consistent lower case "ppi" even if it is "correct" (by analogy with dpi) according to the IEEE (Computer) style guide. Another IEEE style guide (not on spelling) contradicts the first and uses PPI and DPI. I can't find any style guide that says upper case should be used, probably because it is "incorrect", but is upper case used more, and "should" it be used in Wikipedia? comp.arch (talk) 20:41, 12 January 2014 (UTC)

Use pixels per inch, not ppcm

Sources discussing pixel density always use ppi, never ppcm.

Please don't let this be another case of "kibibytes", where Wikipedia took a proud stand for confusion and disarray by using units no one else uses. Regards, CapnZapp (talk) 15:28, 12 January 2014 (UTC)

PPCM is used in some countries and will probably be used more in the future, so as long as it's presented well it's not going to confuse anyone, and there's no reason not to include it. FormularSumo (talk) 17:23, 8 October 2019 (UTC)

Proposed merge with Retina Display

Apple is now one of many companies selling phones and computers with screens with high pixel density. (Several editors predicted this in Retina Display's deletion discussion five years ago.) The article notes that in practice Apple has simply doubled or tripled the pixel density of its displays instead of following the 300 ppi declaration that Jobs made in 2010. Such displays are now commonplace, and we don't have any similar articles for other phone or display vendors' marketing terms. Hence, I would like to propose merging the content of the two articles. Jc86035 (talk) 15:05, 5 October 2017 (UTC)

  • Oppose merge. The reason other high-pixel-density displays don't have articles is that they don't meet WP:NOTABILITY or just haven't yet had articles made on them. Apple's use of "Retina display" as a marketing term, and the term's subsequent mention in multiple secondary sources, meets WP:GNG. The fact that the display has multiple competitors of similar and better quality doesn't preclude it from having its own separate article. 93 (talk) 03:52, 14 June 2018 (UTC)

HEY, when is someone going to remove that pesky "manual" template?

It's been there since May 2014, and nobody's ever paid any attention to it.

I believe it should be speedily removed: What's WRONG with a guidebook, anyway?

I thought the purpose of an article is to be useful, regardless of style. --AVM (talk) 21:45, 9 January 2018 (UTC)Reply

  • I agree, so I removed it. To anyone who wants to add it back: I suggest making the desired improvements yourself instead of leaving it for some random person you hope will show up in the next few years. Jack N. Stock (talk) 01:21, 11 January 2018 (UTC)

100% size?

The first sentence of this article says this:

"Industry standard, good quality photographs usually require 300 pixels per inch, at 100% size"

This makes no sense. A number expressed as a percent indicates a fraction of something. But a fraction of what? 100% of what? You can't resize a piece of paper, which is what the paragraph is talking about, so it must be talking about the size of the original image as it looked on the computer screen.

But if you have an image on the computer screen that's 1 inch square, and you shrink it by 50%, a 1-inch region of your computer screen still contains the same number of pixels; it's just showing an altered image that essentially has every other pixel removed. Likewise, if there's a 1-inch region of your computer screen, and it has a certain pixel density because of the resolution you are driving your monitor at, and you take an image and expand/resize it using an algorithm that interpolates colors between pixels, it's still the same number of pixels on the screen at the DPI of your monitor. So the amount that the source image has been enlarged, or shrunk, is totally irrelevant. The only relevant factor is whether that 1-inch-square section of the paper, at 300 DPI, has enough fine detail for you to resolve details that you would have been able to resolve on a monitor at a given resolution.

Or, taking an analog photograph as the gold standard: assuming a photograph that's 1 inch square, you can resolve a certain level of detail, and we are asserting that 300 DPI allows us to see that same level of detail. But if you put that same photo negative through an enlarger, you'll see more detail, because the negative actually includes finer detail than you can see unaided. Or you could use the enlarger to shrink it. Either way, at the end, you get a new print, and there's a certain level of detail that you can visually ascertain. We are saying here that 300 DPI allows us to resolve that same level of detail. The zoom level of the photo just determines WHAT TYPE of details are actually being seen. It's a completely independent variable and it's irrelevant. — Preceding unsigned comment added by 2620:0:1003:1214:3595:2520:E774:2068 (talk) 23:06, 29 March 2018 (UTC)

Citation no. 9, HP LP2065 20-inch (50.8 cm) LCD Monitor - Specifications and Warranty Archived 2008-04-10 at the Wayback Machine (Hewlett-Packard Company official website), doesn't work anymore, so it should either be fixed or deleted. — Preceding unsigned comment added by FormularSumo (talkcontribs) 17:26, 8 October 2019 (UTC)