That would depend on the resolution of the printing. If, for example, it is 100 pixels per inch, then 2 ft = 24 in; at 100 pixels per inch that comes to 2400 × 2400 pixels = 5,760,000 pixels.
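The arithmetic above can be sketched in a few lines. This is just an illustration of the calculation; the function name `pixels_for_print` is mine, not anything standard.

```python
def pixels_for_print(size_in: float, ppi: int) -> int:
    """Pixels along one edge of a print `size_in` inches long at `ppi` pixels per inch."""
    return int(size_in * ppi)

# 2 ft = 24 in, printed at 100 pixels per inch
side = pixels_for_print(24, 100)
total = side * side
print(side, total)  # 2400 5760000
```

Doubling the printing resolution to 200 ppi would quadruple the total pixel count, since it doubles along both dimensions.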
*
From a mathematical perspective, that answer is sound. From a photography perspective, the asker should understand that what matters is the relationship between the "resolution" the image was captured at and the size at which it is viewed. The reason for the quotation marks is explained at the end of this answer.
In a 640×320 image (A), there are twice as many pixels in each direction as in a 320×160 image (B). Displayed at the same pixel size, image A is therefore twice the width and height of image B, and the pixels and the spaces between them are the same size in both pictures, so the eye's ability to "resolve" those spaces at a given viewing distance is no different. What does matter is the size the images are rendered at. To keep this as simple as possible, compare photographs printed from these two files: an 8×10 from A will look better because more pixels are packed into each inch than in an 8×10 from B, so your eye is less able to resolve the spaces between them.
So how many pixels land in an inch for a GIVEN print size depends on the pixel dimensions of the image file versus the size it is viewed at. The "resolution of the printing" is a property of the printer or machine that creates the print, and that is another animal altogether.
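The 8×10 comparison above can be made concrete by dividing pixel width by print width. A minimal sketch (the function name `effective_ppi` is mine, for illustration only):

```python
def effective_ppi(pixels: int, print_inches: float) -> float:
    """Pixels per inch when `pixels` are spread across `print_inches` of print."""
    return pixels / print_inches

# Image A (640 px wide) vs image B (320 px wide), both printed 10 inches wide
print(effective_ppi(640, 10))  # 64.0
print(effective_ppi(320, 10))  # 32.0
```

Same print size, half the pixels per inch for B: that is why the print from A looks sharper, while the printer's own native resolution is a separate variable entirely.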
The problem is that pixel count is not recognized by international standards as a measure of digital photograph resolution, even though just about everybody uses it that way. Hopefully my answer sheds some light on why.
Copyright © 2026 eLLeNow.com All Rights Reserved.