Hey again,
Well, several years ago our old 52" 4x3 format rear projection TV died, so we got a new VIZIO 46" 16x9 LCD (roughly the equivalent viewing area of the old set), but it was not a smart TV. Prices for LED TVs, especially the larger ones, have come down considerably, and we're looking at acquiring one of the BIG ones, i.e. 65" diagonal! Believe it or not...SWMBO was the one who suggested we get a new TV!
Anyways, I started my usual internet searching and reviewing and have learned some new stuff. Remember, I work in the eyecare field, so I know that the human eye can't perceive image changes at rates much above about 50 Hz; that's why we can't see the filament in a regular light bulb flicker, and it's partly why electricity was set at 60 Hz (that, and making it easier for a clock to keep time).
SO...now they have TV refresh rates listed as 120 Hz, and then they go up to 240 and even 960 Hz. Kind of a waste of technology, considering we can't see any flicker above 60 Hz anyway! Secondly, this whole UHD 4K screen resolution...geez! Standard NTSC TV had 525 scan lines, roughly 480 of them visible. That was fine for the broadcast bandwidth, the limitations of the CRT, and the largest screens of ~25" back then. Then computers brought higher screen resolutions, which eventually migrated into TVs as HiDef 1080p, several times as sharp as standard TV, and it looks quite crisp at much larger viewing sizes.
Now, for the fun math. I remember learning that the human eye can resolve the separation between lines or edges down to about 1/32" at 10 feet, a fairly common viewing distance, especially for large-screen TVs. That 1/32" is about 0.79 mm.
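If anyone wants to double-check that figure, here's a quick back-of-the-envelope sketch of my own (assuming the usual definition of 20/20 vision as resolving detail about 1 arcminute across; the numbers are illustrative, not from any spec sheet):

import math

distance_mm = 10 * 12 * 25.4           # 10 feet in millimeters (3048 mm)
one_arcminute = math.radians(1 / 60)   # 1 arcminute in radians

separation_mm = distance_mm * math.tan(one_arcminute)
print(f"Resolvable separation at 10 ft: {separation_mm:.2f} mm")  # ~0.89 mm
print(f"1/32 inch for comparison:       {1/32 * 25.4:.2f} mm")    # ~0.79 mm

So the 1-arcminute definition and that old 1/32"-at-10-feet rule of thumb land within about a tenth of a millimeter of each other.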
This is from a Wikipedia article on vision:
Normal visual acuity is commonly referred to as 20/20 vision (even though acuity in normally sighted people is generally higher), the metric equivalent of which is 6/6 vision. At 20 feet or 6 meters, a human eye with nominal performance is able to separate contours that are approximately 1.75 mm apart.
So...using that value, at 10 feet the separable contours would be about 0.875 mm apart; pretty close to that 1/32" value, huh! A 16x9 format 60" screen is about 29.5" high, and a 70" is about 34" high. From the spec section of a VIZIO owner's manual, their 60" has a pixel pitch of 0.693 mm, and their 70" is 0.801 mm. A 1080p screen has 1080 rows of pixels, so at that 20/20 (or 10/10) separation of 0.875 mm per row, the screen would have to be about 945 mm, or 37", high, just a tad under 1 meter, before the pixels reached the limit of what the eye can resolve. So...a 60-70" screen at 1080p is right about at the level of human clarity and perception, and if the separation between the lines is any smaller, a human can't SEE IT!!!
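For anyone who wants to play with the numbers, here's a small Python sketch of that calculation (my own illustration; the 16:9 geometry is assumed, and the computed pitches land right on top of the VIZIO spec-sheet values above):

import math

def screen_height_mm(diagonal_in):
    # Visible height of a 16:9 screen, in millimeters, from its diagonal in inches.
    return diagonal_in * 9 / math.hypot(16, 9) * 25.4

resolvable_at_10ft_mm = 1.75 / 2   # Wikipedia's 1.75 mm at 20 ft, scaled down to 10 ft

for diag in (60, 70):
    pitch = screen_height_mm(diag) / 1080   # vertical pixel pitch of a 1080p panel
    print(f'{diag}" 1080p: pixel pitch {pitch:.3f} mm vs. {resolvable_at_10ft_mm:.3f} mm resolvable')

# Screen height at which 1080 rows would reach the 0.875 mm limit:
print(f"Break-even height: {1080 * resolvable_at_10ft_mm:.0f} mm (about 37 inches)")

It prints roughly 0.692 mm for the 60" and 0.807 mm for the 70", with the break-even at 945 mm, which is where the 37" figure comes from.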
Now, let's look at the 4K/UHD TVs, which cram 3840 x 2160 pixels onto the screen, i.e. 2160 rows, twice as many as 1080p, and so far most of these screens are also in the 60-70" range. To fit that many rows, the pixel pitch works out to roughly 0.35 mm on a 60" and 0.40 mm on a 70", which is well below what the human eye can resolve at that normal 10-foot distance. A person would need to sit within roughly 4 to 5 feet to be able to see the individual pixels/lines!
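Here's the same kind of sketch for the UHD case (again just my own illustration, using the 1-arcminute assumption and 16:9 geometry):

import math

def screen_height_mm(diagonal_in):
    # Visible height of a 16:9 screen, in millimeters, from its diagonal in inches.
    return diagonal_in * 9 / math.hypot(16, 9) * 25.4

one_arcminute = math.radians(1 / 60)   # roughly the 20/20 resolution limit

for diag in (60, 70):
    pitch = screen_height_mm(diag) / 2160                  # UHD (3840x2160) vertical pixel pitch
    max_resolvable_dist_mm = pitch / math.tan(one_arcminute)
    print(f'{diag}" UHD: pitch {pitch:.3f} mm, pixels resolvable only within '
          f'~{max_resolvable_dist_mm / 304.8:.1f} ft')

That comes out to about 3.9 feet for the 60" and 4.6 feet for the 70".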
So...in conclusion, the 4K technology is unnecessary for humans viewing it at ~10 feet, or even 5 feet, away! SAVE YOUR MONEY FOLKS!!!
Almost forgot: I also learned that each TV brand uses a different operating system for its internet capabilities and the apps it provides. My question for those who have smart TVs, especially Vizio or Samsung: how would you rate the use of the TV and the apps? I learned that Vizio doesn't include a web browser in its collection of apps, even with their new V.I.A. Internet Apps. Samsung has been in the business a while and has quite a few more apps, up to 1000 or so, but not every app is available for every TV; it seems to depend on the size of the screen as to what they allow the TV to access.
T.C.