by DYSEQTA » Sat Dec 07, 2013 3:38 am
Seems there might be a little confusion here as to screen size and screen resolution.
Screen size is simply a single linear measurement, most commonly in inches, from one corner of the displayable area to the diagonally opposite corner.
Screen resolution, however, defines the size of the pixel grid used to display images on the screen and is given as two numbers: one for the X (horizontal) dimension and one for the Y (vertical) dimension.
HDTVs these days are almost exclusively 1920x1080 no matter what the screen size is. Obviously 4K is looking to change that by pushing resolution up to figures more like 3840x2160. Note that I say "more like" because there is no actual agreement yet on what 4K should be; at the moment it just refers to resolutions near 4000 horizontal pixels. This is similar to early HDTVs that were called 720p, which denotes a resolution of 1280x720, when in fact many were actually resolutions such as 1366x768.
Computer monitors, on the other hand, come in a range of resolutions that almost always increase along with screen size.
When it comes to graphics processing, the only one of these two measurements that matters is the resolution, as it defines the number of pixels (or fragments) that need to be processed per frame. The more pixels, the more graphics processing power required.
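To put some rough numbers on that, here's a quick Python sketch (the resolution list and the 1080p baseline are just my own illustrative choices) comparing per-frame pixel counts:

[code]
# Rough per-frame pixel counts for some common resolutions.
# Figures are illustrative only; actual GPU load also depends on
# how much work each pixel/fragment takes to shade.

resolutions = {
    "720p  (1280x720)":  (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "4K    (3840x2160)": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame "
          f"(~{pixels / base:.1f}x the work of 1080p)")
[/code]

Going from 1080p to 4K is roughly a 4x jump in pixels per frame, which is why the GPU requirement climbs so sharply even though the screen size may not change at all.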
As a side note, it has been said in various places that 4K for TV/movie viewing doesn't really become something to worry about until you are talking screen sizes of 100" and up. Below that you are mostly just wasting money.
tl;dr TVs: size doesn't really determine resolution. Monitors: size and resolution tend to increase together. Higher resolution = more GPU power needed.