Is 8K Gaming Turning a Corner? – Part 2
Part 1 of this article on 8K gaming explored where 8K gaming is on the hype cycle and the advantages of native 8K gameplay on an 8K display. In Part 2, we explore how upscaling figures into the landscape.
DLSS
Deep Learning Super Sampling, or DLSS, is an Nvidia technology initially developed to improve frame rates. Each frame is rendered at a lower resolution than the display’s and then upscaled using a deep-learning model. Whether rendering should be at “native resolution” or use DLSS raises passions in the community and muddies the waters when thinking specifically about 8K gaming.

The results of benchmarks comparing DLSS vs. native-resolution rendering have surprised many pundits, with DLSS performing better than expected, especially in its latest version (see here, for example). One hypothesis is that the output of a game’s graphics engine is far more predictable to an AI model than real-world video.
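To make the pixel budget concrete, here is a minimal sketch of how much per-frame rendering work DLSS-style upscaling can save at 8K. The preset names and scale factors below are approximations of commonly cited values rather than official Nvidia figures:

```python
# Rough pixel-budget sketch: pixels the GPU renders before AI upscaling.
# Scale factors approximate commonly cited presets (not official figures).
PRESET_SCALE = {
    "native": 1.0,
    "quality": 2 / 3,        # render at ~67% of the output width/height
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_pixels(out_w: int, out_h: int, preset: str) -> int:
    """Pixels actually rendered before upscaling to out_w x out_h."""
    scale = PRESET_SCALE[preset]
    return round(out_w * scale) * round(out_h * scale)

if __name__ == "__main__":
    out_w, out_h = 7680, 4320  # 8K UHD
    for preset in PRESET_SCALE:
        mpix = render_pixels(out_w, out_h, preset) / 1e6
        print(f"{preset:>17}: {mpix:5.1f} Mpixels per frame")
```

At 8K, even the conservative “quality” setting cuts the rendered pixel count by more than half, which goes a long way toward explaining the frame-rate gains discussed below.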
Resolution, Field-of-View, and Artistic Intent
A typical field of view (FOV) when playing today is 90°. Widening the FOV means drawing more of the scene, which requires more horizontal pixels to keep the same level of detail.

Gaming and GPU guru Jon Peddie told me that with his current setup, described below, he gets to play most games with a 120° FOV. 20/20 vision corresponds to roughly 60 pixels per degree of visual angle, so, assuming an ideal screen size and viewing distance, a 120° FOV calls for 120 × 60 = 7,200 horizontal pixels (see the quick arithmetic below), which shows the relevance of 8K’s 7,680-pixel width.
Jon’s gaming rig: 11th-gen Intel Core i9 processor, 64 GB of RAM, several TB of storage, 32″ Dell 8K monitor, Nvidia RTX 3090 graphics card.
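As a sanity check on that 7,200-pixel figure, here is a minimal sketch of the arithmetic, treating 60 pixels per degree as a flat approximation of 20/20 acuity (it ignores projection geometry and screen curvature):

```python
import math

PIXELS_PER_DEGREE = 60  # common approximation of 20/20 visual acuity

def pixels_for_fov(fov_degrees: float) -> int:
    """Horizontal pixels needed to show fov_degrees at ~20/20 detail."""
    return math.ceil(fov_degrees * PIXELS_PER_DEGREE)

for fov in (90, 120):
    print(f"{fov}° FOV -> {pixels_for_fov(fov):,} horizontal pixels")
# 90° FOV  -> 5,400 pixels (already beyond 4K's 3,840)
# 120° FOV -> 7,200 pixels (close to 8K's 7,680)
```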
Games, especially shooters, often have a FOV setting that determines how far from the center of the screen the main action occurs. Here we touch on a hobby horse I have discussed several times. The use of screen real estate is determined by a complex mix of artistic intent and technical limitations, but also by what we’re used to. Film and game makers, viewers, and players are used to having most of the action happen in the center of the screen. That’s storytelling adapted to technical limitations that no longer exist. When you see a 4K or 8K image with a wide-angle shot, there is much to look at in all parts of the screen. Such an image “feels” like a shot out of a documentary; it’s neither cinematic nor game-like. This perception is set to change as game makers embrace Ultra HD (4K and then 8K).
Refresh Rate and Frame Rate
Sometimes there is confusion between refresh rates and frame rates. The refresh rate is controlled by the display: it describes how many images the screen can show per second and is expressed in hertz (Hz). Casual gamers often won’t notice improvements above 60 Hz, but competitive gamers care deeply about refresh rates; 144 Hz is the new baseline for most, with 240 Hz being the new high end.
The frame rate is controlled by the GPU/CPU and is expressed in frames per second (fps, confusingly the same acronym as first-person shooter). Frame rate measures a similar concept to refresh rate, but this time it is the ability of the game’s graphics engine (software) to produce images that is being measured. Frame rate is highly dependent on the options selected by the player and on hardware performance. So unlike refresh rate, which is entirely a property of the display hardware, frame rate depends on the playing device’s hardware and software and on the user’s settings.
High refresh and frame rates lower the perception of latency in a game. This quality is critical for competitive esports, especially in an FPS game like Counter-Strike. Bad latency can make you see your enemy a few frames after your opponent sees you, and you’re already dead before you can shoot. So ideally, both refresh and frame rates should be high, with the frame rate the higher of the two. For competitive gamers, the frame rate is never too high, even when it exceeds the refresh rate, because when the screen does refresh, the frame it grabs can be a few milliseconds fresher.
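A simplified way to see why rendering faster than the screen refreshes still pays off: the frame the display grabs at each refresh is, on average, about half a frame interval old. The back-of-the-envelope model below is my own and ignores sync modes (V-Sync, VRR) and engine or input latency:

```python
def frame_time_ms(fps: float) -> float:
    """Time to produce one frame, in milliseconds."""
    return 1000.0 / fps

def avg_frame_age_ms(fps: float) -> float:
    """Average age of the newest finished frame at the moment of a refresh."""
    return frame_time_ms(fps) / 2

for fps in (60, 144, 240, 360):
    print(f"{fps:>3} fps: frame time {frame_time_ms(fps):5.1f} ms, "
          f"average staleness at refresh ~{avg_frame_age_ms(fps):4.1f} ms")
```

Pushing the frame rate from 60 to 240 fps cuts that average staleness from roughly 8 ms to around 2 ms, which is exactly the kind of margin competitive players chase.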
As resolution increases, so does the human visual system’s ability to pick up artifacts like stutter or tearing. That’s why, for example, the Ultra HD Forum specifies a minimum frame rate of 60 fps for 4K and recommends 120 fps. Nobody has yet stepped forward to define a minimum frame rate for 8K, but I believe that when they do, they will specify 120 fps as the minimum for any 8K action content.
Refresh Rate vs. Resolution
Marek Maciejewski, Product Development Director Europe at TCL Europe, and Jon Peddie agree that 4K at 120 Hz is today’s goal for the mass market. I agree too. Note that this particular 120 Hz is a refresh rate from the world of video rather than gaming, which has moved from 60 Hz to 144 Hz.

We’re still a way off, though. Marek pointed out that 8K native game rendering typically reaches only around 17 fps. With DLSS, you can get to 40 to 60 fps for the same content in the Guardians of the Galaxy game, based on tests he ran in October 2021. Marek noted that one advantage of this game as a tech demo was its super-lifelike water reflections with DLSS.
With Call of Duty: Vanguard, TCL produced an excellent 8K output from an AMD RX 6700 XT, but Flight Simulator at 8K was stuck at 15 fps. Marek explained that Flight Simulator only supports G-Sync, not VRR. However, a native 4K output fed to a TCL 8K TV’s upscaler looks great on an 85″ display, especially with something like the Airbus A320 cockpit.

Coming back to our discussion about the Hype Cycle, this is where I take a firm position. Yes, frame and refresh rates are what gamers care most about now, but that is because they have traditionally been the bottleneck. In this informative video, Linus demonstrates how the jump in perceived quality when moving from 60 Hz to 144 Hz is an order of magnitude greater than when jumping from 144 Hz to 240 Hz. Each time we improve, the difference is less noticeable.
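One way to put numbers on those diminishing returns is to look at how many milliseconds each refresh-rate step actually shaves off the frame interval, assuming (as I do here) that the absolute time saved per frame is a rough proxy for the perceptual gain:

```python
steps = [60, 144, 240]
for low, high in zip(steps, steps[1:]):
    saved_ms = 1000 / low - 1000 / high
    print(f"{low} Hz -> {high} Hz: frame interval shrinks by {saved_ms:.1f} ms")
# 60 -> 144 Hz: ~9.7 ms shorter per frame
# 144 -> 240 Hz: ~2.8 ms shorter per frame, a far smaller absolute gain
```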
So, there will come a time when an improved refresh or frame rate will no longer make a difference, even to professional players. What happens then? Assuming that HDR is well deployed by then, the next barrier for improvement can only be in resolution.
8K Gaming and Energy
The energy consumption concerns around 8K video in general are addressed in our op-ed here. Gaming brings further demands with power-hungry graphics cards. Apple has shown that ARM-based silicon can be far more energy-efficient for video processing. The current M1 and M2 chips are not designed for gaming, but they show that a ten-fold or greater reduction in energy consumption is possible for video playback, and that will eventually be the case for gaming too.
Wrap
Marek pointed out that, from a market perspective, there are still too few extreme gamers – like Jon – for a company like TCL to bring a dedicated offering to the market. But Jon is adamant that 8K could soon be a must-have for gamers. We really are in that waiting game.
Although frame rates concern serious gamers more than resolution for now, we know that this will start to change once the baseline is well above 60 Hz. In researching this article, I confirmed my understanding that true 8K gaming must start at a minimum of 120 fps.