UPDATE – a second article on DSR – Revisiting Nvidia’s DSR – 4K resolution on a 1080p monitor
Playing around with my PC over the last few days (having packed up my Xbox One for the upcoming house move), I was amazed to discover a clever trick Nvidia has built into its drivers that works on any modern Nvidia video card.
It is called Dynamic Super Resolution, or DSR for short.
I have tested this in a number of games and, even though there is a performance hit, the results are incredible. On my ViewSonic 2703 monitor (1080p) I can run games at 2715 x 1527, even under Windows 8.1.
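If 2715 x 1527 looks like an odd number, that’s because each DSR factor scales the total pixel count rather than each axis: the 2.00x factor multiplies width and height by the square root of 2. Here’s a quick sketch of the arithmetic in Python – the factor list matches what the driver offers, but treat the rounded outputs as approximate:

import math

# Native panel resolution (1080p in my case).
NATIVE_W, NATIVE_H = 1920, 1080

# DSR factors as the driver lists them; each factor multiplies the total
# pixel count, so each axis scales by the square root of the factor.
DSR_FACTORS = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]

for factor in DSR_FACTORS:
    scale = math.sqrt(factor)
    print(f"{factor:.2f}x -> {round(NATIVE_W * scale)} x {round(NATIVE_H * scale)}")

# 2.00x -> 2715 x 1527 (the resolution I'm running on the GTX 760)
# 4.00x -> 3840 x 2160 (full 4K on higher-end cards)
# Note: the driver snaps some factors to standard resolutions, e.g. the
# 1.78x factor is really (4/3)^2 and yields exactly 2560 x 1440.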
The Vanishing of Ethan Carter looks unbelievable running at this resolution, and even though the framerate drops a bit on my Nvidia GTX 760 video card, it is more than playable for an adventure game.
I have heard that with a higher-end graphics card you can achieve proper 4K resolution using DSR. As Nvidia themselves put it:
What does DSR do? Simply put, it renders a game at a higher, more detailed resolution and intelligently shrinks the result back down to the resolution of your monitor, giving you 4K, 3840×2160-quality graphics on any screen.
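You can get a feel for what that shrinking step does by downscaling a high-resolution screenshot yourself. Here is a rough sketch using Python and Pillow – Nvidia says the driver uses a Gaussian filter for the downsample, but the blur radius, resampling choice and file names below are my own placeholders, not Nvidia’s actual parameters:

from PIL import Image, ImageFilter

def dsr_style_downsample(path_in, path_out, target=(1920, 1080), blur_radius=0.6):
    # Open the screenshot captured at the higher DSR render resolution.
    hi_res = Image.open(path_in)
    # A light Gaussian blur before resizing stands in for the driver's
    # filtered shrink, which avoids the shimmer of a naive point resize.
    smoothed = hi_res.filter(ImageFilter.GaussianBlur(blur_radius))
    # Resize down to the monitor's native resolution and save.
    smoothed.resize(target, Image.LANCZOS).save(path_out)

# Hypothetical file names, just to show usage:
# dsr_style_downsample("ethan_carter_2715x1527.png", "ethan_carter_1080p.png")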
To give you a picture comparison, here are two Vanishing of Ethan Carter screenshots I’ve taken – one at 1080p and the other at the max resolution I can get on a GTX 760, 2715 x 1527 (click on each for full screen).
1080p
2715 x 1527
To be honest, I was extremely skeptical about whether this technique would work, but it works amazingly well in a number of games. Some developers are even optimising their games for it – Blizzard is adding DSR optimisations to World of Warcraft in the 6.1 patch coming out next week.
So what are you waiting for? Update to the latest Nvidia drivers and have a look at what resolutions you can bump up to with this technology.
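If you are hunting for the setting, in the drivers I am running it lives in the NVIDIA Control Panel under Manage 3D Settings as “DSR - Factors”, with a companion “DSR - Smoothness” slider that controls the strength of the downsampling filter. Tick a factor and the new resolutions appear in each game’s own resolution menu.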
More on the subject can be found here – http://www.geforce.com/hardware/technology/dsr/technology
Supported Nvidia cards – http://www.geforce.com/hardware/technology/dsr/supported-gpus
Categories: Gaming, Technology
It’s a really clever idea that works well now that a lot of graphics cards have gigabytes of memory. Where antialiasing has to do a lot of guesswork when smoothing out an image, this method can take advantage of the more detailed source image to produce a much nicer result. It’s like shrinking a photo in a paint package – the end result always looks better if the original picture was big to start with.