GPU upgrade time.

Received an RTX 2080 Super today, let's upgrade that old GTX 970.

Specifically, the EVGA RTX 2080 Super XC Gaming, the only non-reference RTX 2080 Super I found that was dual slot and would actually fit.

Also got myself a nice wireless mouse, I'm going to need it since I'm playing on the TV and the only wireless mouse I have is an old Logitech laptop mouse not designed for gaming.

Fits perfectly with just enough room for the cabling on the right side.

And with the case back on. Man, it sure would be nice if there wasn't so much cable.

It looks like the intake fan is lit up, but that's actually my motherboard shining on it. Haven't received the addressable RGB LED strip yet.

And finally some benchmarks comparing my old GTX 970 to the new RTX 2080 Super. The difference is night and day.

It actually maxed out the VR score... Guess I should get a nice VR headset. Not quite ready for future VR though.

It's not the most powerful RTX 2080 Super model, given the smaller heatsink compared to all of the others, but there's still some headroom for overclocking. Pretty happy with the performance so far in the little testing I've done in games. I was considering returning it for an RTX 2070 Super, which is supposed to be better bang for the buck, but I think I might need all of this performance, even if it's only 12% more FPS than the RTX 2070 Super for 33% more money. Still a much better value than the 2080 Ti.

I'm also considering getting some CableMod cables. They're not exactly cheap, but even when I did my best job to bunch up the wires so they wouldn't get in the way, they're still making the window bulge out a little on the motherboard side. There's just too much cable and nowhere good to put it.
I did get a 20% off code with either the mobo or the PSU, so why not make use of it.

Comments

I was going to wait a while longer, since Nvidia doesn't support FreeSync over HDMI 2.0 and AMD's offerings are less than great, but the GTX 970 was not doing my new build justice and I really wanted to take advantage of the new 4K TV. Like nothing would run at playable FPS at 4K except for Portal 2.
 
Personally I'm really, really, REALLY hoping that Nvidia supports HDMI 2.1 on their next-gen cards. One of those paired with one of the LG OLEDs that support VRR would be a near-perfect couch gaming setup. The prices on those big OLED TVs have really come down, as well.
 
Why would you want HDMI 2.1? DisplayPort 2.0 has almost double the bandwidth and supports up to 16K resolution... I would rather have that.
 
I'd just turn V-sync on. Didn't OLED panels have a lot of input delay, at least compared to VA or TN panels?
 
If I enable vsync then at worst it will nearly halve my framerate. In Zelda BotW, for example, where I get 45-60 FPS but usually above 50, my FPS would drop all the way down to 30, which is a huge drop. Not worth it to me. In games that run locked at 60 anyway, I won't notice a difference with vsync on or off. So it's best just to leave it off.
Plus if I play a game at 1440p@120, I need to have vsync off to see framerates higher than 60 I think, since it'll never be locked at 120.
 
Yeah, although you do get an overall smoother game. Unless your framerate is consistently above 100+, you can leave it off anyway.
 
@Psionic Roshambo

Large, high-quality consumer displays (i.e. OLED TVs) don't have DisplayPort. For my target use case (couch gaming), HDMI 2.1 is much more attractive than DisplayPort 2.0, given the available display hardware.

@Der_Blockbuster

If I recall correctly, OLED panels actually have a much lower response time than the majority of LCD tech. Input latency comes mostly from other sources, at least on consumer TVs. However, the LG OLEDs actually have a "game mode" that turns off all of the garbage and also supports scanline refresh, so input latency is basically nothing. Much better than with any other consumer TV.
 
Ayy I also just built a new computer. Moving from an i5-4690k and GTX 970 to a Ryzen 9 3950X and an RTX 2080 Super as well ;) Haven't hooked it up yet, probably will do that this weekend maybe.
 
I remember what a huge jump in performance it was going from a 970 to a 1070, and this is even crazier. The 2080 seems like a nice card, with performance comparable to a 1080 Ti, but that MSRP just makes it so much less appealing.
 
@Xzi You could save yourself 200-300€ by going for a 1080 Ti.
Alternatively, you could go for a high-end model of the RTX 2080 non-Super, the Zotac AMP Extreme Core for example. You'd get the same FPS as the Super, with superior cooling on top.
 
@PityOnU Okay, looks like my information was a little outdated. I really love LG TVs.
I love their motion-controlled remote, it's so intuitive. I feel like I'm living in the stone age with my LG remote. It hurts... I've wanted an OLED so bad... I hate the glowing bars when watching movies...
But I honestly couldn't get myself to spend more than 600 bucks on a TV. My dad still uses his 2009 LG plasma TV and it's still running. My last TV just cracked on its own. The backlight layer shattered, probably because of temperature differences, but who knows.
 
@PityOnU My Samsung Q70R (VA panel) has 6 ms input lag at 120 Hz, 15 ms at 60 Hz. Doesn't get much better than that.
Every single TV these days has a game mode, LG is not special. Input lag can sometimes still be pretty high in game mode, but it looks like from 2018 onwards, every new model has pretty low input lag in game mode. Didn't use to be the case. https://displaylag.com/display-database/
 
@guitarheroknight I have no preference when it comes to GPU brands, I just don't want another Zotac. The Zotac GTX 970 was both loud and hot.
 
@Der_Blockbuster The RTX 2080 Super is considerably faster than the 1080 Ti. The 2080 non-Super is more comparable to the 1080 Ti, at around the same price too, plus you get RTX, so there's not much point in going with the 1080 Ti anymore unless you get a good deal on a used one.

OLEDs seem nice, but they have problems with burn-in. For something that will be used a lot for gaming, or as a monitor, there is a high chance you will get burn-in due to the static elements that tend to be in games (and a PC desktop is nothing but static elements, with the taskbar and all its icons, and windows that mostly stay in the same position).

An OLED would not actually have cost me that much more but I purposely did not want an OLED for that reason.
But local dimming zones do a pretty good job too; I can barely tell there's backlight glow at a normal viewing distance, and blacks look nice and deep. I haven't noticed any glow on the top/bottom bars yet, I think they designed the dimming to avoid that. There is some minor glow when there's white text on a black background (such as the credits of a movie), but as I said, it's barely noticeable, and it doesn't really matter in the credits anyway.

TVs with dimmable zones are not cheap either though. You have to invest a fair bit of money to get a good TV, whether it's OLED or VA. At least this TV will last me a long time. I don't see myself ever wanting 8K, and I've been very happy with the contrast and black levels so far.

Also, while you may be able to OC a 2080 non Super to match the Super, it still wouldn't match an OC'd Super, so it's not like that makes the cards equivalent.
 
@Xzi Yeah, it is a bit pricey, and a 2070 Super is much better bang for your buck, but I need all those frames :P

@Sicklyboy Nice. What are you gonna do with all those cores? I did not think anything higher than the 3700X would be worth it, as the 3800X only gives a marginal performance improvement, and the 3900X and 3950X are just too expensive. This PC is primarily for gaming anyway, so there is not much point to 12+ cores, other than for epeen.
I figure I'll just upgrade the CPU later down the road when I need to, since AMD has guaranteed backwards compatibility with AM4 until at least 2022. By the time I need a CPU with faster clock speeds or more cores they should be much cheaper. One of the main reasons I went with AMD actually. I never did any major upgrade to the CPU on my previous desktop because I would have had to get a new motherboard, and then new RAM too because new motherboards used DDR4, and I would rather just build a new rig.
 

Blog entry information

Author
The Real Jdbye
Views
1,021
Comments
72
Last update
