Maf Posted November 5, 2024

My point is have RT on, play for a bit so you're used to it, then go to any kind of vista and turn the RT off. If you ever want to know the difference between RT/No RT, HDR/No HDR, Shadows on/off, etc. the absolute best way to do it is go from on>off. Menus that show the difference immediately so you can watch all the detail/colour disappear are even better. I haven't done this with Witcher but open world games like Elden Ring, Spider-Man and Horizon are really good at showing the difference
Metroid66 Posted November 5, 2024

Yeah, I'm definitely not getting it with this game. It still looks incredible on the settings I've gone with, but the RT isn't really rocking my world. I do get the 'pre-rendered' thing that @one-armed dwarf mentioned, but the effect is incredibly subtle and, for me, is more to do with the high levels of art direction in the game. Having said that, I'm on a pretty low-end 4K TV, not some bells-and-whistles super monitor, so it could be that.
DANGERMAN Posted November 6, 2024

My memory is that light sources like torches were what made my performance drop
one-armed dwarf Posted November 6, 2024

Funnily enough I mainly game on a 1440p monitor, KB&M is a bit awkward on the TV.

RT at first was definitely one of those things where I struggled to understand the performance/quality trade-off, but I find now, when playing modern games with limited (or no) RT, that the strange way objects glow and light bleeds into areas it shouldn't disrupts the presentation for me. I remember TLOU2 being the first game I really felt that in, where the cutscenes looked amazing and then there was this drop-off in visual consistency in its gameplay, with characters and objects appearing to 'float' cause they didn't have as good shadows, ambient occlusion or shading as an RT-enabled game would.

In that sense RT is like a forbidden fruit once you get used to it, cause I wouldn't ever have registered that as a visual 'flaw' in a game like TLOU2 if I hadn't played stuff like Control months before it. It grounds objects and characters together in a more coherent-looking scene, though it has some trade-offs in image quality and visual noise. It's just still early with this stuff I think
Maf Posted November 23, 2024

On 05/11/2024 at 20:48, Maf said:

My point is have RT on, play for a bit so you're used to it, then go to any kind of vista and turn the RT off. If you ever want to know the difference between RT/No RT, HDR/No HDR, Shadows on/off, etc. the absolute best way to do it is go from on>off. Menus that show the difference immediately so you can watch all the detail/colour disappear are even better. I haven't done this with Witcher but open world games like Elden Ring, Spider-Man and Horizon are really good at showing the difference

I was just messing with the Nvidia overlay (you can record up to 20 mins of gameplay like the share button on PS5! GPU usage shoots up but I guess that's ok. Just need to remember what the fucking hotkey is) and I was trying it on the Witcher, so I thought I would do my test:

RT

No RT

It seems to mostly be about shadows to my eye, but the difference is night and day.

Anyway, I'm going back to Dragon Quest 3 on Switch, the game is sick
one-armed dwarf Posted November 24, 2024

One thing I'm finding I really don't like with Nvidia is frame generation.

Visually, it's really impressive. I rarely notice issues.

But going between the Half-Life games and STALKER 2, it's super noticeable how bad the latter feels with frame gen on. Half-Life: super fast and responsive mouse movement. STALKER: feels so delayed.

It's made me very skeptical about the performance claims in their GPU marketing, even tho I was skeptical about some of that before. You can't just show these FPS charts with the whole DLSS suite enabled IMO and act like responsiveness is not also a big part of the feeling of smoothness. Yeah it runs at 120fps, but it feels like you're playing with a laggy mouse.

Miss me with this heavy-ass mouse movement, if I wanted that I'd put marmalade on the underneath of it
one-armed dwarf Posted December 4, 2024

NZXT getting raked over the coals for their basically usurious PC rental scheme/scam.

The TL;DW on it is that the repayments are not unlike illegal payday loans. On top of that they will downgrade the parts that you "don't own", which you are paying shitloads for. But it's an interesting video I think, I enjoy when Steve goes all in on companies like this. The most damning part to me is when he shows the TikTok influencers telling kids they could pay it off by winning Fortnite tournaments, which puts into perspective the people they are targeting here.

When my brother was building a PC I think NZXT is who I got the case off, but it sounds like they've fallen pretty far in recent years.
Maf Posted December 20, 2024

Oh, no. I've used up my 2TB of storage on my PC. Lucky I have another 949GB left
one-armed dwarf Posted January 7

The 5090 and the rest of the series line got announced

https://www.nvidia.com/en-us/geforce/news/dlss4-multi-frame-generation-ai-innovations/

Without getting too stuck in the weeds on it, the big selling point looks like it's DLSS4. Better frame generation, or at least more frames. I don't know if they improved the latency, but they say the newer FG has the same latency as the old one, with way more frames. It's cool that those are the promises they're making, cause FG is the most underwhelming part of the 40xx to me, cause of how it makes the mouse feel all muddy, which is awful for shooting games. Maybe the 60 or 70 series will be when they get to make the claim of native or near-native responsiveness.

They're saying that the 5070 can do 4090 numbers, but that's with the new DLSS suite enabled so it's a bit deceiving. Still tho, it's been a while since Nvidia made boasts about their mid-range products or even tried to compete there.

Someone on resetera did some comparison and found that the 5090 is about 25 to 30 percent more powerful than the 4090. Which is about half the leap from 3090 to 4090 I think, but that was an unusually large leap. That said, as more games adopt path tracing I would guess all the new DLSS stuff is going to become required to push that; on current PT the 4090 gets its butt kicked pretty soundly, Indiana Jones providing the most convincing butt kicking of them all, with how realistic its shadows and shading get. So the difference would be a lot larger there
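As a rough sanity check on those numbers (the percentages are the resetera community estimates quoted above, not official benchmarks, and the ~60% figure for the 3090-to-4090 jump is an assumption for illustration), the compounding looks something like:

```python
# Rough generational-uplift arithmetic (illustrative figures only).
uplift_5090_over_4090 = 0.28  # assumed midpoint of the quoted 25-30% estimate
uplift_4090_over_3090 = 0.60  # assumed ballpark, roughly double the new jump

# Relative performance with the 3090 normalised to 1.0
perf_3090 = 1.0
perf_4090 = perf_3090 * (1 + uplift_4090_over_3090)
perf_5090 = perf_4090 * (1 + uplift_5090_over_4090)

print(f"4090 vs 3090: {perf_4090:.2f}x")  # 1.60x
print(f"5090 vs 3090: {perf_5090:.2f}x")  # 2.05x
```

So under those assumed figures the 5090's raster uplift is indeed about half the previous generation's, which matches the "half the leap" impression.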
spatular Posted January 7

Nice! Yeah I've been keeping an eye on the new graphics cards being announced and stuff too. It'll be interesting to see if this new frame generation works as well as they say. I've never tried it as I have an older card, but I guess problems with it might not affect me as much playing on a pad.

I was looking at maybe getting a 4070S or AMD equivalent last year, but I've not been playing anything too demanding on PC recently, so it's been pretty easy to wait it out for new stuff to appear. The 5070 being 12GB is a shame, if it was 16 I think I'd definitely try to get one, that's the problem at the price range I'm looking at. They do sound cool though, but I still want to see what AMD will offer with the 9070, although they haven't said much about it yet. It should have 16GB though. Might need a new PSU for AMD stuff though, would be good if they can improve efficiency on their side.
one-armed dwarf Posted January 7

I feel like the next upgrade I'd like to do is some sort of monitor, but I'm still clueless as to what I want there, with how all the QD-OLED stuff seems to burn in super fast. But playing Doom Eternal on my 1440p LED I'm just thinking what ultrawide and higher contrast could do for it, and 240Hz omg. But it's such a goldilocks situation with monitors, they all have pros and cons, and not insignificant ones
spatular Posted January 7

Yeah, I've been looking at possibly getting a new TV. Do I want one with black smearing like my current one, or possible burn-in? Great. I think I might want an IPS panel, but they don't seem to do those much in TVs with gaming/high refresh rate features. And then a lot of TVs don't come in smaller sizes than 55, which is not ideal
Maf Posted January 7

The 5080 is £1000 less than the 5090, so if I do upgrade (that's considering they'll probably sell out anyway, and if it would even fit in my PC) that's the one I think I would get
one-armed dwarf Posted January 7

I'm fairly certain that the 5080 is a downgrade from a 4090. Certainly on the VRAM front anyway
Maf Posted January 7

Oh, hang on, I just checked again and I was looking at the 5080 vs the 4080. My bad, never mind
one-armed dwarf Posted January 7

DF coming at you with syllable-heavy technical language.

It is interesting that a number of DLSS4 features are actually coming to the older cards. Just not the new multi frame gen thing, which I'm not interested in anyway. But we are getting improvements to ray reconstruction, which looks really rough at times right now in Cyberpunk.

I mean, I don't need to play 2077 a fourth time, but that's pretty neat. I'm sort of glad to not be feeling the FOMO here.
Maf Posted January 7

Yeah, the new DLSS technique to get an even clearer image is more interesting to me than frame gen. So you can 3-4x my frame rate? My monitor only goes up to 144 and I cap most games at 120 anyway.

The thing that will push me is when they start recommending specs for games and the ultra @60 says 5080 instead of 4080, then I will be like oooh. But these consoles aren't going anywhere for another 3-4 years, so I think my 4090 will still play PC games near the higher end of performance for a while yet (hopefully)
DANGERMAN Posted January 7

Maf if you buy a new card I'm calling the police
spatular Posted January 10

Or maybe Maf should buy all the graphics cards and become king of the north or something, I don't know how it works. (A 4090 will definitely be fine for a long long time.)

Don't know if it's been mentioned on here, but there was a new Intel GPU recently which looked like a really good deal at around £250. Apparently it doesn't work as well with older/slower CPUs though, so not as good as it first seemed. This place re-reviewed it:
one-armed dwarf Posted January 10

Yeah I heard the driver CPU overhead was really bad or something.

On the 5090, watching a lot of the new Nvidia stuff has me wondering if we're near the peak of hardware, cause they are pushing the AI stuff so hard while it's also getting a lot harder to push hardware. In the past couple of days I've seen a lot of people say the same thing I said about frame generation feeling weird. I understood it as an acceptable compromise to max out Cyberpunk, but I play too many old games to ever get used to FG in new games. So if that became the norm or something, I dunno. It's pretty disappointing to me.

I've seen some people say that the latency you get with frame gen is similar enough to the latency you get with it off, at its base framerate. But even at that, it's very disorientating getting 45fps latency when the game is running at 90. Everything feels less immediate, less accurate, like your reactions are being interfered with by something. It somehow makes the performance feel worse to me, while giving the illusion that it's actually improved. It's by far my least favorite aspect of Nvidia's suite of DLSS features, and it makes this latest product so unappealing cause it's more of it.

It's impossible to explain the feel of it really, and if you play on a controller there's nowhere near as big an impact. But I like keyboard and mouse, and FG does not feel good with it at all imo. So I think generally speaking it seems Nvidia are making their products more for controller players.

I also disagree with John here when he talks about Cyberpunk not really being a good fit for a game with fast response, you can play that game like a janky version of Titanfall 2 if you want.
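To put rough numbers on that 45fps-latency-at-90fps feeling (a sketch, not a measurement of any specific game or driver: it assumes 2x frame generation where interpolated frames are inserted between real ones and input is only sampled at the base rate, and it ignores render queue and display lag):

```python
# Frame-time arithmetic for 2x frame generation (illustrative only).
def frame_time_ms(fps: float) -> float:
    """Time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

base_fps = 45       # rate the game actually simulates and samples input at
displayed_fps = 90  # rate shown on screen after 2x frame generation

print(f"base frame time:      {frame_time_ms(base_fps):.1f} ms")       # 22.2 ms
print(f"displayed frame time: {frame_time_ms(displayed_fps):.1f} ms")  # 11.1 ms
```

So motion updates every ~11 ms and looks like 90fps, but the mouse is still only being read every ~22 ms, which is roughly the disconnect between visual smoothness and responsiveness being described here.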