This smells like investor-baiting. Studios don’t really need to announce that they’re going “aggressive” in using a certain tool.
Minecraft. I wanted to host a dedicated Minecraft server, so I rented a VPS and needed a free, lightweight OS. I’ve been tinkering with Linux ever since.
I love Linux and Windows, I wouldn’t trade one for the other.
I paid for Overwatch too but tbh I’d rather have Overwatch 2. There’s no loot box bullcrap and the playerbase now is actually pretty sizeable compared to OW1 near the end.
I do miss the free rewards but they’re just cosmetics. A bigger playerbase is more beneficial to the game than more free skins.
Overwatch 2 is great so it’s not all bad.
Why do you want OW1 back?
Nvidia: Creates a whole new app for driver updates and ShadowPlay.
Also Nvidia: Creates the most half-assed GPU tuner in the app.
Seriously I can’t even undervolt without a third party app. Once PyTorch + ROCm comes to Windows I’m switching back to AMD.
If I set the resolution to 1024x768 and the graphics to Low but the FPS is still the same, something is wrong.
I love Starfield and have been playing it every day since launch. It runs like dogshit. Sure, it doesn’t stutter or anything, but I can’t, for the life of me, get the average FPS in outdoor areas any higher than 70. 5800X + 3080 Ti. It doesn’t matter how much I lower the settings, the CPU overhead is crazy.
In New Atlantis City outdoors? Mine barely stays above 60 FPS, sometimes dipping under.
Except this time even with 1024x768 and lowest settings you can barely break 60 FPS due to the huge CPU overhead.
And that’s with a Ryzen 7 5800X.
Only if you run out of VRAM. If there’s sufficient VRAM the frame rate barely changes between Lowest and Highest texture quality.
The game might be much more CPU bound on Nvidia cards. Probably due to shitty Nvidia drivers.
I have a 5800X paired with a 3080 Ti and I can’t get my frame rate to go any higher than 60s in cities.
60 FPS is quite smooth and playable but far from perfectly smooth. There’s still noticeable juddering on continuous camera motion.
Bullshit, there’s no “standard” FPS for a certain genre. Also the 3080 Ti is a $1200 last gen GPU and the 5800X is a $450 last gen CPU. It’s ridiculous that they can’t even push 100+ FPS at the lowest settings. The CPU overhead in this game is insane. I used to target 120 FPS minimum for all games I play, hence the high-end build, but now even 90 FPS is too much? lmao
How about people with a Ryzen 5 5600 and RTX 3060 who want to play at 60 FPS? Keep in mind that we’re not talking about 120 FPS, just a measly 60 FPS, and those parts are barely 2 years old.
Nah 60 is not good enough for me. I’m fine if it’s a mobile game or handheld. I have no problems getting 90 FPS minimum in A Plague Tale: Requiem and Cyberpunk 2077.
In Starfield, not even 720p with lowest settings will help because the game is very heavily dependent on CPU. Looking at HW Unboxed benchmarks, the 5800X only managed to do 57 FPS average. You need a 7800X3D or a 13600K to get 90 FPS average.
Exactly my point. I want 90 FPS at least and lowering the settings didn’t help at all.
I have a PC with 5800X, 3080 Ti, and 64 GB DDR4-3600. I play at 1440p with 80% render scale, Medium-High settings (mostly Medium) and it’s barely above 60 FPS outdoors. It runs like shit.
Luckily it can go 140+ FPS indoors.
This is actually ultra rare Tim Sweeney W.
No need to act like it’s breaking copyright laws when, in its current state, it isn’t even legally defined.
It’s on Game Pass if you’re on PC/Xbox and your country is covered. Feels good to be able to play a triple-A game for “free” tbh.
Are we defending/justifying toxicity now?