Absolutely bizarre that a 1st party title doesn’t seem optimized for the console they’re developing for. This makes me skeptical that the PC version will be optimized either.
Might just be my middle-aged eyes, but I recently went from a 75Hz monitor to a 160Hz one and I’ll be damned if I can see the difference in motion. Granted, I don’t play much in the way of twitch-style shooters anymore, but for me the threshold of visual smoothness is closer to 60Hz than the bonkers 240Hz+ refresh rates that current OLEDs are pushing.
I’ll agree that 30fps is pretty marginal for any sort of action gameplay, though historically console players have been more forgiving of mediocre performance in service of more eye candy.
Are you sure you have the refresh rate set correctly on your video card? The difference between 75hz and 160hz is very clear just by moving your mouse cursor around. Age shouldn’t have anything to do with it.
Quite sure – one game I’ve been playing lately (and the exception to the lack of shooters in my portfolio) is Selaco, so I ought to have noticed by now.
There’s a very slight difference in smoothness when I’m rapidly waving a mouse cursor around on one screen versus the other, but it’s hardly the night-and-day difference that going from 30fps to 60fps was back in Ye Olden Days, and watching a small, fast-moving, high-contrast object doesn’t make up the bulk of gameplay in anything I play these days.
If the two are beside each other, you’ll definitely see the difference.
The old one and the new one are literally side by side on my desktop, don’t know what to tell you…
Hmm, I’ve found it quite noticeable. Perhaps turn an FPS counter on and see what it’s actually running at. If you have a game showing on both screens, it’ll likely limit the fps to suit the lowest display hz.
This is a good point. A lot of people just assume plugging the monitor in gets them the higher refresh rate, but often you have to select it manually in your display settings.
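For what it’s worth, a quick frame-time probe will tell you what you’re actually getting, independent of what the OS claims. A minimal sketch in Python (stdlib only; the sleep is just a placeholder standing in for a real render/present call):

    import time

    # Measure actual frame pacing over ~120 "frames".
    # The sleep below is a placeholder workload; in a real game loop
    # you'd sample perf_counter() around the render/present call.
    frames = 120
    start = time.perf_counter()
    last = start
    worst = 0.0
    for _ in range(frames):
        time.sleep(1 / 160)  # placeholder: aim for a 160Hz-ish frame
        now = time.perf_counter()
        worst = max(worst, now - last)
        last = now

    elapsed = time.perf_counter() - start
    print(f"avg: {elapsed / frames * 1000:.2f} ms/frame "
          f"({frames / elapsed:.1f} fps), worst: {worst * 1000:.2f} ms")

If the average comes out near 16.7ms, you’re effectively running at 60Hz no matter what the monitor’s spec sheet says.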
Games feel almost disgusting on 60hz now, but they felt fine before I tried 144hz.
Maybe if I was stuck at 60hz for a long time I’d get used to it.
Now though, if I switch for 30m I can’t ignore the difference.
Do you primarily game on mouse or controller?
Mouse, mostly. I’ve noticed that I feel lag much much more with mouse.
A 160hz refresh rate gives the software only about a 6.25ms render budget; do things actually even run at that rate?
If your comp is good enough, absolutely. Strong PCs now can run sub-5ms frame times at 4K pretty regularly, especially for competitive games that aren’t designed to look incredible.
I can confirm 3-5ms frametimes with a popular shooter at 165hz.
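The arithmetic is simple enough: the budget is just 1000 / Hz. A quick sketch (Python, plain arithmetic):

    # Frame-time budget at a given refresh rate: budget_ms = 1000 / hz
    for hz in (60, 75, 144, 160, 165, 240):
        print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms/frame")

165Hz works out to about 6.06ms per frame, so holding 3–5ms frametimes leaves real headroom.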
It really depends what one’s doing, also. For many things, including many games, 30fps is fine for me, but I need at least 60fps for mousing. Beyond 60fps I don’t notice the mouse getting any smoother, though in some games I do have a better experience at 120fps. And I’m absolutely sold on 500+ fps for simulating paper.
That’s weird. I’m getting to the age where I wouldn’t see the point of 4K; I’d need to have my head on top of the screen to see it. But refresh rate can be felt in fluid scrolling etc., and, even if only on an unconscious level, it definitely improves awareness in games too.