Sony hate thread

Nope, fully 3D; the game has no pre-rendered backgrounds. It's a 2006 release, so of course it's pushing the PS2 to its limits.
That's really impressive then. I do remember those games being praised for their graphics at the time.
 
Majora's Mask is praised now, but it was one of the first divisive Zelda games, and it didn't do much to push the series ahead outside of some gimmicks. Personally I like it more in theory than in actuality; the time limit shit just sours everything, and I'm not alone in thinking that. I think that stuff affected sales. The fanbase is pretty finicky: Wind Waker is praised now too, but all it took to turn people off at the time was the art.
Agreed, I don't tend to enjoy time limits since I like going through every nook and cranny. The fact that I managed to fuck up a dungeon before finishing it and had to rewind killed my interest in engaging any further then and there. Wind Waker was a much better experience, but even then, the map was too big for its own good, and once I hit the Triforce section, laziness won.
Imma take simple 1080p and let my display stretch that to 2K.
Agreed, I have played through Steam Link and with my Deck docked on my 4K TV, and things look fine to me (well, the Link had shitty signal days, but that's another issue...). Does that mean I have zero standards? Very likely, but thankfully I haven't trained my eye to give a shit.
 
Also, literally fucking how? I thought the PS5 and Series X can't even manage 4K without some dedicated mode that knocks everything down to 30 FPS, much in the same way the PS4 Pro couldn't handle it without a dedicated 30 FPS mode.
I'm far from an expert, but the expected Switch version of 4K is some fake 4K, upscaled or something, which supposedly isn't true 4K but is close enough for many people.
The general consensus is that the Switch 2 will have Nvidia DLSS and will use that to upscale to 4K.
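Rough napkin math on why that approach works (nothing here is confirmed about the hardware; the 1080p internal resolution is just an assumption for illustration):

```python
# Back-of-the-envelope pixel counts: native 4K vs. a lower internal
# resolution that gets upscaled (the DLSS-style approach people expect).
# The 1080p internal resolution is an assumed figure for illustration.

def pixels(width: int, height: int) -> int:
    """Total pixels shaded per frame at a given resolution."""
    return width * height

native_4k = pixels(3840, 2160)       # 8,294,400 pixels per frame
internal_1080p = pixels(1920, 1080)  # 2,073,600 pixels per frame

print(f"Native 4K:      {native_4k:,} pixels/frame")
print(f"1080p internal: {internal_1080p:,} pixels/frame")
print(f"The GPU shades ~{native_4k / internal_1080p:.0f}x fewer pixels before upscaling")
```

The upscaler reconstructs the missing detail, which is why it reads as "fake 4K" that's still close enough for most people.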
 
To further elaborate: to match a 60Hz CRT display you would need around 1000Hz on a sample-and-hold display, because of how the phosphors on a CRT work. You also need a high resolution, 4K or above (8K is the ideal), to properly render a simulation of the slot mask and phosphors. Simulating a CRT properly is no easy task, and it's also GPU intensive.
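To put rough numbers on that 1000Hz figure (just a sketch of the persistence argument, with an assumed ~1 ms phosphor flash, not an exact phosphor model):

```python
# Rough persistence math behind the "~1000Hz to match a 60Hz CRT" claim.
# A sample-and-hold display keeps each frame lit for the entire refresh
# interval, and that hold time is what your eye perceives as motion blur.
# A CRT phosphor only flashes briefly; ~1 ms is an assumed ballpark here.

CRT_PHOSPHOR_PERSISTENCE_MS = 1.0  # assumed approximate phosphor glow time

def hold_time_ms(refresh_hz: float) -> float:
    """How long each frame is held on a sample-and-hold display, in ms."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 240, 500, 1000):
    print(f"{hz:>4} Hz sample-and-hold: frame held for {hold_time_ms(hz):.2f} ms")

# Refresh rate needed so the hold time shrinks to the CRT's brief flash:
needed_hz = 1000.0 / CRT_PHOSPHOR_PERSISTENCE_MS
print(f"Matching ~{CRT_PHOSPHOR_PERSISTENCE_MS:.0f} ms persistence takes ~{needed_hz:.0f} Hz")
```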

If you really want to play your games the way they were intended, try to find a CRT; they are pretty cheap if you aren't looking for one of the meme ones that all the redditors are after because of that stupid Digital Foundry video. The motion clarity on a good CRT is still unrivaled, even compared to OLEDs.
I think they'll get there within a decade. Higher-end models are approaching 500Hz, though 8K for a monitor might take a while to reach a reasonable price point. Sadly I don't have enough room for a dual monitor setup, let alone a CRT. But I am happy with my current setup for emulation, so I can wait.
 
Exactly right. Games like Metal Gear Solid 2 or Final Fantasy X were a big deal because they represented a huge graphical leap that was evident to even the normiest normie.


But nowadays? I'd really have to look hard to see any difference at all between eighth and ninth-gen games. And I'm a nerd who's into this kind of shit.
I've been replaying a bunch of old western games on the Nintendo Switch lately, such as BioShock 1+2, Red Dead Redemption, and Batman Arkham Asylum/City, and they still hold up pretty well graphically despite their initial releases ranging from 13 to 17 years ago (time sure flies by).


But it was also a time when artistic integrity hadn't yet been abandoned with the leap to the HD generation, so things were still aesthetically pleasing to look at.
 
I've been replaying a bunch of old western games on the Nintendo Switch lately, such as BioShock 1+2, Red Dead Redemption, and Batman Arkham Asylum/City, and they still hold up pretty well graphically despite their initial releases ranging from 13 to 17 years ago (time sure flies by).
Yeah, I tried out Red Dead Redemption on a Switch emulator a few weeks ago and still found it to be very visually pleasing. Does RDR2 look better? Sure. Does the difference substantially improve my experience? Does the increased visual fidelity increase the emotional impact of the story or the immersive quality of the world or how fun the gameplay is? I don't think so.

Maybe that's just part and parcel of becoming old (did people in their 30s say "eh, whatever" when the PlayStation 2 launched?), but based on the fact that 10+ year old games like Grand Theft Auto V still have very large player bases, even among Zoomers, I think it's a deeper phenomenon than that. As I've said before, everything looks good enough, and it's been that way for quite some time.
 
Does RDR2 look better? Sure. Does the difference substantially improve my experience? Does the increased visual fidelity increase the emotional impact of the story or the immersive quality of the world or how fun the gameplay is? I don't think so.
A big problem with RDR2 is all the time spent on stupid shit.

"These NPC's dig *actual* holes! These NPC's *actually* eat all the food on their plate! These NPC's *actually* cut the tree down!"

It's the worst part of "realism" in gaming. I wonder how much of RDR2's development was spent on bullshit that you only notice if you actively stop playing the game and just stand around?

4K resolution is nigger technology, amd this philosophy on designing shit that does absolutely nothing to enhance the gaming experience is nigger game/world design.

And I like RDR2, but I never once stopped to watch someone eat their whole plate of food, hit a tree with the axe 10 times to cut it down, or stand around for 10 minutes while someone dug a ditch.
 
The 8K thing could be seen as false advertising. Do you think someone could sue them for advertising things the machine is not capable of?
 