Sony hate thread

IMO we've already crossed that finish line. We're now at the point where the victory celebration is over but the racer is still doing laps to keep the hype/attention going when meanwhile everyone went home about an hour ago.

UE5 from a visual standpoint is not doing anything UE4 couldn't do, mocap is churning out hideous uncanny-valley models, and ray tracing is breaking people's computers by making games flat-out not work even on gaming rigs that are supposed to be built for graphics-intensive shit.
Games need to focus on optimization, performance and artistic integrity instead of graphics now. That was the PS3's first mistake: focusing on power that couldn't be translated well into visual performance.
 
I'm not sure if things are stagnating but it feels like it across the board with games. It seems like graphics are getting about as good as they can get, same with hardware requirements, new tech like DLSS and Unreal Engine's Nanite would make rendering graphics easier, theoretically. So far it just seems to be a race to see how many pixels can fit on a screen, but there comes a point of diminishing returns even there. This is the same trend I see everywhere. It's like culture is slowing down in many ways.

Maybe we're on the way to the finish line of the graphics race, similar to toasters, there's just no way to optimize it with reasonable resource costs.
It's hitting the point of diminishing returns with TVs already. The process of creating 8K TVs has had a lot of setbacks because of the small size of the LED diodes and how hard quality control is at that scale. It will likely become financially unfeasible to go much beyond 8K for the average consumer, and video games will likely stop there as well.

If they popularize VR to the mass market you could potentially move in that direction, but I have my doubts that consumers are honestly interested in the "fully immersive experience" that reporters, YouTubers and other video game people tout. I think most people treat video games as an interactive movie or a time-wasting activity. I don't think a large enough percentage of people are looking to feel like they are in the game.
 
Speaking of mocap, I'm seeing a lot of indie films starting to use UE for animation. I guess it makes animation a lot easier vs rigging and doing it all manually.
Epic has also all but stopped updating the actual game-dev part of Unreal now. All major updates since around 4.20 have focused mostly on virtual production, which only becomes more of a focus with each release. They aren't doing anything with Fortnite that requires them to actually update the engine, so it's stagnating and they are only bothering to update it for Hollywood, since AAA studios just modify the engine to suit their needs. The last time there was any actual major work done to the engine was back when Paragon was a thing; they were working on a 'gameplay ability system' that they abandoned half-completed. It's kind of killing indie games that use Unreal.

They also offer no LTS at all, so if you want bug fixes without breaking everything by updating to a new version (or if you don't want the horrible performance of UE5) you are out of luck. Best hope you can brave the massive source code and make a custom build.
 
Epic has also all but stopped updating the actual game-dev part of Unreal now. All major updates since around 4.20 have focused mostly on virtual production, which only becomes more of a focus with each release. They aren't doing anything with Fortnite that requires them to actually update the engine, so it's stagnating and they are only bothering to update it for Hollywood, since AAA studios just modify the engine to suit their needs. The last time there was any actual major work done to the engine was back when Paragon was a thing; they were working on a 'gameplay ability system' that they abandoned half-completed. It's kind of killing indie games that use Unreal.

They also offer no LTS at all, so if you want bug fixes without breaking everything by updating to a new version (or if you don't want the horrible performance of UE5) you are out of luck. Best hope you can brave the massive source code and make a custom build.
Huh never knew Epic has such poor support for their main creative product. Perhaps we will see Blender's use in not just indie films but also in games start to rise.
 
Huh never knew Epic has such poor support for their main creative product. Perhaps we will see Blender's use in not just indie films but also in games start to rise.
Blender dropped its game engine when they rewrote the renderer for 2.8. It wasn't really suited for large projects anyway. I think Godot is getting better, but it's still not ideal for a lot of cases, and it's closer to Unity than Unreal. I'm sure it's great for programmers, but for creatives/artists who can't really program, Blueprints in Unreal are unfortunately almost mandatory.

A lot of AAA companies have actually started using Blender for art now though, Ubisoft donated a bunch to the dev fund, I think Bungie said their entire hard surface team is using Blender now. Blender is under a gig and opens instantly, and the re-writes done in the last few years have made it lightning fast compared to Maya or 3ds, and no other software has anything like Eevee.
 
IMO we've already crossed that finish line. We're now at the point where the victory celebration is over but the racer is still doing laps to keep the hype/attention going when meanwhile everyone went home about an hour ago.

UE5 from a visual standpoint is not doing anything UE4 couldn't do, mocap is churning out hideous uncanny-valley models, and ray tracing is breaking people's computers by making games flat-out not work even on gaming rigs that are supposed to be built for graphics-intensive shit.
Can confirm. I've tried ray tracing in the few games I have that support it, and I cannot tell the difference at all. It only seems to serve to tank my framerate. I hate it.

If they popularize VR to the mass market you could potentially move in that direction, but I have my doubts that consumers are honestly interested in the "fully immersive experience" that reporters, YouTubers and other video game people tout. I think most people treat video games as an interactive movie or a time-wasting activity. I don't think a large enough percentage of people are looking to feel like they are in the game.
I firmly believe VR will never be anything but niche. There are way too many hurdles. It's got everything working against it, such as high system requirements, very few must-haves, and the headset itself is just uncomfortable. It tugs on my hair. And then you've got people who get motion sickness, or have other health problems, or just not enough space in their homes, the list goes on. It's ridiculous that Sony's focusing on PSVR2, but, whatever. I'd much rather see a third PlayStation Portable, and for them to actually advertise it this time.
 
Can confirm. I've tried ray tracing in the few games I have that support it, and I cannot tell the difference at all. It only seems to serve to tank my framerate. I hate it.
Raytracing's only practical use is letting developers bake their scenes faster. Real-time raytracing for games is stupid and only serves to make developers' lives easier by removing the need to properly set up baked lighting (UVs, texture resolution, etc.). If games actually need dynamic lighting, there are rasterization techniques that provide global illumination and the like MUCH cheaper (voxel GI dates back to 2014). All it amounts to is an excuse for developers to cheap out on optimization even more.

I got an RTX card because I do work in Blender and Unreal. I use it to preview baked lighting in real time, but I still bake it. It cut my render time from hours down to minutes. For creative work real-time raytracing is a godsend, but for games it's a gimmick at best.
 
And then you've got people who get motion sickness, or have other health problems, or just not enough space in their homes, the list goes on. It's ridiculous that Sony's focusing on PSVR2, but, whatever.
Sounds like they're banking on VR the way MS was banking on the Kinect, despite the minimum requirements to even USE the dongle being impractical for many consumers.

Raytracing's only practical use is letting developers bake their scenes faster. Real-time raytracing for games is stupid and only serves to make developers' lives easier by removing the need to properly set up baked lighting (UVs, texture resolution, etc.).
Is raytracing just technical jargon for improved light detection or reflection? What is even the appeal of it?
 
Sounds like they're banking on VR the way MS was banking on the Kinect, despite the minimum requirements to even USE the dongle being impractical for many consumers.
Even Kinect makes more sense than PSVR, it can be used passively for voice commands, and if it were, you know, not awful, swiping through the air could have eventually worked well. I actually liked the Kinect and saw a lot of potential in it, so it's a shame it just petered out.

I've thought about making a thread about Kinect and where it is today, but it just wouldn't be very interesting. If you care at all, there's actually a new Kinect that just kind of... exists. It's not for Xbox, it's barely for the home, it's, uh... well, the official website comes off like "Here's a neat camera that can do all sorts of cool stuff. What's it for? I dunno, you figure it out. Try exercising in front of it, put it in your workshop, I dunno. Here's a big pile of software development kits, good luck"
 
Is raytracing just technical jargon for improved light detection or reflection? What is even the appeal of it?
Raytracing is the simulation of light rays (a simplification of light, since including waves would mean quantum physics). In practice the rays are fired from the camera through each pixel rather than from the light sources, and each surface a ray hits contributes to the calculated colour (there are videos that explain it properly). It's been used in CGI for decades, but it was too computationally expensive to run in real time. Because of that, games had to 'fake it' with rasterization techniques (in raytracing, each pixel's colour is the combined contribution of the rays traced through it, while rasterization approximates lighting with simplified vector math and then 'rasterizes' the result to pixels).

Using raytracing means that the lighting is much more realistic, and because it's real time and takes into account every pixel in a scene, it is affected by everything around it. It's the most realistic type of rendering because it is a simulation of light.
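To make the camera-ray idea above concrete, here's a minimal toy raytracer sketch in Python — one hard-coded sphere, one directional light, Lambert shading only. Every name here (`render`, `ray_sphere`, the scene setup) is invented for illustration, not taken from any real renderer:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def ray_sphere(origin, direction, center, radius):
    # Nearest positive hit distance along a unit-length ray, or None on a miss.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic with a = 1, since direction is unit length
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(width=16, height=16):
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = normalize([2.0, 2.0, 1.0])   # directional light
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            # Fire a ray from the camera (at the origin) through this pixel.
            x = (i + 0.5) / width * 2 - 1
            y = 1 - (j + 0.5) / height * 2
            d = normalize([x, y, -1.0])
            t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
            if t is None:
                row.append(0.0)          # ray escaped: background
            else:
                hit = [t * dd for dd in d]
                n = normalize([h - c for h, c in zip(hit, center)])
                # Lambert: brightness falls off with the angle to the light.
                row.append(max(0.0, sum(a * b for a, b in zip(n, light))))
        image.append(row)
    return image
```

Each pixel's value is the contribution of the single ray traced through it; a real path tracer averages many bounced rays per pixel instead of stopping at the first hit, which is exactly where the cost blows up.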

Disney has a video about path tracing (the modern implementation of raytracing; they're the same thing, with a more advanced and highly technical reason for the different name).
 
Even Kinect makes more sense than PSVR, it can be used passively for voice commands, and if it were, you know, not awful, swiping through the air could have eventually worked well. I actually liked the Kinect and saw a lot of potential in it, so it's a shame it just petered out.
I loved my Kinect as well. I wish more games utilized its resources and MS handled it better with the Xbox One. In hindsight, Kinect is more accessible than VR. No controllers, just stand in front of the sensor.

Rare, Harmonix, and Double Fine were the only ones that knew what Kinect was capable of and gave a damn about their products. I'm not counting Ubisoft just because of Just Dance.
 
I've thought about making a thread about Kinect and where it is today, but it just wouldn't be very interesting. If you care at all, there's actually a new Kinect that just kind of... exists. It's not for Xbox, it's barely for the home, it's, uh... well, the official website comes off like "Here's a neat camera that can do all sorts of cool stuff. What's it for? I dunno, you figure it out. Try exercising in front of it, put it in your workshop, I dunno. Here's a big pile of software development kits, good luck"
I think the 'new' Kinect uses the same hardware as the Xbone Kinect, just slimmed down. I actually use my 360 Kinect and Xbone Kinect at the same time to get surprisingly decent motion capture. They are also pretty good for basic 3d scanning without shelling out hundreds of dollars for a LIDAR camera.
 
I think the 'new' Kinect uses the same hardware as the Xbone Kinect, just slimmed down. I actually use my 360 Kinect and Xbone Kinect at the same time to get surprisingly decent motion capture. They are also pretty good for basic 3d scanning without shelling out hundreds of dollars for a LIDAR camera.
Yeah, I picked up a cheap Kinect 2.0 but never got around to buying the USB adapter. I've wanted to play around with 3D scanning, but man plans and God laughs.
 
I swear, I think when Rocksteady went to WB and pitched a Superman game (which is what they should have fucking made in the first place, not this pile of shit), I bet WB came back and said "nah we don't have any Superman movies coming out but we got more Suicide Squad, so go make a Suicide Squad game, we can monetize the shit out of it." A dev like Rocksteady that made 3 incredible games doesn't just turn into shit that quick, WB has to be heavily interfering. Deadshot being black instead of white like in Arkham City is evidence of this, I think they wanted him to resemble Will Smith.
WB announced a Suicide Squad game back in 2010, before Arkham City even came out. They spent years trying to find anyone who would actually create the damn thing. Rocksteady wanted to make Superman more than anything, but WB didn't like their pitch for a seemingly de-powered Man of Steel and finally found a studio to pawn this shit show off on.

I'm not sure if things are stagnating but it feels like it across the board with games. It seems like graphics are getting about as good as they can get, same with hardware requirements, new tech like DLSS and Unreal Engine's Nanite would make rendering graphics easier, theoretically. So far it just seems to be a race to see how many pixels can fit on a screen, but there comes a point of diminishing returns even there. This is the same trend I see everywhere. It's like culture is slowing down in many ways.
Personally, I think the race for realism has stagnated, but I don't think graphics have. Video games seem to be hitting the same rut as CGI animation. The 2000s were an arms race for the best and most realistic tech, followed by the 2010s being better but sort of stagnant. Then Spider-Verse came along and now the game has shifted to pushing tech to be more stylized. Modern games with realistic styles look like shit since they can never fully perfect it, but honestly the more cartoon-designed titles seem to really benefit from the PS5's power. Personally, I think the best-looking titles of the 8th and 9th gens are Arkham Knight, which is a heavily comic-book-stylized realism tbh, and something like Ratchet, which again is a cartoon-styled realism. There aren't many good references currently, but I feel like a painted style, or a straight comic-book look like Puss or Spider-Verse have in animation, could honestly bring out the best in current tech.
 
I think the 'new' Kinect uses the same hardware as the Xbone Kinect, just slimmed down. I actually use my 360 Kinect and Xbone Kinect at the same time to get surprisingly decent motion capture. They are also pretty good for basic 3d scanning without shelling out hundreds of dollars for a LIDAR camera.
It was good enough for airports.
 
"Here's a neat camera that can do all sorts of cool stuff. What's it for? I dunno, you figure it out. Try exercising in front of it, put it in your workshop, I dunno. Here's a big pile of software development kits, good luck"
I read an article a few years ago that said the Kinect had begun to take off in medical settings, but fuck me if I remember how or why.
 
They are also pretty good for basic 3d scanning without shelling out hundreds of dollars for a LIDAR camera.
You know you can use six webcams for that, right?
Other than that, it's true that the Kinect has some really good scanning on it. Hopefully they were able to upgrade this tech and slim it down a notch. By the time I had money to buy a Kinect they had gone out of production, and I'm not buying second-hand shit.
 
It was too computationally heavy to use for many offline renders as well, unless they were stills. I think Monsters, Inc. was the first time Pixar used raytracing.

Realtime software rendered RT from 2003:
https://youtube.com/watch?v=XS4FqNTO2xA
Shadows done with CGI are usually raytraced, I believe, at least in mid-to-late-90s CGI; it's much cheaper to calculate a simple boolean for shadows with no bounce lighting. (That's for sharp shadows; soft shadows are much, much more expensive to raytrace, so those were probably PCF-filtered shadow maps for a while.) That's probably what that video is. I think shadow mapping was still used for a time, though. The video also looks like it uses only parametric models, so it's much cheaper to calculate.
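That boolean shadow test really is just one extra ray per surface point: fire it at the light and see if anything is in the way. A hypothetical minimal sketch (sphere occluders only; `in_shadow` and `hit_sphere` are invented names, not from any real renderer):

```python
import math

def hit_sphere(origin, direction, center, radius):
    # Nearest positive hit distance along a unit-length ray, or None on a miss.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None  # small epsilon avoids self-shadowing

def in_shadow(point, light_pos, occluders):
    # The simple boolean: one shadow ray from the surface point to the light.
    # Any hit closer than the light means the point is dark; no bounce lighting.
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    d = [x / dist for x in to_light]
    for center, radius in occluders:
        t = hit_sphere(point, d, center, radius)
        if t is not None and t < dist:
            return True
    return False
```

This gives the hard-edged shadows described above; soft shadows need many jittered shadow rays toward an area light, which is exactly where the cost explodes.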

I looked up the Pixar thing; it's pretty interesting how they rendered movies before Monsters University (not Inc.). They used something called 'Reyes rendering', which is really just a Gouraud renderer, but each object is sliced into micropolygons the size of a pixel, so the per-vertex shading limitation of Gouraud is overcome while staying (relatively, for the time) cheap to render.

Apparently they only added actual raytracing to RenderMan for Cars, and it was a hybrid voxel technique that was partly baked out, used for reflections. Monsters University was the first time they used a path-traced GI solution.
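The Reyes idea described above can be sketched in a few lines: dice a parametric surface into a vertex grid fine enough that each quad is roughly pixel-sized, then shade per vertex, Gouraud-style. All the function names here are invented for illustration (this is nowhere near real RenderMan):

```python
import math

def dice(surface, rate_u, rate_v):
    # Reyes 'dicing': evaluate the parametric surface on a grid; each cell
    # of the grid is one micropolygon, ideally about a pixel across on screen.
    return [[surface(u / rate_u, v / rate_v) for u in range(rate_u + 1)]
            for v in range(rate_v + 1)]

def shade_vertices(grid, shader):
    # Per-vertex (Gouraud) shading: because micropolygons are pixel-sized,
    # shading only at the vertices still effectively gives per-pixel shading.
    return [[shader(p) for p in row] for row in grid]

def hemisphere(u, v):
    # Toy parametric surface: a unit hemisphere.
    theta, phi = u * math.pi, v * math.pi / 2
    return (math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi))

def lambert(p):
    # Toy shader: brightness from a fixed directional light along +z.
    light = (0.0, 0.0, 1.0)
    return max(0.0, sum(a * b for a, b in zip(p, light)))
```

A real Reyes renderer picks the dicing rates from the patch's projected screen size and then samples the shaded micropolygons against the pixel grid; this only shows the dice-then-shade-per-vertex order that makes Gouraud's limitation disappear.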
 