• 0 Posts
  • 19 Comments
Joined 1Y ago
Cake day: Aug 20, 2023


Here’s a link to the gameplay reveal so people can see what you’re talking about:

https://youtu.be/CTNwHShylIg?si=ebVtoc-xD7eVMOjX

The art style and tone look much better here than in the weird trailer, but the gameplay looks closer to Mass Effect 2 than Dragon Age: Origins. Probably going to skip this one.


Same here. The 5800X3D is great, and I’d rather not buy a new motherboard and everything else just yet.


The 7950X has 16 cores. I think what the article is suggesting is that the very top of the line in the next gen could potentially double that, up to 32. I’d imagine if that happened, though, the more midrange parts would still be in the 12–16 core range. I guess we’ll see when they come out.


I feel your pain. It’s just such a popular CPU that it never seems to come down in price as much as you’d think. Kind of the endgame for anyone who doesn’t want a new motherboard.


Personally, if I already had a 5800X, I probably wouldn’t upgrade to the 3D, though there would likely be some gains, especially if you’re CPU bound in a game.

Here are about 40 games where they’re compared on an Nvidia 3090:

https://www.techspot.com/review/2451-ryzen-5800x3d-vs-ryzen-5800x/

I upgraded from a 3700X to a 5800X3D, so there was a big boost.

But a 5800X3D isn’t even much cheaper than a 7800X3D, and the socket type has switched now. So if I already had a 5800X, I’d probably just wait and switch to a 3D chip in the future when I was ready to upgrade my motherboard. If cost is no object and you’re not going to swap the motherboard for a long time, then yes, it’s the best gaming CPU you’ll ever be able to use with that board.


That the eye can only perceive 24 fps is a myth. Visual perception is complicated, involving many different processes, and your eyes and brain don’t strictly perceive things in frames per second. 24 fps is a relatively arbitrary number picked by the early movie industry: it stays comfortably above 16 fps (below which the illusion of continuous motion breaks down) without wasting too much extra film, and it’s a nice, easily divisible number.

The difference at higher frame rates is quite obvious. Just grab any older PC game so you can hit a high frame rate, then cap it at 24, and the difference is night and day. The many people who complained about how much they hated the look of The Hobbit with its 48 fps can attest to this as well. You certainly do start to get diminishing returns the higher the frame rate goes, though. Movies can also be shot to deliberately avoid quick camera movements and other things that wouldn’t look good at 24 fps, but video games don’t always have that luxury. For an RPG or something, sure, 30 fps is probably fine. But fighting, action, racing, anything with a lot of movement or especially quick camera motion starts to feel pretty bad at 30 compared to 60.


These loot boxes are merely a highly artistic statement on the uncertainties of life and a runaway capitalistic society! We are as shocked as anyone that people have gotten addicted and lost thousands of dollars to our, uhhhh, art. Yeah.


I don’t think it’s particularly GPU intensive like you’d expect for a graphically intense game; there’s a heavy CPU bottleneck from NPC calculations, which some have suggested comes from a lot of physics simulation on the NPCs. The NPCs also have severe pop-in issues in the city. For most people playing this, the GPU isn’t going to be the issue, though even the most powerful gaming CPUs can only take it so far in its current state.


Bring in the guy who fixed the first one with Dark Arisen, please.


I will be pleasantly surprised if this ends up being decent, considering the prolonged development hell it has been in.



This game’s development is going real fast!

In comparison to The Elder Scrolls VI, lol


Yeah, I wasn’t ready to swap out my whole motherboard, so I got a 5800X3D. Still a little on the pricier side (~$320), but many games really love that extra-large cache. It should hopefully keep me going for quite a while before I have to change sockets. There are cheaper options that would still be a good upgrade; a 5700X is about $170. A couple of games recently, like Baldur’s Gate 3, have been very CPU intensive.


They haven’t done it yet. It seems to be the natural order of these subscription services though. I worry it’s only a matter of time.


Why can’t we just own games anymore? Sure, it’d be cool to have your service available on all devices. But once it reaches critical mass and kills off competitors and other ways of getting games, expect enshittification to ensue, with subscriber costs and advertising going way up. Just look at what’s happening with every TV/movie streaming platform now. I’m guessing games you can only access via Game Pass and can’t purchase separately at all are coming at some point too.


There seems to be a bug with the main star/sun not showing up on some AMD cards. I don’t recall if Xbox has this glitch, but it uses AMD graphics as well, so it seems possible. Hopefully Bethesda or AMD or whoever is responsible for the bug can fix it soon.


I’m not sure; I’ve been trying to find the answer. But they’ve stated FSR 3 will continue to be open source, and prior versions have supported Vulkan on the developer end. It sounds like this is a solution for using it in games that didn’t necessarily integrate it, though, so it might be separate. Unclear.


They’ve also stated FSR 3 will continue to be open source, and previous versions have been compatible with Vulkan, at least on the developer end. What I can’t find is whether this new HYPR-RX application, which runs it without any developer integration, supports Vulkan. Guess we’ll find out when it’s released shortly.


I hope this works out and becomes a viable competitor to DLSS 3, especially with this most recent generation of games getting so demanding spec-wise. I also appreciate that they make it available for any graphics card from any company. Nvidia certainly has an edge in proprietary features that AMD is having trouble matching at the moment, but Nvidia becoming even more dominant is bad news. Lack of competition will only encourage them to stagnate and raise prices even higher. I’ll probably be looking to upgrade my own GPU soon, so I’m very interested in how the just-announced AMD 7800 XT compares against the Nvidia 4070.