but at the cost of competitive fairness??? No thanks.
You do realize current CoD is matchmaking based on engagement, right?
If it takes 3 matches of getting stomped to exit the game, it’ll give you an easy win every 3 games.
If the first number changes, so does the second.
That’s not even getting into how buying skins artificially reduces the second number.
Like, after accounting for just those two things, it’s kind of absurd to complain about fairness unless it’s competitive mode, and I’m pretty sure those already have pc/console splits.
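The "stomp three times, then hand out a win" pattern described above can be sketched as a toy engagement-based matchmaker. The function name and thresholds here are my own invention for illustration, not anything from an actual CoD system:

```python
# Toy sketch of engagement-optimized matchmaking (EOMM-style).
# All names and thresholds are illustrative, not Activision's actual system.

def pick_lobby_difficulty(recent_results: list[str],
                          quit_after_losses: int = 3) -> str:
    """Return which kind of lobby to place the player in next.

    recent_results: newest-last list of "win"/"loss".
    quit_after_losses: estimated loss streak before the player quits.
    """
    # Count the player's current loss streak, newest game first.
    streak = 0
    for result in reversed(recent_results):
        if result == "loss":
            streak += 1
        else:
            break
    # One bad game away from quitting? Hand them an easy lobby.
    if streak >= quit_after_losses - 1:
        return "easy"
    return "fair"

print(pick_lobby_difficulty(["win", "loss", "loss"]))  # easy
print(pick_lobby_difficulty(["loss", "win", "loss"]))  # fair
```

The point of the sketch: the matchmaker is optimizing for session length, not match fairness, so changing the estimated quit threshold directly changes how often easy lobbies get served.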
Player input uniformity is one of the things that grinds my gears most in gaming. It’s especially problematic with my favorite genre: competitive FPS games.
That’s how it was split…
A PC controller used to be matched like a console controller. Now PC controllers will be grouped with PC mouse players…
You typed a lot but don’t seem to know what you’re talking about, bud
Yet with the sudden closure of Monolith Productions and cancellation of its Wonder Woman project, that particular success seems doomed to life in the vault. Though the Nemesis system was due to be implemented in the Wonder Woman game, Warner Bros. still owns the patent for the system, which stands with an expiration date in 2036.
The fact that a game mechanic can be patented for over 20 years is fucking insane…
What did avowed do wrong?
That’s what I’m talking about…
It’s not that they do anything wrong, it’s that they don’t do anything well.
It was less an RPG and more just an FPS with the same bare minimum of RPG elements as every other shooter.
The biggest games are the blandest because they’re trying to appeal to everyone. But even people not into RPGs can like a good RPG. Which is why BG3, CK2, and KCD2 are so popular.
Big studios aren’t underestimating their audience, they’re just counting on video game enthusiasts buying every major game.
I ran thru it on game pass, and it was alright, but I was kind of just going thru the motions. So I said “terrible” but I meant it was terrible at being a big budget triple A RPG.
It felt like a generic fps from 20 years ago, the only flavor was from existing IP pasted over it.
It’s not just games, it’s TV and movies too.
Cramming sex into everything made sense when we weren’t constantly 5 seconds away from being able to see pretty much any kind of porn someone could imagine.
Now tho, that stuff isn’t a reason to watch a show. If you just cared about that, you’d watch the clip online
Ding also apparently wants NetEase to focus on games like the multiplayer mobile game Eggy Party
Oh great, another remake…
Rather than playing on a high-end PC, I spent most of my time with KCD2 using medium settings on a mainstream build, complete with a Ryzen 5 3600 CPU and RTX 4060 or RTX 4070 Super graphics card. Even with the older and/or mid-range components, frame-times remained solid when targeting 60fps.
I have no idea what they’re doing if a 4070 Super only got 60fps on medium… I was getting 50-70 fps at 4K ultra with quality upscaling and no frame gen
Or why it’s “older and/or mid-range”.
Like, you can’t buy a 5000 series even if you wanted to, and a 4070 Super is better than what most gamers are using today
The author is aware that there’s a big bias where reviewers have the best rigs, but they’re drastically out of touch with what most people are really using
The gold standard in games retaining playerbases in this genre is Baldur’s Gate 3, which still puts up over 100,000 players nightly, though it released just over a year and a half ago.
“Released”…
Going off memory, but I think BG3’s act 1 was out before Cyberpunk or close to the same time. I had 100+ hours before the game released, and that was with huge gaps in play time. I think it was in beta for at least two years.
The X3Ds are the best for gaming… which means overkill if you’re not pairing one with a higher-end GPU.
But it depends on what GPU and motherboard you have, and then whether you’re just trying to upgrade the CPU or building new.
Tom’s Hardware isn’t as good as it used to be, but they do a good job with their CPU and GPU comparisons.
If you’re not changing your GPU anytime soon, use an online calculator to find out what CPU matches it so you don’t bottleneck. A little either way isn’t a big deal.
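A rough sketch of what those online bottleneck calculators are doing under the hood. The scoring function and the sample relative scores below are made up for illustration, not real benchmark numbers:

```python
# Illustrative "bottleneck" check like the online calculators mentioned
# above. Scores and pairings are hypothetical, not real benchmark data.

def bottleneck_pct(cpu_score: float, gpu_score: float) -> float:
    """Percent imbalance between CPU and GPU scores (0 = perfectly matched)."""
    return abs(cpu_score - gpu_score) / max(cpu_score, gpu_score) * 100

# Hypothetical relative performance scores for two pairings:
pairings = {
    ("Ryzen 5 3600", "RTX 4070 Super"): (60.0, 90.0),
    ("Ryzen 7 7800X3D", "RTX 4070 Super"): (88.0, 90.0),
}
for (cpu, gpu), (c, g) in pairings.items():
    pct = bottleneck_pct(c, g)
    # "A little either way isn't a big deal" -- only flag big gaps.
    note = "fine" if pct < 10 else "worth noting"
    print(f"{cpu} + {gpu}: {pct:.0f}% imbalance ({note})")
```

The 10% "fine" threshold is my own placeholder for the "a little either way isn’t a big deal" rule of thumb; real calculators weight things like resolution and game type too.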
pretend like I said anywhere that ALL I do with it is game,
No, I said no one should buy this for gaming…
No one should buy this for gaming…
You’ve misunderstood everything I said, and now you’re taking an attitude because you don’t know what’s happening. Which honestly is a very normal reaction, it’s just the threshold that’s surprising.
Me explaining anything to you is just going to be frustrating on both sides, so I’ll take the easy step to make sure it doesn’t happen again.
You said that buying a CPU like mine, with a slightly smaller cache, was a bad idea specifically for gaming.
For the same reason buying a Camaro to sit in traffic would be a bad idea…
You have one, and it’s going to be fine for gaming.
But for someone who has yet to buy one, they could pay less and get better gaming performance from an x3d.
If it still doesn’t make sense, I’d advise asking someone else
I don’t think I’ve seen the cpu ever come close to maxing out
You bought a Camaro for an hour long daily commute stuck in traffic.
It’s not going to be bad, it’s just that if all you do is game then you’re not using what you paid for. You could be getting the same performance for much less money and electricity.
Gaming performance might be a little worse than an X3D (probably not), but you’d need a crazy card to hit a CPU bottleneck, so it doesn’t matter.
Lately I’ve used it with Baldur’s Gate 3
Yeah, turn-based CRPGs are the only time I turn that stuff on.
But there seem to be two types of frame generation:
1. A real frame is generated, then it guesses the next one, then a real frame is generated.
2. Two real frames are generated, then it “fills in the gaps” with fake frames between them.
For 1 it doesn’t create lag, but it may guess wrong and have to correct on the next real frame.
For 2 the fake frames are going to be correct, but a lot of latency is added.
Neither is an issue for CRPGs, but it can be a huge issue in shooters, fighting games, and PvE games like Dark Souls.
The problem is manufacturers are now counting all the fake frames in their stats while ignoring every issue that pops up from that.
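The latency difference between the two approaches can be put in rough numbers with a toy model. The timing model and figures here are mine and purely illustrative, not measurements of any real GPU driver:

```python
# Toy model of the two frame-generation approaches described above.
# Timings are illustrative, not measurements of DLSS/FSR or any driver.

REAL_FRAME_MS = 33.3  # real frames rendering at ~30 fps


def extrapolation_lag_ms() -> float:
    """Type 1: predict the next frame from already-shown real frames.

    Each real frame is displayed as soon as it's rendered, so no extra
    input lag is added -- the cost is visible artifacts when a guess is
    wrong and the next real frame has to correct it.
    """
    return 0.0


def interpolation_lag_ms() -> float:
    """Type 2: generate in-between frames from two real frames.

    The newer real frame has to be held back while the interpolated
    frame is shown first, so everything you see lags roughly one real
    frame behind your inputs.
    """
    return REAL_FRAME_MS


print(f"extrapolation adds ~{extrapolation_lag_ms():.0f} ms of input lag")
print(f"interpolation adds ~{interpolation_lag_ms():.0f} ms of input lag")
```

At ~33 ms, that held-back frame is worth two full frames of a 60 fps fighting game, which is why it matters for the genres listed above but not for a turn-based CRPG.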
It’s a good point tho, DC is compact and designed that way with a solid lore reason.
NA should be more spread out lore wise, and it is.
The big mistake was having the navigation pathing direct you to fast travel stations to move around the same map. So if you tried to walk somewhere in NA, it would likely route you to the closest fast travel point, even if it was in the opposite direction. That keeps players from learning the map, and they get lost once they realize what’s happening and don’t want to sit through the loading screen for fast travel.
Like, I agree with the guy, but it’s not a design problem, it’s a pathing problem.
Saw 4xxx and thought this was huge.
The 4060 Ti is a year old and was $400 at release. And they used an outdated test that doesn’t benefit from 4xxx-series features like hardware ray tracing or frame generation.
Factor in that stuff, and $250 seems like a sensible price point for people who don’t want/need that stuff.
I got as far as seeing it on a store shelf
It’s on shelves?
Most paid games don’t get a physical release because it adds cost these days, so it’s surprising they have physical copies.
Is it just a free disc that tells the hardware to download it? Or some kind of collector’s edition with extra stuff?
But give it a try, quick play is quick play and you won’t get a good team comp, but I got to silver in ranked and people know what they’re doing most of the time. You won’t always get two tanks, but two heals and a tank is the worst I’ve seen.
Eh, it’s not really a “deck builder” like people think.
Like, it sounds weird because there’s literally cards and you select a deck for each player…
But just move past the cards/deck and think of it as a loadout and selecting what abilities you want each character to have. And the upgrade system really lets you fine tune what abilities you can use.
It’s a small piece of the gameplay, but the randomness it forces, instead of letting you always use OP moves, gives it a lot of replayability.
So, I don’t think the card mechanic was a problem other than turning people off before they tried it. I think it went free on PSN a while ago, and I was really hoping it would make it take off.
from what I’m TOLD was a decent game, but didn’t go anywhere:
It’s an amazing game.
The cards were a great way to handle combat; it was just a lot of new ideas, and the story parts slowed it down. If running around the abbey were optional, with everything handled on a menu splash screen instead, it would have done even better.
But my ‘okay’ computer is handling the game at max settings just fine.
Yeah, that’s the issue.
Your comp is running maxed settings at what you consider a serviceable framerate, while you admit your PC is just “okay”.
Everyone with a better comp than you is also running at max settings, seeing the same graphics you are, at probably close to the same average frames and dips. But we’re used to better graphics at higher frame rates with zero stutter/dips.
I’ve talked about this issue in the past, and it’s hard to explain. But a properly optimized game shouldn’t really run with everything maxed out on release except the very top hardware setup.
What’s currently max setting should be “medium” settings, because lots of people can handle it.
Your experience wouldn’t change at all, there’d just be the higher graphical settings available for people who could run them.
Think of it like buying the game on PS5 Pro, and then finding out that it plays exactly the same on the PS4. It’s not that you’d be mad that the PS4 people get a playable version, it’s that you don’t understand why that’s comparable to the newest gen console version. And compared to games that use your PS5 pro’s full power, it’s going to seem bad.
People (myself included) just assumed since it was UE5, they’d be at least giving us the options that UE5 was updated to support.
It seems they did it for future proofing the game, which 100% makes sense. Hopefully they add that stuff in with updates later.
Like, it doesn’t support hardware ray tracing…
And it doesn’t have non ray based lighting either. It forces everything to software ray tracing, which is a huge performance hit to people with hardware that can do ray tracing, but is completely unnoticeable to people with hardware that can’t do ray tracing. They may even see better graphics than a game that uses traditional lighting.
Like, I’m just a hobbyist nerd, I don’t really know all the ins and outs of what’s going on with Stalker 2. But it seems like this is just a game that caters to the average PC gamer, to the point that everyone with an above-average PC wasn’t even an afterthought.
I’m sure there are going to be a lot of people who know more than me looking a lot closer at why the reaction to this game has been so varied.
3090 and 5800X3D.
Yeah. I’m 4070 super and 7800x3d
Like I said, I went in expecting it to look like Senua’s 2 on boot. And there was just no reason for me to have done that.
I’ll give it a month or so and then mess with settings/drivers/etc and it’ll probably be fine. It’s just even when I tried turning stuff down I was having issues, but I haven’t put a lot of effort into getting it right.
Just because the engine is capable of crazy stuff, doesn’t mean every game will push it to its full potential, and that’s fine. That’s how engines last for a long time and that’s good for all of us in the long run.
That’s what I mean.
Everybody had unrealistic expectations, myself included.
My PC isn’t a slouch, but everybody who got early play has top of the line shit and there’s a large discrepancy in PC hardware these days.
Apparently it’s not shaders, but I had to check what resolution it was running at, thinking it had defaulted to 720p or something. With everything cranked at 4K and only the usual performance hogs off their highest settings, it looked bad. 1080p with everything turned down still had stuttering tho.
I didn’t put much effort in and my experience was launch day.
So people should definitely try it for themselves if they have it free from Xbox for PC…
I just expected it to be amazing on boot when I shouldn’t have.
Hugely disappointed in Stalker 2…
But after that article I’ll give it another shot sooner than I was going to. I never thought that horrible performance could have been shaders loading in the background.
If that’s what was going on, then they really need to make that more obvious, or lock people in a sort of training area until it’s done and then start the actual game.
A couple weeks and it’ll probably be a lot better.
But initial thoughts before the article, I think the mistake was watching huge budget games designed from the ground up to be a showcase for the engine, and assuming that would be what any third party studio could crank out.
UE5 has amazing potential, but it still needs good code running on good hardware to get Senua’s-level results.
That would mean a lot more if it wasn’t in the same reply chain as you clearly not knowing what you’re talking about…