r00ty • 8 • 1M

“8gb ought to be enough for anybody”

Einar • 6 • 1M

I wish.

Send one of these guys by my place. I’ll show them what 8GB cannot do…

@[email protected]
link
fedilink
English
31M

deleted by creator

@[email protected]
link
fedilink
English
2
edit-2
1M

I’ve got 16GB of VRAM and a 2K monitor, and this tracks pretty accurately. I almost never use over 8GB. The only games where I can break 10GB are ones where I can enable a setting (designed for old PCs) that loads all the textures into VRAM.

@[email protected]
link
fedilink
English
31M

Weird. You must be playing old games. Most modern games go over 8GB at 1440p no problem, and have been for at least a few years now.

@[email protected]
link
fedilink
English
01M

deleted by creator

@[email protected]
link
fedilink
English
1
edit-2
1M

KSP uses RAM, not VRAM. I play RP-1 with 8GB of VRAM no problem. 32GB of RAM isn’t enough, though.

@[email protected]
link
fedilink
English
11M

Yep, I confused the two.

@[email protected]
link
fedilink
English
21M

I don’t think I’ve ever seen a game use more RAM than ksp with mods though, holy moly.

@[email protected]
link
fedilink
English
1
edit-2
1M

Cyberpunk with 4K texture packs has entered the chat.

Edit: also the AI upscaled textures pack for starfield. Also the official 4k texture pack for Warhammer: Space Marines 2. All go over 16gb vram, even at 1440p.

@[email protected]
link
fedilink
English
31M

My Rimworld at 500+ mods can be pretty fucked

@[email protected]
link
fedilink
English
11M

I would agree, because 8GB is entry-level for desktop gaming and most people start at entry level.

mintiefresh • 21 • 1M

Lmao. AMD out here fumbling a lay up.

@[email protected]
creator
link
fedilink
English
4
edit-2
1M

I mean honestly, yeah. With a simple 4 GB chip they could have won the low end and not screwed over gamers.

They really seem to have forgotten their roots in the GPU market, which is a damn shame.

BombOmOm • 16 • edited • 1M

Seriously.

All AMD had to do here was create a 12GB and a 16GB version (instead of 8 and 16), then gesture at all the reviews calling the RTX 5060 8GB DOA because of its very limiting VRAM quantity.

8GB VRAM is not enough for most people. Even 1080p gaming is pushing the limits of an 8GB card. And this is all made worse when you consider people will have these cards for years to come.

Image (and many more) thanks to Hardware Unboxed testing

@[email protected]
link
fedilink
English
81M

Even worse when you consider the cost difference between 8GB and 16GB can’t be that high. If they ate the cost difference and marketed 16GB as the new “floor” for a quality card, then they might have eaten NVIDIA’s lunch where they can (low-end)

@[email protected]
link
fedilink
English
31M

Exactly. Even if you accept their argument that 8GB is usually enough today for 1080P (and we all know that is only true for high performance e-sports focused titles), it is not true for tomorrow. That makes buying one of those cards today a really poor investment.

@[email protected]
link
fedilink
English
191M

Then put 8GB in a 9060 non-XT and sell it for $200. You’re just wasting dies that could’ve been used to make more 16GB cards available (or at least a 12 GB version instead of 8).

@[email protected]
link
fedilink
English
-11
edit-2
1M

That wouldn’t work. AMD uses a lot of cheap, lower-speed memory chips in unison to achieve high speeds. That’s why their cards have more VRAM than Nvidia’s: not because the amount matters, but because more memory chips together can reach higher speeds.

Nvidia uses really expensive, high-speed chips, so they can use fewer memory chips to get the same memory speed.

Then AMD lied to and manipulated gamers by advertising that you need 16GB of VRAM.

Memory speed > memory amount
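For what it’s worth, the bandwidth arithmetic behind this claim can be sketched quickly. The numbers below are illustrative GDDR6-style figures, not any specific card’s spec:

```python
# Rough sketch: total GPU memory bandwidth scales with how many chips
# share the bus, since each chip contributes its own slice of bus width.
# Figures are hypothetical, for illustration only.

def bandwidth_gb_s(num_chips: int, bits_per_chip: int, gbps_per_pin: float) -> float:
    """Bandwidth (GB/s) = bus width in bits * per-pin data rate / 8 bits per byte."""
    bus_width = num_chips * bits_per_chip
    return bus_width * gbps_per_pin / 8

# Eight 32-bit chips at a modest 14 Gbps per pin...
wide_slow = bandwidth_gb_s(num_chips=8, bits_per_chip=32, gbps_per_pin=14)    # 448.0 GB/s
# ...versus four 32-bit chips at a much faster 21 Gbps per pin.
narrow_fast = bandwidth_gb_s(num_chips=4, bits_per_chip=32, gbps_per_pin=21)  # 336.0 GB/s
print(wide_slow, narrow_fast)
```

So more chips in parallel really can beat faster chips on total bandwidth; whether that matters more than capacity is the part being argued below.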

@[email protected]
link
fedilink
English
41M

Why would speed matter more than amount? If I have to swap from the slower system memory, it’s going to slow things down. Having more VRAM means I can keep more of what’s needed in fast memory.

@[email protected]
link
fedilink
English
-12
edit-2
1M

That’s not how it works at all. You still need to use system memory.

And honestly, I really don’t have time to explain all the details on how RAM and VRAM works.

You are definitely AMDs target audience.

@[email protected]
link
fedilink
English
61M

You’re wrong bro. I can’t er, don’t have time to explain but you’re just wrong. *Finishes with passive aggressive insult*

You’re fresh from reddit, ay? Ever consider going back?

@[email protected]
link
fedilink
English
-151M

removed by mod

@[email protected]
link
fedilink
English
41M

Wow. Moi, a dumb cunt? You suuuure got me! I hope you didn’t hurt yourself coming up with such an epic zinger. True big brain material there, really shows us all how much mental horsepower you’ve got under the hood.

Fucking toxic manchild Redditors. Looking forward to seeing your account inactive after a few weeks of us mocking you.

@[email protected]
link
fedilink
English
-51M

I personally think anything over 1080p is a waste of resolution, and I still use a card with 8GB of VRAM.

That being said, lots of other people want a 16GB card, so let them give you money AMD!

@[email protected]
link
fedilink
English
31M

1440p on a 27" monitor is the best resolution for work and for gaming.

@[email protected]
link
fedilink
English
31M

anything over 1080p is a waste of resolution

For games, maybe.

But I also use my PC for work (programming). I can’t afford two, and don’t really need them.

At home I’ve got a WQHD 1440p monitor, which leaves plenty of space for code while having the solution explorer, watch window, and whatnot still open.

At work we’re just given cheap refurbished 1080p crap, which is downright painful to work with and has often made me consider buying a proper monitor and bringing it to work, just to make those ~8h/day somewhat less unbearable.

So I can’t go back to 1080p, and have to run my games at 1440p (and upscaling looks like shit, so no).

My gaming rig is also my media center hooked up to a 4k television. I sit around 7 feet away from it. Anything less than 1440p looks grainy and blocky on my display.

I can’t game at 4k because of hardware limitations (a 3070 just can’t push it at good framerates) but I wouldn’t say it’s a waste to go above 1080p, use case is an important factor.

My TV has this stupid bullshit where it’s only 30hz at 1440p but is 60hz at literally every other resolution (including 4K). 😬

@[email protected]
link
fedilink
English
-1
edit-2
1M

It looks grainy because it’s a damn TV and not a monitor. You’re not going to be able to tell the difference AT THE DISTANCE that you’re supposed to be using them at. Larger monitors are meant to be used from a farther distance away. TVs are meant to be used from across the room.

You’re that guy with his retina plastered on the glass of his smartphone going “I CAN SEE THE PIXELS!”

@[email protected]
link
fedilink
English
21M

Pixel density is pixel density. Doesn’t matter if it’s a tv or a monitor.

Sure monitors typically have less input lag and there are reasons one might choose a monitor over a tv, but the reverse is also true. I chose a 55" tv for my sim racing setup that sits maybe a meter from my face and there’s no problem with that setup

@[email protected]
link
fedilink
English
01M

TV panels have lower PPI than monitors.

@[email protected]
link
fedilink
English
11M

Not sure what you think PPI means or how it’s calculated, but it has nothing to do with being a tv or a monitor. It’s a relationship between the number of pixels and physical size.

A 34" 1440p monitor will have a lower PPI than a 4k TV at the same size
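That PPI claim is easy to check from the definition (the 34" 4K panel here is hypothetical, purely for comparison):

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size; panel type is irrelevant."""
    return math.hypot(h_px, v_px) / diagonal_in

# 34" ultrawide 1440p vs a hypothetical 34" 4K panel of the same size
print(round(ppi(3440, 1440, 34), 1))  # ~109.7
print(round(ppi(3840, 2160, 34), 1))  # ~129.6
# 55" 4K TV, like the sim-racing setup mentioned above
print(round(ppi(3840, 2160, 55), 1))  # ~80.1
```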

Is there a reason you were so hostile with your response?

Second, according to this site which I referenced at the time of purchase for my TV, I’m at the appropriate distance for my screen size of 55 inches. The image is grainy at 1080p because a 4K screen has WAY more pixels to stretch the image over, so at the recommended distance for a 4K screen you end up with a blocky image with chunky pixels. It’s fine, it’s not like it’s unplayable, but why would I do that when I can get just as good an experience (a 30Hz display can only be pushed so hard) at 2K without overwhelming my hardware, and a better image as well?

I’m not a hardcore gamer, I’m not trying to get 9000+ fps. I mostly play Tetris and my PS1 on a CRT. I want my games to look the way they’re intended to; they’re art projects and I like to respect them as such. Ergo, I play them at the highest resolution my hardware can support.
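The viewing-distance argument can also be put in rough numbers. This is a back-of-the-envelope check assuming a 16:9 panel and the common ~60 pixels-per-degree acuity rule of thumb, with 7 ft taken as 84 inches:

```python
import math

def pixels_per_degree(h_px: int, diagonal_in: float, distance_in: float,
                      aspect: float = 16 / 9) -> float:
    """Angular pixel density at the viewer's eye for a flat 16:9 panel."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # physical panel width
    pixel_pitch = width_in / h_px                            # width of one pixel
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
    return 1 / deg_per_pixel

# 55" panel viewed from ~7 ft (84")
print(round(pixels_per_degree(3840, 55, 84)))  # 4K: comfortably above 60 ppd
print(round(pixels_per_degree(1920, 55, 84)))  # 1080p: right around the 60 ppd threshold
```

Which is consistent with 1080p content looking noticeably grainier than 4K on that setup, even at the "correct" distance.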

@[email protected]
link
fedilink
English
2
edit-2
1M

I personally think anything over 1080p is a waste of resolution

But but Nvidia said at the RTX 3000 announcement that we can now have 8K gaming

kbal • 1 • 1M

If he’d chosen his words more carefully and said “many” rather than “most” nobody would have a reason to disagree.

@[email protected]
link
fedilink
English
71M

Tell that to my triple 1440p screen flight simulator!

@[email protected]
link
fedilink
English
71M

Have you tried buying three graphics cards?

@[email protected]
link
fedilink
English
41M

This video I just watched the other day says otherwise (with clear evidence).

https://youtu.be/C0_4aCiORzE

@[email protected]
link
fedilink
English
51M

He is only testing AAA games at top settings, and that’s the point AMD is “making”. Most PC gamers are out there playing esports titles at the lowest possible settings at 1080p to get the max FPS possible. They’re not wrong, but you could still say it’s ridiculous to buy a brand-new modern card only expecting to run esports titles. Most people I know who buy modern GPUs intend to play the new hot games.

@[email protected]
link
fedilink
English
111M

I just ditched my 8GB card because it wasn’t doing the trick well enough at 1080p, and especially not at 1440p.

So if I get this straight, AMD agrees that they need to optimize games better.

I hate upscaling and frame gen with a passion; it never feels right and often looks messy too.

The First Descendant became a 480p mess when there were a bunch of enemies, even though I have a 24GB card and a pretty decent PC to accompany that.

I’m now back to heavily modded Skyrim, and damn do I love the lack of upscaling and frame gen. The Oblivion stutters were a nightmare and made me ditch the game within 10 hours.

@[email protected]
link
fedilink
English
1
edit-2
1M

FSR4 appears to solve a lot of problems with both upscaling and frame gen – not just in FSR, but generally. It appears they’ve fixed disocclusion trails, which is a problem even DLSS suffers from.

fox2263 • 18 • 1M

Do you just not want more money?

Nvidia have dropped the ball epically and you have a golden opportunity to regain some GPU share here.

@[email protected]
link
fedilink
English
71M

Oh, so it’s not that many players are FORCED to play at 1080p because AMD’s and Novideo’s “affordable” garbage can’t cope with anything more to make a game seem smooth? Or better yet, the game detected we’re running on a calculator here, so it took pity on us and set the graphics bar low.

@[email protected]
link
fedilink
English
21M

Hey, give a little credit to our ~~public schools~~ (poorly-optimized eye-candy) new games! (where 10-20GiB is now considered small)

Tell that to game developers. Specifically the ones that routinely don’t optimize shit.

@[email protected]
link
fedilink
English
121M

Or to gamers who insist on playing these unoptimized games at max settings. $80 for the game, and then spend $1000 buying a gpu that can run the game.
