Poll shows 84% of PC users unwilling to pay extra for AI-enhanced hardware - VideoCardz.com
Enthusiasts say no to paying extra for AI features. A TechPowerUp poll strongly suggests that there is no interest in AI among advanced PC users. Source: Wccftech. A recent poll on TechPowerUp revealed that an overwhelming majority of PC users are not interested in paying extra for hardware with AI capabilities. According to the survey, 84% […]
@[email protected]

I’m generally opposed to anything that involves buying new hardware. This isn’t the 1980s. Computers are powerful as fuck. Stop making software that barely runs on them. If they can’t make AI more efficient, then fuck it. If they can’t make game graphics good without a minimum of a $1000 GPU that produces as much heat as a space heater, maybe we need to go back to 2000s-era 3D. There is absolutely no point in making graphics more photorealistic than maybe Skyrim. The route they’re going is not sustainable.

@[email protected]

The point of software like DLSS is to run stuff better on computers with worse specs than what you’d normally need to run a game at that quality. There’s plenty of AI tech that can actually improve experiences, and saying that Skyrim graphics are the absolute max we as humanity “need” or “should want” is a weird take ¯\_(ツ)_/¯
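
For anyone unfamiliar with how these upscalers buy that performance: the game renders at a lower internal resolution and a reconstruction step fills in the display resolution. DLSS itself is proprietary and uses a trained network plus motion vectors, so the snippet below is only a rough sketch of the ‘render low, upscale to the target’ idea, with a plain Lanczos filter standing in for the AI part.

```python
# Rough sketch of the "render low, reconstruct high" idea behind DLSS/FSR/XeSS.
# The real techniques use trained models and motion vectors; a plain Lanczos
# resize stands in here just to show where the performance win comes from:
# the expensive per-pixel shading happens at the lower internal resolution.
import numpy as np
from PIL import Image

TARGET = (3840, 2160)    # display resolution
INTERNAL = (1920, 1080)  # what the GPU actually shades (1/4 the pixels)

def fake_render(size):
    """Stand-in for an expensive render pass; cost scales with pixel count."""
    w, h = size
    x, y = np.meshgrid(np.linspace(0, 1, w), np.linspace(0, 1, h))
    frame = (np.sin(20 * x) * np.cos(20 * y) * 0.5 + 0.5) * 255
    return Image.fromarray(frame.astype(np.uint8))

low_res = fake_render(INTERNAL)                   # ~1/4 of the native 4K shading work
upscaled = low_res.resize(TARGET, Image.LANCZOS)  # cheap reconstruction step
print(upscaled.size)                              # (3840, 2160)
```

The quality argument then comes down to how well the reconstruction step hides the fact that only a quarter of the pixels were actually shaded.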

We should have stopped with Mario 64. Everything else has been an abomination.

warm

The quality of games has dropped a lot, they make them fast, and as long as it can just about reach 60 fps at 720p they release it. Hardware is insane these days, and games mostly look the same as they did 10 years ago (Skyrim never looked amazing for 2011; BF3, Crysis 2, Forza, Arkham City etc. came out then too), yet their performance has dropped significantly.

I don’t want DLSS and I refuse to buy a game that relies on upscaling to have any meaningful performance. Everything should be well over 120 fps at this point. But people accept the shit and buy the games anyway, so nothing is going to change.

The point is, we would rather have games looking like Skyrim with great performance vs ‘4K RTX real-time ray tracing ultra AI realistic graphics wow!’ at 60 fps.

NekuSoul

The quality of games has dropped a lot, they make them fast

Isn’t the public opinion that games take way too long to make nowadays? They certainly don’t make them fast anymore.

As for the rest, I also can’t really agree. IMO, graphics have taken a huge jump in recent years, even outside of RT. Lighting, texture quality, shaders, as well as object density and variety have all been getting a noticeable bump. Other than the occasional dud and the awful shader compilation stutter that has plagued many PC games over the last few years (but is getting more awareness now), I’d argue that performance is pretty good for most games right now.

That’s why I see techniques like DLSS/FSR/XeSS/TSR not as a crutch, but just as one of the dozens of rendering shortcuts game engines have accumulated over the years. That said, it’s not often we see a new technique deliver such a big performance boost while having almost no visual impact.

Also, who decided that ‘we’ would rather have games looking like Skyrim? While I do like high FPS very much, I also like shiny graphics with all the bells and whistles. A game like ‘The Talos Principle 2’, for example, does hammer the GPU quite a bit on its highest settings, but it certainly delivers in the graphics department. So much so that I’ve probably spent as much time admiring the highly detailed environments as I did actually solving the puzzles.

warm

Isn’t the public opinion that games take way too long to make nowadays? They certainly don’t make them fast anymore.

I think the problem here is that they announce them way too early, so people are waiting like 2-3 years for them. It’s better if they are developed behind the scenes and ‘surprise’ announced a few months prior to launch.

Graphics have advanced, of course, but it’s become a case of diminishing returns, and now a lot of games resort to spamming post-processing effects and cramming in as much foliage and fog as possible to try and make the game look better. I always bring up Destiny 2 in this conversation, because the game looks great, runs great and the graphical fidelity is amazing - no blur but no rough edges. Compare that to almost any UE game with terrible TAA, where if you disable it everything is jagged and aliased.

DLSS etc. are defo a crutch and they are designed as one (originally for real-time ray tracing), hence the better versions requiring new hardware. Games shouldn’t be relying on them and their trade-offs are not worth it if you have average modern hardware, where games should just run well natively.

It’s not so much that we want specifically Skyrim (maybe that one guy does); it’s just an extreme example, I guess, to put the point across. It’s obviously all subjective, and making things shiny attracts people’s eyes during marketing.

NekuSoul

I see. That I can mostly agree with. I really don’t like the temporal artifacts that come with TAA either, though they’re not a deal-breaker for me if the game hides them well.

A few tidbits I’d like to note though:

they announce them way too early, so people are waiting like 2-3 years for it.

Agree. It’s kind of insane how early some games are announced. That said, 2-3 years back then was the time it took for a game to get a sequel. Nowadays you often have to wait an entire console cycle for a sequel to come out, instead of getting a trilogy of games during one.

Games shouldn’t be relying on them and their trade-offs are not worth it

Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields a better image quality than even native resolution with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, even if just to save a few watts.

warm

Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields a better image quality than even native resolution with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, even if just to save a few watts.

The trade-offs being the artifacts - while not that noticeable to most, I did try it and anything in fast motion does suffer. Another being the hardware requirement. I don’t mind it existing, I just don’t think mid-to-high-end setups should ever have to enable it for a good experience (well, what I personally consider a good experience :D).

As with any proprietary hardware on a GPU, it all comes down to third-party software support, and classically, if the market isn’t there then it’s not supported.

Assuming there’s no catch-on after 3-4 cycles, I’d say the tech is either not mature enough, too expensive for too little result, or (as you said) there’s generally no interest in it.

Maybe it needs a bit of maturing and a re-introduction at a later point.

Not even on my phone

JokeDeity

The other 16% were bots answering.

@[email protected]

deleted by creator

capital

My old ass GTX 1060 runs some of the open source language models. I imagine the more recent cards would handle them easily.

What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?

Run it faster.
A CPU can also compute graphics, but you wait significantly longer than with hardware-accelerated graphics.
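
As a rough illustration of the ‘run it faster’ point, here’s a minimal sketch using llama-cpp-python, which can offload model layers to the GPU. The model path is a placeholder, and the actual speedup depends entirely on the card and the model.

```python
# Minimal sketch: the same local model run with full GPU offload vs. CPU only.
# Requires llama-cpp-python and a GGUF model file; the path below is a placeholder.
import time
from llama_cpp import Llama

MODEL = "path/to/some-7b-model.gguf"  # placeholder, point this at a real file

def time_generation(n_gpu_layers):
    # n_gpu_layers=-1 offloads every layer to the GPU, 0 keeps it all on the CPU.
    llm = Llama(model_path=MODEL, n_gpu_layers=n_gpu_layers, verbose=False)
    start = time.perf_counter()
    llm("Explain DLSS in one sentence.", max_tokens=64)
    return time.perf_counter() - start

print("GPU offload:", time_generation(-1))
print("CPU only:   ", time_generation(0))
```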

Bro, just add it to the pile of rubbish over there next to the 3D movies and curved TVs

BlackLaZoR

Unless you’re doing music or graphic design there’s no use case. And if you do, you probably have a high-end GPU anyway.

DarkThoughts

I could see a use for local text gen, but that apparently demands quite a bit more than what desktop PCs can offer if you want actually good results and speed. Generally though, I’d rather have separate expansion cards for this. Making it part of other processors is just going to increase their price, even for those who have no use for it.

BlackLaZoR

There are local models for text gen - not as good as ChatGPT, but at the same time they’re uncensored - so it may or may not be useful.

DarkThoughts

Yes, I know - that’s my point. But you need the necessary hardware to run those models in a performant way. Waiting a minute to produce some vaguely relevant gibberish is not going to be of much use. You could also use generative text for other applications, such as video game NPCs; especially all those otherwise useless drones you see in a lot of open-world titles could gain a lot of depth.
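
As a sketch of that latency constraint (using a hypothetical generate_line() stand-in rather than any real model API): an NPC reply is only worth generating if it arrives within a tight budget, otherwise the game has to fall back to canned lines.

```python
# Hypothetical sketch: generated NPC dialogue is only usable if it beats a
# latency budget; otherwise fall back to a canned line. generate_line() is a
# stand-in simulated with a sleep, not a real model call.
import concurrent.futures
import random
import time

CANNED_LINES = ["Nice weather today.", "Stay out of trouble.", "Move along."]
POOL = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def generate_line(context: str) -> str:
    # Placeholder for a local text-gen call; whether this takes 50 ms or 50 s
    # depends on the hardware, which is the whole point of the comment above.
    time.sleep(random.uniform(0.01, 2.0))
    return f"[generated reply about {context}]"

def npc_reply(context: str, budget_s: float = 0.25) -> str:
    future = POOL.submit(generate_line, context)
    try:
        return future.result(timeout=budget_s)
    except concurrent.futures.TimeoutError:
        return random.choice(CANNED_LINES)  # too slow, use a canned line

print(npc_reply("the player stealing a sweetroll"))
```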

Nora

I was recently looking for a new laptop and I actively avoided laptops with AI features.

cheee

Look, me too, but the average punter on the street just looks at new AI features and goes, OK, sure, give it to me. Tell them about the dodgy shit that goes with AI and you’ll probably get a shrug at most.

The dedicated TPM chip is already being used for side-channel attacks. A new processor running arbitrary code would be a black hat’s wet dream.

It’s not a full CPU. It’s more limited than a GPU.

That’s why I wrote “processor” and not CPU.

@[email protected]

A processor that isn’t Turing complete isn’t a security problem like the TPM you referenced. A TPM includes a CPU. If a processor is Turing complete it’s called a CPU.

Is it Turing complete? I don’t know. I haven’t seen block diagrams that show the computational units have their own CPU.

CPUs also have coprocessors to speed up floating-point operations. That doesn’t necessarily make them a security problem.

Do you have an article on that handy? I like reading about side channel and timing attacks.

TPM-FAIL from 2019. It affects Intel fTPM and some dedicated TPM chips: link

The latest (at the moment) UEFI vulnerability, UEFIcanhazbufferoverflow, is also related to, but not directly caused by, the TPM on Intel systems: link
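
For anyone curious what a timing side channel actually looks like: the sketch below is not the TPM-FAIL attack itself (that one recovered keys from signing-time variations), just the textbook version of the problem, where a comparison that exits early on the first mismatch takes measurably different time depending on how much of the secret a guess gets right.

```python
# Textbook timing side channel: a naive comparison that bails on the first
# mismatching byte runs longer the more leading bytes the guess gets right.
# This is only an illustration of the concept, not the TPM-FAIL attack.
import hmac
import time

SECRET = b"correct horse battery staple"

def naive_compare(guess, secret=SECRET):
    if len(guess) != len(secret):
        return False
    for a, b in zip(guess, secret):
        if a != b:          # early exit leaks how many bytes matched
            return False
    return True

def timed(guess, rounds=200_000):
    start = time.perf_counter()
    for _ in range(rounds):
        naive_compare(guess)
    return time.perf_counter() - start

all_wrong   = b"x" * len(SECRET)
good_prefix = b"correct horse" + b"x" * (len(SECRET) - 13)

print("all wrong:     ", timed(all_wrong))
print("prefix correct:", timed(good_prefix))   # consistently slower
# A constant-time comparison removes the signal entirely:
print("constant time: ", hmac.compare_digest(good_prefix, SECRET))
```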

That’s insane. How can they be doing security hardware and leave a timing attack in there?

Thank you for those links, really interesting stuff.

It will be.

IoT devices are already getting owned at staggering rates. Adding a learning model that currently cannot be secured is absolutely going to happen, and it is going to cause a whole new batch of breaches.

The “s” in IoT stands for “security”

Only 7% say they would pay more, which to my mind is the percentage of respondents who have no idea what “AI” in its current bullshit context even is

flicker

I figure they’re those “early adopters” who buy the New Thing! as soon as it comes out, whether they need it or not, whether it’s garbage or not, because they want to be seen as on the cutting edge of technology.

Or they know a guy named Al and got confused. ;)

A man walks down the street / He says why am I short of attention / Got a short little span of attention / And woe my nights are so long

Maybe I’m in the minority here, but I’d gladly pay more for Weird Al enhanced hardware.

lost_faith

Hardware breaks into a parody of whatever you are doing

Me - laughing and vibing

The other 16% do not know what AI is or try to sell it. A combination of both is possible. And likely.

And the other 16% would also pay you $230 to hit them in the face with a shovel

@[email protected]

deleted by creator

You like having to pay more for AI?

I feel like the sarcasm was pretty obvious in that comment, but maybe I’m missing something.

Tbh this is probably for things like DLSS, captions, etc. Not necessarily for chatbots or generative art.
