For PC gaming news and discussion.
PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let’s Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates.
(Submissions should be from the original source if possible, unless it is paywalled or non-English.
If the title is clickbait or lacks context, you may lightly edit it.)
I’m not talking about sentience per se, but about how any “AI” would think: lookups (LLMs) vs. synthesized, on-the-fly thinking (mimicking the human brain’s processing).
This comment is licensed under CC BY-NC-SA 4.0
Hrmm. I guess I don’t believe you can make a game that really connects on an empathic, emotional level without having had those experiences as the author. Anything short of that and you’re just copying the motions of sentiment, which brings us back to the same plagiarism problem with LLMs and other “AI” models. It’s fine for CoD 57, but for it to have new ideas we need to give it one, because it is definitionally not creative. Even hallucinations are just bad calculations on the source. Though they could inspire someone to have a new idea, which I might argue is their only artistic purpose beyond simple tooling.
I thoroughly believe machines should be doing labor to improve the human condition so we can make art. Even making a “fun” game requires an understanding of experience. A simulacrum is the opposite: soulless at best (in the artistic sense).
If you did consider a machine sentient, my ethics would then develop an imperative to treat it as such. I’ll take a sledgehammer to a printer, but I’m going to show an animal care and respect.