In economics, the Jevons paradox occurs when technological progress increases the efficiency with which a resource is used (reducing the amount necessary for any one use), but the falling cost of use induces increases in demand enough that resource use is increased, rather than reduced.
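The paradox is easy to see with a toy calculation. The numbers below are purely illustrative (not from any real dataset): efficiency doubles, but the cheaper resource induces demand to triple, so total use rises.

```python
def resource_use(demand_units, resource_per_unit):
    """Total resource consumed to satisfy a given level of demand."""
    return demand_units * resource_per_unit

# Before the efficiency gain: 100 units of demand, 2.0 resource each.
before = resource_use(100, 2.0)

# Efficiency doubles (1.0 resource per unit), but the falling cost of use
# triples demand -- the Jevons paradox condition.
after = resource_use(300, 1.0)

# Total resource use went UP (300 > 200) despite each use needing less.
assert after > before
```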
Video game news oriented community. No, NanoUFO is not a bot :)
Posts.
News oriented content (general reviews, previews or retrospectives allowed).
Broad discussion posts (preferably not only about a specific game).
No humor/memes etc…
No affiliate links
No advertising.
No clickbait, editorialized, sensational titles. State the game in question in the title. No all caps.
No self promotion.
No duplicate posts; the newer post will be deleted unless it has more discussion than the older one.
No politics.
Comments.
No personal attacks.
Obey instance rules.
No low effort comments (one or two words, emoji, etc.)
Please use spoiler tags for spoilers.
My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.
For starters:
“Fewer.”
Too many AI language models are just word salad. They spit out very long responses that add nothing of substance. Sometimes it’s like a high schooler desperately padding an essay to reach the paragraph requirement.
My favorite example so far was when someone asked a car dealership’s chatbot, which was meant to talk to customers about their cars, to write a Python script, and it complied.
I can believe it.
I don’t believe but okay.
Why does the NPC audio not match the text in the ‘tutorial’ video?
I mean, that’d definitely end basic one-line dialogue when used, but I feel like it’ll introduce a different, probably worse, issue.
This technology is so goofy that the simple solution might be to prompt with a story that ends right before the NPC says something. Doesn’t necessarily have to be a different story per-character, or even change much beyond appending that character’s dialog and yours. If you feed an LLM most of a chapter from The Hobbit and then end the prompt at “Then Thorin said,” you’re very likely to get some sentences that are in-theme and even in-character.
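The idea above can be sketched in a few lines. This is a hypothetical helper, not any game’s actual implementation: the story excerpt does the world-building, the player’s line is appended as dialogue, and the prompt stops at the speech cue so the model’s most likely continuation is the NPC’s reply.

```python
def build_npc_prompt(story_excerpt: str, npc_name: str, player_line: str) -> str:
    """Assemble a completion prompt that ends right before the NPC speaks.

    The same excerpt can be reused for every character; only the appended
    dialogue and the final speech cue change.
    """
    return (
        f"{story_excerpt.strip()}\n\n"
        f'"{player_line.strip()}" you said.\n\n'
        f'Then {npc_name} said, "'
    )

# Example usage (the excerpt here is a stand-in for a real chapter):
prompt = build_npc_prompt(
    "Thorin and the company rested in the great hall after their long march.",
    "Thorin",
    "Where does this road lead?",
)
# The prompt now ends mid-sentence at the open quote, so a plain
# text-completion call would fill in Thorin's reply.
```

Handing `prompt` to any text-completion endpoint would then yield the NPC’s line as the continuation; trimming the output at the closing quote gives you the dialogue.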
Telling the machine what to do, as abstract directions, suffers from very silly errors. Like how “draw a room with absolutely no elephants” will predictably draw a room with a high positive number of elephants. The great thing about this technology is how it works kinda like how human intelligence works. Too bad we have no goddamn idea how human intelligence works.