This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in a DM before posting product reviews or ads; otherwise, such posts are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not make low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived versions as sources, NOT screenshots. This helps blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
GPT-5 without “thinking” mode got the answer wrong.
GPT-5 with thinking answered:
Here are the 21 US states with the letter “R” in their name:
Arizona, Arkansas, California, Colorado, Delaware, Florida, Georgia, Maryland, Missouri, Nebraska, New Hampshire, New Jersey, New York, North Carolina, North Dakota, Oregon, Rhode Island, South Carolina, Vermont, Virginia, West Virginia.
It wrote a script that verified the list while doing the “thinking” (feeding the hallucinations back into the LLM).
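The actual script isn’t shown, but a minimal sketch of that kind of check looks like this (state list hard-coded; illustration only, not the model’s code):

```python
# Count which of the 50 US state names contain the letter "r" (case-insensitive).
STATES = [
    "Alabama", "Alaska", "Arizona", "Arkansas", "California", "Colorado",
    "Connecticut", "Delaware", "Florida", "Georgia", "Hawaii", "Idaho",
    "Illinois", "Indiana", "Iowa", "Kansas", "Kentucky", "Louisiana",
    "Maine", "Maryland", "Massachusetts", "Michigan", "Minnesota",
    "Mississippi", "Missouri", "Montana", "Nebraska", "Nevada",
    "New Hampshire", "New Jersey", "New Mexico", "New York",
    "North Carolina", "North Dakota", "Ohio", "Oklahoma", "Oregon",
    "Pennsylvania", "Rhode Island", "South Carolina", "South Dakota",
    "Tennessee", "Texas", "Utah", "Vermont", "Virginia", "Washington",
    "West Virginia", "Wisconsin", "Wyoming",
]

with_r = [name for name in STATES if "r" in name.lower()]
print(len(with_r))
print(", ".join(with_r))
```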
Well, not quite, because they don’t have criteria for ‘right’.
They do basically say ‘generate 10x more content than usual, then dispose of 90% of it’, and that surprisingly seems to largely improve results, but at no point is it ‘grading’ the result.
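A toy sketch of that oversample-and-discard idea (illustrative only; the keep/discard heuristic here is a made-up preference score, not any check for correctness):

```python
import math
import random

def generate_candidate(rng):
    """Stand-in for sampling one chain-of-thought + answer from a model."""
    answer = rng.choice(["A", "B", "C", "D"])
    fake_logprob = -rng.random()  # pretend average per-token log-probability
    return answer, fake_logprob

def best_of_n(n=10, keep_ratio=0.1, seed=0):
    rng = random.Random(seed)
    candidates = [generate_candidate(rng) for _ in range(n)]
    # Rank by the heuristic score and keep only the top slice;
    # nothing here checks whether an answer is actually right.
    candidates.sort(key=lambda c: c[1], reverse=True)
    return candidates[: max(1, math.floor(n * keep_ratio))]

print(best_of_n())
```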
Some people have bothered to share ‘chain of thought’ examples, and even when the output is largely ‘correct’, you may see a middle step get utterly flubbed in a way that should have fouled the whole thing; yet the error is oddly isolated and doesn’t carry forward into the subsequent content, as it would in actual ‘reasoning’.