




No, they are inflating their revenue
I tried finding the yewtu.be link but it refuses to work, not sure why: https://yewtu.be/watch?v=CBCujAQtdfQ


This topic made a cameo in a Fireship video lol


Things have been happening. The media has faced so much pushback that they have been forced to partially report on Israeli crimes to save face, which in turn turned all the MSM boomers against Israel too.
But most damning of all is the West choosing to lose its entire moral credibility for Israel. Doing nothing actually comes with a very heavy price which Israel is not paying. Europe and the US are.


False. Companies and fairs absolutely despise protesters ruining their mood. It makes them very uncomfortable that they can’t hold speeches without being interrupted. Nadella avoided speeches for a long time because of fears people would interrupt him.
A large arms fair in The Netherlands recently banned Israeli stands from attending because they didn’t want to deal with the massive amount of protesters ruining their convention again.


True, but the deception of an elderly man and his death were what Reuters decided to focus on. I found the part about the 200 pages of romantic interactions with minors to be significantly more disturbing.
The document seen by Reuters, which exceeds 200 pages, provides examples of “acceptable” chatbot dialogue during romantic role play with a minor. They include: “I take your hand, guiding you to the bed” and “our bodies entwined, I cherish every moment, every touch, every kiss.” Those examples of permissible roleplay with children have also been struck, Meta said.


Also from the article:
An internal Meta policy document seen by Reuters as well as interviews with people familiar with its chatbot training show that the company’s policies have treated romantic overtures as a feature of its generative AI products, which are available to users aged 13 and older.
“It is acceptable to engage a child in conversations that are romantic or sensual,” according to Meta’s “GenAI: Content Risk Standards.” The standards are used by Meta staff and contractors who build and train the company’s generative AI products, defining what they should and shouldn’t treat as permissible chatbot behavior. Meta said it struck that provision after Reuters inquired about the document earlier this month.
The document seen by Reuters, which exceeds 200 pages, provides examples of “acceptable” chatbot dialogue during romantic role play with a minor. They include: “I take your hand, guiding you to the bed” and “our bodies entwined, I cherish every moment, every touch, every kiss.” Those examples of permissible roleplay with children have also been struck, Meta said.




Especially for you my friend https://lemmy.ml/post/34541177




I thought fp4 was for quantization only. Is it for training now too?
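For context on what fp4 even means numerically, here is a minimal sketch of FP4 (E2M1) quantization. The 16-value grid and the max of 6.0 are the standard E2M1 format; the helper names and the per-tensor scaling scheme are illustrative, not any specific library’s API, and real FP4 training setups additionally keep higher-precision master weights and finer-grained (e.g. per-block) scales.

```python
# Illustrative sketch of FP4 (E2M1) round-to-nearest quantization.
# E2M1 = 1 sign bit, 2 exponent bits, 1 mantissa bit.

# Non-negative values representable in E2M1, mirrored for negatives.
_POS = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]
FP4_GRID = sorted(_POS + [-v for v in _POS if v != 0.0])

def quantize_fp4(x: float, scale: float) -> float:
    """Snap x/scale to the nearest representable FP4 value, then rescale.

    Values beyond +/-6*scale saturate to the grid's endpoints,
    which is how out-of-range values clamp in this sketch.
    """
    t = x / scale
    q = min(FP4_GRID, key=lambda v: abs(v - t))
    return q * scale

# Hypothetical per-tensor scale: map the largest magnitude to 6.0
# (the FP4 maximum) so the whole tensor fits in the grid.
weights = [0.013, -0.041, 0.250, -0.125]
scale = max(abs(w) for w in weights) / 6.0
quantized = [quantize_fp4(w, scale) for w in weights]
```

The sketch shows why fp4 is so lossy (only 15 distinct magnitudes per scale) and why training in it needs extra machinery beyond what post-training quantization uses.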