It’s not enshittification, because it doesn’t match the second part of your own definition. Needing to change your offerings because your internal costs increase is normal business. Enshittification is when companies offer stuff to entice users and then, realizing they have nothing else to offer businesses, strip those features out so they can sell them to businesses or cram in more ads.
Just because someone makes a claim in a lawsuit against a company doesn’t mean it’s true. You still gotta go through the whole court process and prove it.
It says Valve “forces” game publishers to sign up to so-called price parity obligations, preventing titles being sold at cheaper prices on rival platforms
I’ve never seen any publisher claim this, but that doesn’t mean it isn’t true. It sure doesn’t sound like it has anything to do with being a monopoly, though. Epic, GOG, Ubisoft, etc. could all do the exact same thing.
Anyway, thanks for the link. I wasn’t the one who downvoted your last comment. You did what I asked.
These companies still don’t understand why Nintendo has such a stranglehold on the handheld market. No one playing on a handheld cares about performance benchmarks. They care if the game works and is fun. And they care about a good user experience.
Valve nailed it with Proton, and that’s why Linux gaming is growing so quickly. If you force Windows, it doesn’t matter if you get a hundred more frames a second. You’re gonna have a shitty user experience.
I informed my SecOps team and they reached out to Slack. Slack posted an update:
We’ve released the following response on X/Twitter/LinkedIn:
To clarify, Slack has platform-level machine-learning models for things like channel and emoji recommendations and search results. And yes, customers can exclude their data from helping train those (non-generative) ML models. Customer data belongs to the customer. We do not build or train these models in such a way that they could learn, memorize, or be able to reproduce some part of customer data. Our privacy principles applicable to search, learning, and AI are available here: https://slack.com/trust/data-management/privacy-principles
Slack AI – which is our generative AI experience natively built in Slack – is a separately purchased add-on that uses Large Language Models (LLMs) but does not train those LLMs on customer data. Because Slack AI hosts the models on its own infrastructure, your data remains in your control and exclusively for your organization’s use. It never leaves Slack’s trust boundary and no third parties, including the model vendor, will have access to it. You can read more about how we’ve built Slack AI to be secure and private here: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/
Never even heard of this game