The big reason I switched back to Nvidia was because I wanted to play with some local AI models, and doing that with AMD cards was quite difficult at the time (I think it’s improved a little, but still isn’t straightforward).
I’ve tried to run a few minimal 8B and even 1.3B models on AMD cards, and they’re such trash that my CPU can run them faster. Why was xformers written in Python yet made compatible only with Nvidia drivers?