

I would pick up an inference GPU to run models locally. I can see a benefit in that, especially if they’re on the cheap.



Read his multipart series on arguing with AI boosters. He covers silly arguments like this one.
Also, to paraphrase Cory Doctorow: you’re not going to keep breeding these mares to run faster and then one day have them birth a locomotive…
The bar is full of “nutrittton”; how could it be AI? 😂


Thank God I bought the GOTY edition on sale years ago, and it looks great with a few tweaks and mods…

Funny thing is, these don’t meet ANSI Z89.1 and therefore can’t be used in most workplaces.
The number of people in this thread saying “no, Facebook isn’t bad” screams that shills abound.