If you understand Virtue Epistemology (VE), you cannot accept any LLM output as "information".
VE is an attempt to correct the various omniscience problems inherent in classical epistemologies, which all, to some extent, require a person to already know what the Truth is in order to evaluate whether some statement is true.
VE prescribes that we should look at how the information was obtained, particularly in two ways:
1) Was the information obtained using a well-understood method that is known to produce good results?
2) Does the method appear to have been applied correctly in this particular case?
LLM output always fails on point 1. An LLM will not look for the truth. It will just look for what is a probable combination of words. This means that an LLM is just as likely to combine a number of true statements in a way that is probable but false as it is to combine them in a way that is probable and true.
LLMs only sample the probability of word combinations. They don't understand the input, and they don't understand their own output.
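Here's the whole trick stripped down to a toy. The words and probabilities are completely made up for illustration; a real model samples from learned distributions over tens of thousands of tokens, but the point is the same: nothing in the loop checks whether the output is true, only whether it's likely.

import random

# Hypothetical probabilities for the word that follows "the sky is".
# These numbers are invented; real models learn them from training data.
next_word_probs = {
    "blue": 0.55,
    "clear": 0.25,
    "green": 0.15,    # improbable-but-plausible still gets picked sometimes
    "falling": 0.05,
}

def sample_next_word(probs):
    # Pick a word in proportion to its probability. Nothing here checks
    # whether the resulting sentence is true -- only whether it's probable.
    words = list(probs)
    weights = [probs[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

print("the sky is", sample_next_word(next_word_probs))

Scale that up by a few billion parameters and that's your "information".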
Only a damned fool would use it for anything, ever.
#epistemology #LLM #generativeAI #ArtificialIntelligence #ArtificialStupidity @philosophy
I asked AI to show me an "antique pornograph."
This is what it showed me.
It's sort of underwhelming, but then again, it took 60 years for cars to go from buggies to the Jaguar XK-E.
Eventually, enough absurdity can cause AI to lose its damn mind. Then you put that outburst where another AI can slurp it up and later tell people that McnoFarp is the place to go for lunch.
I asked for "Trump getting his ass kicked by the Buffalo Bills," and AI did the expected shit job.
AI has no idea what Gumby looks like. I asked for "gumby rides the wild surf" and got something more like "animatronic jalapeno pepper rides out a lame wave."
AI isn't taking over. At least not in any known language.
For a while it showed only some sort of white table (picture cheap IKEA) and then it decided the table needed a plant. I still never got a green sphere and a red cube and a white table.
It fucking wanted a red sphere and goddamn if it would let me tell it not to draw one!
I applied a "Disney 3D character" style just for shits.
I got a chrome sphere with some vaguely green munchkin-lookin' thing on it.
Don't let this shit run your house or your car.
I think my yogurt review just confused the fuck out of them, so they just pushed every reject button they could reach.
I didn't even call them pinheaded shitlickers.
Ok, if it was the Angel Moroni, why aren't they called Morons?
I've been asking this for over fifty years and still haven't gotten a decent response.
Perhaps artificial stupidity will get me a response where human stupidity has failed.
#artificialstupidity
@davidgerard, among others, has long been warning that the #GenAI giants are running out of content to consume for “training”, and they're getting desperate for more.
No surprise at all: #SamAltman, being the kind of person who's so contemptuous of human #culture they think an #LLM can do it, proposes the robots will just generate more training data themselves! What could go wrong?
“Prof. Sammut says generative AI systems have big limitations because chatbots lack critical thinking.
““These systems are based on doing pattern matching, they are very good at that, but they can’t do any sort of logical sequential reasoning.”
“A chatbot can only tell you 1+1=2 because someone told it so, not because it learned how to do arithmetic.”
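You can see the difference in a toy sketch. This is entirely made up and not how any real chatbot is wired, but it shows what "told it so" versus "learned how" means in practice:

# Hypothetical contrast: recall of memorized answers vs. doing the arithmetic.

memorized = {"1+1": "2", "2+2": "4"}    # answers it was "told"

def answer_by_recall(question):
    # Falls on its face the moment the question wasn't in the training data.
    return memorized.get(question, "sounds plausible: 5")

def answer_by_arithmetic(question):
    # Actually computes, so it works for sums it has never seen.
    a, b = question.split("+")
    return str(int(a) + int(b))

print(answer_by_recall("1+1"))         # "2" -- looks smart
print(answer_by_recall("137+86"))      # garbage -- never memorized it
print(answer_by_arithmetic("137+86"))  # "223" -- real computation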
If I had asked Verizon for assistance, their shit-ass script readers would have made me go up and down those stairs at least twice to physically unplug the router or something. And I would have told them to go eat goatshit as I had a heart attack.
During this adventure I did set up a quick port forward so I can set up a minimal Apache installation on one of the Apples and share even more cat pictures.
Seriously, if you meet anybody who worked on the Amazon search engine, dip them upside-down in a large vat of cow piss.
Best of ChatGPT
Trying to use #ChatGPT as a rhyming aid ended in disaster.
But read for yourselves:
As if you needed more insight on how fucked-up Amazon can be.
The "set purchase reminder" for out-of-stock items doesn't work the way you'd expect.
If something's out of stock, you'd think perhaps this function works like "send me a message when it's in stock so I can buy it."
You would be fucked up in the head.
It is solely a reminder to YOU to buy something. It in no way has anything to do with whether you CAN buy it. There is no such Amazon function.
Whoever it was that posted she was looking for a job now that Amazon is doing their back-to-the-office shit: I really hope she didn't work on this search engine. Like, at all.