AI-penned Microsoft Travel article recommends food bank as if it were a famous restaurant.
See full article...
In other news, the article was rapidly ingested by other LLM web scrapers and the results integrated into their models.
> I don't know about other forms of journalism, but generative AI is certainly putting creators of satire out of business!

idk, I've been watching a bunch of old Onion TV skits from 5-10 years ago, and there is something so damn good about the way they deliver satire.
Life is already difficult enough. Consider going into it on an empty stomach.
> AI-penned Microsoft Travel article recommends food bank as if it were a famous restaurant.

Have you tried it?
> From the Ottawa Food Bank's website: "Every day we see how hunger affects men, women and children and how it can become a barrier to success. People who come to us have families and jobs to keep, with bills to pay. Life is challenging enough. Imagine facing it on an empty stomach."
>
> This feels more like a mistake in reinterpreting their own words rather than a callous misunderstanding of their mission.

Which is the problem with using LLMs to generate these kinds of articles. They don't interpret; they just rearrange text so that the scorer is maximized, regardless of context.
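To make that point concrete, here is a minimal sketch of greedy decoding; the prompt, the candidate continuations, and the scores are all invented for illustration, not taken from any real model:

    # Hypothetical next-token scores a model might assign after a prompt
    # like "Top attractions in Ottawa include the ..." -- every name and
    # number here is made up.
    next_token_scores = {
        "National Gallery": 2.1,
        "Rideau Canal": 2.0,
        "Ottawa Food Bank": 2.3,  # co-occurs heavily with "Ottawa", so it scores high
        "Parliament Hill": 1.9,
    }

    # Greedy decoding: take the argmax of the scores. Nothing in this step
    # asks whether the winner makes sense as a tourist attraction.
    best = max(next_token_scores, key=next_token_scores.get)
    print(best)  # -> Ottawa Food Bank

Whatever happens to score highest wins, which is exactly how a food bank can end up on a list of must-see destinations.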
> This feels more like a mistake in reinterpreting their own words rather than a callous misunderstanding of their mission.

Do LLMs have the concept of a mission already? They'd know millions of phrases related to the word "mission", but do they understand what a mission means?
> I am pretty sure I encountered my first LLM-generated page the other day: sentence after sentence carrying the same single piece of information, each one slightly rearranged, the whole thing structured to visually resemble a flowing article, and every single phrase superficially related to what I had searched for, yet not at all what even a 1/8-intelligent entity would consider related to my query. So the utility of web search dies, I guess. :-/

Yeah, I once tried looking up a peripheral Letterkenny character and Google turned up an article that was grammatically sound, but drooling gibberish in every other way.
Microsoft was not immediately available for comment by press time.
> Do LLMs have the concept of a mission already? They'd know millions of phrases related to the word "mission", but do they understand what a mission means?

No.
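The distinction the thread is circling can be made concrete: a model can rank phrases as "related to" a word purely by vector geometry, with nothing resembling an understanding of the word. A minimal sketch with made-up embedding vectors (real models learn such vectors from co-occurrence statistics in text):

    import math

    # Made-up 3-dimensional embeddings, purely for illustration.
    vectors = {
        "mission": [0.9, 0.1, 0.3],
        "mission statement": [0.8, 0.2, 0.4],
        "charitable mission": [0.7, 0.3, 0.5],
        "space mission": [0.6, 0.0, 0.9],
    }

    def cosine(a, b):
        # "Relatedness" here is pure geometry: the cosine of the angle
        # between two vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm = lambda v: math.sqrt(sum(x * x for x in v))
        return dot / (norm(a) * norm(b))

    # Rank phrases by similarity to "mission" -- no understanding required.
    query = vectors["mission"]
    for phrase in sorted(vectors, key=lambda p: -cosine(query, vectors[p])):
        print(f"{phrase}: {cosine(query, vectors[phrase]):.2f}")

Knowing which phrases sit near "mission" in that space is exactly the kind of knowledge the comment describes: millions of associations, no concept behind them.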
> We need a new formulation of Hanlon's Razor: "Never attribute to malice that which can adequately be explained by AI."

Datalore's Razor?
> Do LLMs have the concept of a mission already?

They do not. These models don't know or understand anything in any meaningful way.
> At least it didn't recommend the sperm bank.

Consider going into it on an empty stomach.
> This feels more like a mistake in reinterpreting their own words rather than a callous misunderstanding of their mission.

Which goes to show that AIs aren't exactly a fount of insight or knowledge. The reliability issue isn't going to be an easy fix, either, since exactly what an AI's algorithms will produce in response can't be predicted.
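That unpredictability is structural: deployed models usually sample from a probability distribution rather than always taking the top-scoring choice, so the same prompt can produce a different continuation on every run. A minimal sketch, again with made-up scores:

    import math
    import random

    # Made-up scores for continuations of one fixed prompt.
    scores = {"restaurant": 2.0, "museum": 1.8, "food bank": 1.6}

    def sample(scores, temperature=1.0):
        # Temperature sampling: convert scores to weights and draw at
        # random; higher temperature flattens the distribution.
        weights = [math.exp(s / temperature) for s in scores.values()]
        return random.choices(list(scores), weights=weights, k=1)[0]

    # The same prompt, five runs, potentially five different answers.
    for _ in range(5):
        print(sample(scores))

Even with the model and prompt held fixed, the output is a draw from a distribution, which is why nobody can promise what an article generator will say about a food bank.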