What’s bigger than a Giant. Ahh, it’s a South American.
- 0 Posts
- 4 Comments
Joined 7 months ago
Cake day: June 30th, 2025
The bottom half doesn’t really reflect the reality for most, either. Maybe it’s accurate for a majority of Londoners or residents of other big industrial cities, but in the 1800s only about half the population lived in cities or towns (though it urbanized rapidly over that century), and of those, even fewer were in big cities working in factories. Many more were farmers, or smiths, or bakers, or cobblers. And of course recognizing the unpaid labor of women at home is important too.
“I’ll decorate tomorrow,” except the tree is vibrating on the couch as its needles fall off.
Humans will anthropomorphize damn near anything. We’ll say shit like “hydrogen atoms want to be with oxygen so bad they get super excited and move around a lot when they bond.” I don’t think characterizing the language output of an LLM in terms that describe how people speak is a bad thing.
“Hallucination,” on the other hand, doesn’t come close to distinguishing the “incorrect” bullshit that comes out of LLMs from the “correct” bullshit. Using “hallucination” to describe the output of deep neural networks more or less started with the early image generators. Everything they output was a hallucination, but eventually these networks got so believable that they could sometimes produce realistic, and even occasionally factually accurate, content. So the people who wanted these neural nets to be AI began calling only the bad, unbelievable, and false outputs hallucinations. That’s not just anthropomorphizing the model; it implies it actually does something like thinking and has a state of mind.