MarkusQ 2 hours ago [-]
It's nice to see someone else that recognizes that the Emperor is naked. I find it somewhat disturbing how many people seem to fall for the Eliza-effect illusion that LLMs are thinking.
redlewel 5 hours ago [-]
Horrible design on the website. Please just give me a block of text to read.
> One computer scientist speculated that his LLM had attained sentience.
> How did he reach that conclusion? Basically, he asked “Are you conscious?”, the machine responded “Yes”, and that was that.
Oh, come on now. This is referring to Blake Lemoine, and while I doubt his conclusions, he wasn't being as simplistic as all that. He's not completely stupid.
simianwords 48 minutes ago [-]
This website is not worth your time.
> LLMs operate in the plane of words, not in the world of physical phenomena that science investigates. They don’t reason, synthesize evidence, or draw upon the previous literature. They can generate text that looks like a paper but mistaking this for science is a cargo-cult fallacy.
This is clearly wrong
damnitbuilds 10 hours ago [-]
A 10,000 word website about bullshit machines.
See what they did there?
wewewedxfgdf 8 hours ago [-]
Clearly just an anti-AI rant packaged up in a fancy suit.
Feb 2025 https://news.ycombinator.com/item?id=42989320
[0]https://callingbullshit.org/