This article is brutally disingenuous, but this line took the cake:
I’m not using an LLM to work out 2+2 and find that sometimes it tells me it’s 4 … I’m using an LLM to write some code (SQL) that says something like:

SELECT col_1 + col_2 FROM src_table;

and then the RDBMS does the calculation. No hallucinations. Either the code is right, or it’s wrong. And that’s concretely testable and verifiable.
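To be fair to the quoted claim, that kind of query is trivially checkable once it exists. A minimal sketch with Python's built-in sqlite3 (the table name and values here are hypothetical, just to show the arithmetic is done by the database engine, not the model):

```python
import sqlite3

# Hypothetical table and rows, only to demonstrate verifiability.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_table (col_1 INTEGER, col_2 INTEGER)")
conn.executemany("INSERT INTO src_table VALUES (?, ?)", [(2, 2), (10, 5)])

# The generated query: SQLite performs the addition deterministically.
rows = conn.execute("SELECT col_1 + col_2 FROM src_table").fetchall()
print(rows)  # one tuple per row, each holding the computed sum
```

Whether a model can be trusted to *produce* that query is a separate question from whether its output can be tested, and the comment below is arguing the former.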
Motherfucker, that is exactly 2+2=4, and yes, the LLMs will fuck that up, too. Each time you use this technology it hurts every one of us, some more than others. To use it for something so basic is callous, wasteful, and embarrassing.

Fuck this guy.


