Posted by jsheard 9/1/2025
We do not blame computer programs when they have bugs or make mistakes - we blame the human being who made them.
This has been the case for as long as we have created anything, going back tens of thousands of years. You can't just unilaterally decide to change that now on a whim.
Companies are responsible for the bad things they make; the things themselves are, by definition, blameless.
I mean, no, I don’t think some Google employee tuned the LLM to produce output like this, but it doesn’t matter. They are still responsible.
There literally isn't room for them to know everything about everyone when they're asked about random people without consulting sources, and even with sources it's still pretty easy for them to come in with extremely wrong priors. The world is very large.
You have to be very careful about these "on the edge" sorts of queries, it's where the hallucination will be maximized.
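If you want to put rough numbers on the "no room" point, here's a back-of-the-envelope sketch. Every figure in it is an assumption for illustration (100 bytes of unique facts per person, a hypothetical 70B-parameter model in bf16), not a measurement of any real system:

    # Back-of-the-envelope capacity estimate; all numbers are assumptions.
    world_population = 8_000_000_000   # ~8 billion people
    bytes_per_person = 100             # pretend just 100 bytes of unique facts each
    facts_gb = world_population * bytes_per_person / 1e9

    model_params = 70e9                # hypothetical 70B-parameter model
    bytes_per_param = 2                # bf16 weights
    weights_gb = model_params * bytes_per_param / 1e9

    print(f"facts about everyone: ~{facts_gb:,.0f} GB")    # ~800 GB
    print(f"model weights:        ~{weights_gb:,.0f} GB")   # ~140 GB

Even with absurdly generous assumptions the per-person facts alone outweigh the model's entire weights, and those weights also have to encode grammar, reasoning, code, and everything else.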
Not sure where you’re getting the 45GB number.
Also, Google doesn’t use GPT-4 for summaries. They use a custom version of their Gemini model family.
Gemma 3 270M was trained on 6 trillion tokens but can be loaded into a few hundred million bytes of memory.
But yeah GPT-4 is certainly way bigger than 45GB.
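The size arithmetic is just parameter count times bytes per parameter; training-token count doesn't enter into it. A quick sketch, with the caveat that GPT-4's parameter count has never been published and the 1.8 trillion figure below is only the widely repeated, unconfirmed rumor:

    def weights_size_gb(params: float, bytes_per_param: float) -> float:
        """Rough in-memory footprint of the weights: params * bytes per param."""
        return params * bytes_per_param / 1e9

    # Gemma 3 270M: ~270 million parameters
    print(weights_size_gb(270e6, 2.0))   # bf16  -> ~0.54 GB
    print(weights_size_gb(270e6, 0.5))   # 4-bit -> ~0.14 GB

    # GPT-4: parameter count not public; 1.8T is only the rumored figure
    print(weights_size_gb(1.8e12, 2.0))  # bf16  -> ~3600 GB, nowhere near 45 GB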
But once again I am reminded, never make arguments based on information theory. Nobody understands it.