Unlike large models such as Gemini or ChatGPT, which pull information from countless web sources that may contain "hallucinations," NotebookLM relies 100% on the sources you provide: PDFs, audio files, YouTube videos, Google Docs, even web articles. Because it works exclusively with your sources, there is very little room for hallucination.
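To be concrete about what "relies 100% on your sources" means mechanically, the usual pattern is to put only your documents into the model's context and instruct it to answer from them alone. Here's a rough sketch of that pattern (not NotebookLM's actual implementation; the function name and prompt wording are just illustrative):

```python
# Minimal sketch of the source-grounded ("answer only from provided context") pattern
# that tools like NotebookLM are built around. This is NOT NotebookLM's real code;
# the prompt wording and function name are assumptions for illustration.

def build_grounded_prompt(question: str, sources: list[str]) -> str:
    """Assemble a prompt that restricts the model to the supplied sources."""
    context = "\n\n".join(
        f"[Source {i + 1}]\n{text}" for i, text in enumerate(sources)
    )
    return (
        "Answer the question using ONLY the sources below. "
        "If the sources do not contain the answer, say you don't know.\n\n"
        f"{context}\n\nQuestion: {question}\nAnswer:"
    )

if __name__ == "__main__":
    sources = [
        "NotebookLM answers questions using documents the user uploads.",
        "Uploaded sources can include PDFs, Google Docs, and YouTube transcripts.",
    ]
    print(build_grounded_prompt("What inputs does NotebookLM accept?", sources))
```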
knollimar 4 hours ago
Huh, don't most hallucinations come from the model's internal knowledge and not the RAG?
burnerToBetOut 3 hours ago
Please clarify the Google connection.
I'm guessing that it's an official Google-built product. [1]