Users have recently reported severe hallucination issues when using Gemini 3 Pro for document processing. In one case, a user generated data-mining review questions with NotebookLM, converted them to a PDF in Typora, and then asked Gemini 3 Pro to check the document for errors. The model reported problems that did not exist, including formatting errors, missing multiple-choice answers, and LaTeX syntax errors. Even when the original Markdown was supplied directly, Gemini 3 Pro still misread the formulas. The episode highlights the limitations of current AI models in professional document processing, offers a cautionary data point for AI developers, researchers, and technology enthusiasts, and has sparked in-depth discussion about AI reliability.
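One practical takeaway is that claims like "LaTeX syntax error" are cheap to verify deterministically rather than taking a model's word for it. The sketch below is a minimal, hypothetical cross-check (not part of the user's workflow described above): it scans Markdown text for lines with unbalanced inline `$` math delimiters, which catches the most common class of real delimiter errors a reviewer would want to confirm before trusting an AI's report.

```python
def check_math_delimiters(markdown: str) -> list[str]:
    """Report lines containing an odd number of inline $ math delimiters.

    A balanced line has an even count of unescaped $ signs; an odd count
    usually means an unclosed inline formula. This is a heuristic sketch,
    not a full Markdown/LaTeX parser (it ignores $$ display blocks and
    code spans).
    """
    issues = []
    for lineno, line in enumerate(markdown.splitlines(), start=1):
        # Drop escaped \$ so literal dollar signs are not counted.
        stripped = line.replace(r"\$", "")
        if stripped.count("$") % 2 != 0:
            issues.append(f"line {lineno}: unbalanced $ delimiter")
    return issues

sample = "Entropy: $H(X) = -\\sum_i p_i \\log p_i$\nBroken: $x^2 + y^2"
print(check_math_delimiters(sample))
```

If a checker like this finds nothing while the model insists there are syntax errors, that is strong evidence the model is hallucinating rather than the document being broken.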
Original Link: Linux.do