I encountered a hallucination with Deep Research yesterday. I was using it for TAM research, and it hallucinated that a particular startup had been acquired by another, non-existent company. It was a throwaway line in a larger table of data, but you can see how a verifiably wrong fact like this casts doubt on the reliability of the entire report. I thought a huge step up with o3 and Deep Research was that you don't get hallucinations anymore. Clearly not.
I think LLMs will still be useful for getting a high-level view of a topic, but there is still no substitute for actual human intelligence when it comes to deep dives, scrolling through pages, and thinking with your own head.