It doesn't need to be reliable here to have the described effect. Instead, all it needs to do is point users in the right direction, which LLMs are usually quite good at. I'll often describe something that feels like it should exist, and the model comes back with the specific, obscure name for exactly that thing. When it doesn't have an accurate answer, all I've lost is a few minutes. It just needs to give a vaguely useful direction most of the time.