We have all seen AI-based search tools on the web, like Copilot, Perplexity, and DuckAssist, which scour the web for information, present it in summarized form, and cite sources in support of the summary.
But how do they know which sources are legitimate and which are plain BS? Do they exercise judgement while crawling, or do they have some kind of filter list around the “trustworthiness” of various web sources?
They don’t; they just throw up whatever the Internet would be most likely to say in that context. That’s why they are full of shit.
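
To make that concrete, here is a minimal sketch of what such a retrieval-and-summarize loop roughly looks like. The helper names (`web_search`, `llm_complete`) are hypothetical stand-ins for whatever search API and language model a real product actually calls; the point is that nothing in this loop scores sources for trustworthiness. Ranking comes straight from the search engine, and the model summarizes whatever it is handed.

```python
from dataclasses import dataclass


@dataclass
class Result:
    url: str
    snippet: str


def web_search(query: str, n: int = 5) -> list[Result]:
    """Hypothetical search call: returns the top-n results as ranked by the
    search engine, with no notion of source reliability."""
    raise NotImplementedError("stand-in for a real search API")


def llm_complete(prompt: str) -> str:
    """Hypothetical LLM call: returns a completion for the given prompt."""
    raise NotImplementedError("stand-in for a real model API")


def ai_search(query: str) -> str:
    # Take whatever ranks highest -- no legitimacy check anywhere.
    results = web_search(query)
    context = "\n\n".join(
        f"[{i + 1}] {r.url}\n{r.snippet}" for i, r in enumerate(results)
    )
    prompt = (
        "Answer the question using only the sources below, citing them by number.\n\n"
        f"Question: {query}\n\nSources:\n{context}"
    )
    # Summarize, attach citations, done.
    return llm_complete(prompt)
```

The summary inherits whatever the top results say, and the citations make it look authoritative either way.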