It's not hard to understand. People already trust the output of LLMs far too much because it sounds reasonable; on closer inspection it often turns out to be bullshit. So LLMs raise the level of bullshit relative to their input data. Repeat that a few times and the problem becomes more and more obvious.
To be fair, there are countless fads that went away again. Spicy autocomplete might very well be one of them. Not every invention has had the success of the automobile or the computer.
Installing Arch for the first time taught me a lot about how my system works, since you have to pick every part that makes up the system yourself.