What LLMs lack is the unspoken, unwritten knowledge of experience. They're getting better at emulating reasoning ("I see you mentioned X and Y; therefore you're likely Z") and at inferring things from Reddit threads ("Others who did X and Y also did Z, would you like me to do that too?"), but the semblance of comprehension is still just out of reach for anything unlikely to have been scraped from the web or from written literature. An LLM is never going to give you the knowing look of someone who's been where you are before. It's never going to think to itself: sure, I could give this human the answer, but I know some lessons are only learned by experience. It can tell where you're going only if someone else has written about going there; it can never tell where you're going because it has been there itself.
It's like thinking you know how to drive because you've read every driving course, every law around driving, and every car manual ever published, despite never having been behind a wheel.
Artificial knowledge, I suppose, and maybe artificial intelligence, whatever that might mean, but not so much artificial wisdom.