• Ceedoestrees@lemmy.world · 2 months ago

    I think they’re saying that the kind of people who take LLM-generated content as fact are the kind of people who don’t know how to look up information in the first place. Blaming the LLM for it is like blaming a search engine for showing bad results.

    Of course LLMs make stuff up; they are machines built to make stuff up.

    Sort of an aside, but doctors, lawyers, judges and researchers make shit up all the time too. A professional designation doesn’t make someone infallible, or even smart. People should question everything they read, regardless of the source.