wuphysics87 to Privacy@lemmy.ml • 6 months ago — Can you trust locally run LLMs?
I’ve been playing around with Ollama. Given that you download the model, can you trust that it isn’t sending telemetry?
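One half of the trust question is the model file itself: GGUF weights are inert data, but you can at least verify that what you downloaded matches the checksum the distributor publishes (e.g. on a model's Hugging Face "Files" page). A minimal sketch; the file path and expected digest below are placeholders, not real values:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published alongside the model download.
# Both values here are placeholders:
# expected = "e3b0c442..."
# assert sha256_of("model.gguf") == expected
```

This only tells you the file is the one that was published, not that the runtime serving it makes no network calls; for that you'd still want to watch the process's traffic or run it with networking disabled.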
@stink@lemmygrad.ml • 6 months ago: It’s nice, but sadly it’s hard to load unsupported models. I really wish you could sideload easily; still, it’s nice unless you have a niche use case.
@stink@lemmygrad.ml • 6 months ago: I’m at my parents’ for the week, but IIRC I had some sort of incompatible file extension with a model I downloaded off Hugging Face.
deleted by creator