@joojmachine@lemmy.ml to Linux@lemmy.ml · English · 1 year ago
Alpaca: an ollama client to easily interact with an LLM locally or remotely (flathub.org)
Cross-posted to: linux@lemmy.ml
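For context, a client like Alpaca ultimately talks to a running ollama server over its HTTP API, whether that server is on the same machine or a remote host. The sketch below is not Alpaca's code, just a minimal illustration of that kind of call against ollama's default endpoint and port; the model name "llama3" is an assumption and must already be pulled on the server.

```python
# Minimal sketch (not Alpaca's implementation) of calling an ollama server's
# HTTP API. Assumes ollama is listening on its default port 11434 and that
# the "llama3" model has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # point at a remote host to use it remotely


def generate(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to ollama's /api/generate endpoint and return the reply."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


print(generate("Why is the sky blue?"))
```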
Is it better than Ollama, and if so, how?