@petsoi@discuss.tchncs.de to Linux@lemmy.ml • 4 months ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
@MIXEDUNIVERS@discuss.tchncs.de • 4 months ago
I did try it on Fedora, but I have a Radeon 6700 XT and it only ran on the CPU. I'll wait until official ROCm support reaches my older model.
@lelgenio@lemmy.ml • 4 months ago
Ollama runs on the 6700 XT, but you need to add an environment variable for it to work… I just don't remember what it was and I'm away from my computer right now.
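For anyone with the same card: the variable in question is likely ROCm's HSA_OVERRIDE_GFX_VERSION (an assumption — the comment above doesn't name it). The RX 6700 XT identifies as gfx1031, which ROCm doesn't officially support, but overriding the reported version to 10.3.0 (the supported gfx1030) commonly gets Ollama running on the GPU. A minimal sketch:

```sh
# Assumption: the unnamed variable is HSA_OVERRIDE_GFX_VERSION.
# The RX 6700 XT is gfx1031; ROCm ships kernels for gfx1030,
# so override the reported version before starting Ollama:
HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve

# If Ollama runs as a systemd service, set it persistently instead:
sudo systemctl edit ollama.service
# then add under [Service]:
#   Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
```

If the override works, `ollama ps` should show the model loaded on the GPU rather than the CPU.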