@SurpriZe@lemm.ee to Asklemmy@lemmy.ml • 7 months ago — What model do you use in your GPT4all?
Curious about what model is best to use on my RTX 3080 + Ryzen 5 3600 since I’ve just found out about this.
@geneva_convenience@lemmy.ml • 7 months ago — Llama 3.1 8B; the other versions are too big to run on GPU.
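A rough back-of-the-envelope check supports the reply: a 4-bit quantized 8B model (the typical GGUF quantization GPT4All ships) fits comfortably in the RTX 3080's 10 GB of VRAM, while the 70B variant does not. The function below is a hedged sketch of that arithmetic, not a GPT4All API; the 20% overhead factor for KV cache and activations is an assumption.

```python
def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Rough VRAM (GB) needed to hold model weights.

    overhead is an assumed ~20% extra for KV cache and activations.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

RTX_3080_VRAM_GB = 10  # the common 10 GB variant

print(model_vram_gb(8, 4))   # ~4.8 GB -> fits on the 3080
print(model_vram_gb(70, 4))  # ~42 GB  -> far too big for 10 GB
```

By this estimate, even a higher-precision 8-bit 8B quant (~9.6 GB) would be a tight squeeze, which is why the 4-bit 8B build is the practical choice on this card.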