SurpriZe@lemm.ee to Asklemmy@lemmy.ml · 16 days ago

What model do you use in your GPT4all?
Curious about what model is best to use on my RTX 3080 + Ryzen 5 3600 since I’ve just found out about this.
geneva_convenience@lemmy.ml · 15 days ago

Llama 3.1 8B, the other versions are too big to run on a GPU like that.
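For anyone who wants to drive GPT4All from a script rather than the desktop app, here is a minimal sketch using the `gpt4all` Python bindings. The model filename below is an assumption (check the filename GPT4All shows in its model list for Llama 3.1 8B); `device="gpu"` asks the library to offload layers to the graphics card, which an RTX 3080's 10 GB of VRAM should handle for a 4-bit-quantized 8B model.

```python
# Hedged sketch: chatting with a Llama 3.1 8B quant through the GPT4All
# Python bindings (pip install gpt4all). The exact .gguf filename is an
# assumption -- copy the real one from GPT4All's model list.
MODEL_NAME = "Meta-Llama-3.1-8B-Instruct.Q4_0.gguf"  # hypothetical filename


def ask(prompt: str, max_tokens: int = 200) -> str:
    """Load the model on the GPU and return one reply to `prompt`."""
    from gpt4all import GPT4All  # imported here so the sketch reads standalone

    model = GPT4All(MODEL_NAME, device="gpu")  # "cpu" also works, just slower
    with model.chat_session():  # applies the model's chat prompt template
        return model.generate(prompt, max_tokens=max_tokens)


if __name__ == "__main__":
    print(ask("Summarize what GPT4All is in one sentence."))
```

Larger variants (70B, 405B) will not fit in 10 GB of VRAM, which matches the advice above to stick with the 8B model on this hardware.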