Welp, I have 64GB of RAM and the only time I ever use that much is running LLMs.
I’ve had 64GB for only a month or so and hadn’t even broken 32GB before today. And now I find out that number isn’t even real, it’s counting cached RAM, so it sounds like I’m gonna need to get into LLMs! Shame it’s on a laptop with integrated graphics.
U might be surprised by the performance u can get. Mind u, a GPU would definitely help.
What would you suggest as a starting point for someone who has never run an LLM before?
Ollama
I’ll have a look, thanks!
Also u might wanna check out Open WebUI, it’s a great frontend for LLMs. U can plug it into a bunch of services including remote API endpoints (I use OpenRouter for larger models), image gen, tools, SearXNG, etc. It also lets u generate ur own API keys, so it’s real easy to integrate any agents u create in it with other client services.
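For anyone wanting to wire this up themselves, here’s a minimal sketch of calling the OpenAI-compatible chat endpoint that Ollama exposes locally (the same request shape Open WebUI and OpenRouter speak). The model name, host, and port are assumptions for a default Ollama install, so adjust them for your setup:

```python
import json
import urllib.request

def build_chat_payload(prompt, model="llama3"):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,  # assumed model name; use whichever model you've pulled
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete reply rather than a token stream
    }

def chat(prompt, model="llama3", base_url="http://localhost:11434"):
    """Send a chat request to a local Ollama server (default port assumed)."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_chat_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

Because the payload follows the OpenAI chat-completions shape, pointing `base_url` at a remote service like OpenRouter (and adding an `Authorization: Bearer <key>` header) should work with the same request body.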