• Lucy :3@feddit.org
    10 days ago

    If you have a decent GPU (or even just a capable CPU), you can set up ollama, using the ollama-cuda or ollama-rocm package for GPU acceleration, and run llama3.1 or an uncensored llama3.1 variant.
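
    A minimal sketch of that setup, assuming Arch Linux (where `ollama-cuda` and `ollama-rocm` are the packaged GPU builds; plain `ollama` is the CPU build):

    ```shell
    # Pick the package for your hardware:
    #   ollama-cuda  -> NVIDIA GPUs
    #   ollama-rocm  -> AMD GPUs
    #   ollama       -> CPU only
    sudo pacman -S ollama-cuda

    # Start the ollama server (listens on localhost:11434 by default)
    sudo systemctl enable --now ollama

    # Pull the model and open an interactive chat session
    ollama run llama3.1
    ```

    The first `ollama run` downloads the model weights, so expect a few GB of traffic before the prompt appears.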