LocalLLaMA · posted by planish · 1y ago

How do I get LLaMA going on a GPU?

Everyone is so thrilled with llama.cpp, but I want to do GPU-accelerated text generation and interactive writing. What's the state of the art here? Will KoboldAI now download LLaMA for me?
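For reference, here's roughly the kind of GPU-accelerated generation I have in mind, sketched with the Hugging Face transformers API. This is just a rough sketch: the model path is a placeholder for wherever the weights live locally, and it assumes torch and accelerate are installed alongside transformers.

```python
# Minimal sketch of GPU text generation with transformers.
# Assumes a LLaMA checkpoint already sits on disk; the path below is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/llama-7b"  # placeholder: local checkpoint directory

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # half precision so the model fits in GPU memory
    device_map="auto",          # let accelerate place the layers on the available GPU(s)
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

That's the sort of workflow I'm after, just wrapped in something interactive like KoboldAI rather than a bare script.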
