jecabeda (@jecabeda)

TIL: You can very easily run generative AI models locally using ollama.ai

With it, you can just run `ollama run codellama:13b` and you're good to go (the first run downloads the model automatically)!
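If you want to call the model from code instead of the terminal, Ollama also serves a local HTTP API (on port 11434 by default). Here is a minimal Python sketch, assuming the server is running and `codellama:13b` has already been pulled; the prompt text is just an example:

```python
# Minimal sketch: query a local Ollama server via its HTTP API.
# Assumes `ollama run codellama:13b` (or `ollama pull codellama:13b`)
# has been run and the server is listening on the default port 11434.
import json
import urllib.request

payload = json.dumps({
    "model": "codellama:13b",
    "prompt": "Write a Python function that reverses a string.",
    "stream": False,  # return the whole response as one JSON object
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])  # the model's generated text
```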
