c10l

On macOS I’ve been using Ollama. It’s very easy to set up, can run as a service, and exposes an API.
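For anyone curious, here’s a minimal sketch of hitting that API from Python once the service is running, assuming the default port (11434) and a model you’ve already pulled (`llama2` is just a stand-in):

```python
import json
import urllib.request

# Ollama's HTTP API listens on localhost:11434 by default.
# Assumes the model has already been pulled, e.g. `ollama pull llama2`.
payload = json.dumps({
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return a single JSON object instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```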

You can talk to it directly from the CLI (`ollama run <model>`) or via applications and plugins (like continue.dev) that consume the API.
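Editor plugins generally talk to the same local API but in a conversational style; a similar sketch against the chat endpoint looks like this (again, model name and prompt are placeholders):

```python
import json
import urllib.request

# Chat-style request, the shape of traffic an editor plugin typically sends.
# Assumes Ollama is running locally on its default port with "llama2" pulled.
body = json.dumps({
    "model": "llama2",
    "messages": [{"role": "user", "content": "Explain list comprehensions briefly."}],
    "stream": False,
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=body,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["message"]["content"])
```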

It can run on Linux too, but I haven’t personally tried it.

ollama.ai
