SuperiorOne

@SuperiorOne@lemmy.ml


SuperiorOne,

I’m actively using ollama with Docker to run the llama2:13b model. It generally works fine, but it is heavy on resources, as expected.
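
For reference, a minimal sketch of talking to that kind of setup from Python, assuming the container exposes ollama’s default REST API on port 11434 (the usual `-p 11434:11434` mapping) and that the llama2:13b model has already been pulled:

```python
import json
import requests

# Default ollama endpoint when the container publishes port 11434.
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(prompt: str, model: str = "llama2:13b") -> str:
    """Send a prompt to the local ollama API and collect the streamed reply."""
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt},
        stream=True,
        timeout=300,
    )
    response.raise_for_status()
    parts = []
    # ollama streams newline-delimited JSON chunks until "done" is true.
    for line in response.iter_lines():
        if line:
            chunk = json.loads(line)
            parts.append(chunk.get("response", ""))
            if chunk.get("done"):
                break
    return "".join(parts)

if __name__ == "__main__":
    print(generate("Explain what a 13B parameter model is in one sentence."))
```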

SuperiorOne,

I recommend Obsidian with community plugins. The application itself isn’t open source, but your content is stored as plain markdown files.
