DontNoodles,

I’ve heard good things about H2O AI if you want to self-host and tweak the model by uploading documents of your own (so that you get answers based on your dataset). I’m not sure how difficult it is. Maybe someone more knowledgeable will chime in.

Sims,

If you’re low on hardware, look into Petals or the Kobold Horde frameworks. Both share models in a p2p fashion afaik.

Petals, at least, lets you create private networks, so you could host part of a model on your 24/7 server, part on your laptop CPU, and the rest on your laptop GPU, as an example.
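
For reference, the client side of a private swarm would look roughly like this (untested sketch; the model name and peer multiaddr are placeholders, and the 24/7 box would seed the swarm with petals.cli.run_server):

```python
# Rough sketch of a Petals client joining a private swarm (placeholder values).
# The always-on server would seed the swarm with something like:
#   python -m petals.cli.run_server <model> --new_swarm
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

MODEL = "petals-team/StableBeluga2"                    # placeholder model
PEERS = ["/ip4/192.0.2.10/tcp/31337/p2p/QmExamplePeerID"]  # your server's multiaddr (placeholder)

tokenizer = AutoTokenizer.from_pretrained(MODEL)
# initial_peers points the client at your own swarm instead of the public one
model = AutoDistributedModelForCausalLM.from_pretrained(MODEL, initial_peers=PEERS)

inputs = tokenizer("A self-hosted LLM is", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```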

Haven’t tried tho, so good luck ;)

Aties,

I haven’t looked into specific apps, but I’ve been wanting to try various pretrained models, and figured that just self-hosting JupyterHub and pulling models from Hugging Face would be a quick and flexible way to do it.
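
Haven’t actually set it up yet, but in a notebook cell it would be something like this with the transformers library (the model name is just an example, swap in whatever you want to try):

```python
# Minimal sketch of pulling a model from Hugging Face inside a notebook cell.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example model; a 7B needs a decent GPU or patience
    device_map="auto",                           # needs `accelerate`; maps to GPU if present, else CPU
)

print(generate("Self-hosting an LLM means", max_new_tokens=40)[0]["generated_text"])
```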

c10l,

It’s pretty easy with Ollama. Install it, then ollama run mistral (or another model; there are a few available out of the box). ollama.ai
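
Once a model is pulled, Ollama also exposes a local HTTP API on port 11434, so a rough sketch like this should work from a script (model name and prompt are just examples):

```python
# Query the local Ollama server (default port 11434) for a single, non-streamed response.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "mistral",
        "prompt": "Why self-host an LLM?",
        "stream": False,  # one JSON object back instead of a token stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```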

Another option is Llamafile. github.com/Mozilla-Ocho/llamafile
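
A running llamafile (download it, chmod +x, execute) serves a local llama.cpp web server on port 8080, which afaik is OpenAI-compatible, so a sketch like this should talk to it (model name and prompt are placeholders):

```python
# Talk to a locally running llamafile via its OpenAI-compatible endpoint.
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="http://localhost:8080/v1",  # llamafile's local server
    api_key="sk-no-key-required",         # any non-empty string works locally
)

reply = client.chat.completions.create(
    model="local-model",  # placeholder; the local server doesn't route by this name
    messages=[{"role": "user", "content": "Give me one reason to self-host an LLM."}],
)
print(reply.choices[0].message.content)
```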

jvrava9,

Maybe Serge would fit your use case.

hazeebabee,

Sounds like a really cool project; sadly I don’t have much knowledge to contribute. Still, what kind of issues have you run into? Any specific errors or problems?
