I’ve heard good things about H2O AI if you want to self-host and tweak a model by uploading your own documents (so you get answers based on your own dataset). I’m not sure how difficult it is; maybe someone more knowledgeable will chime in.
I’m using Nebula to remotely access the Raspberry Pi in my home network, and it mostly just works. The dual setup for Nextcloud might be a bit trickier, at least if you want HTTPS: you’ll probably have to set up a reverse proxy in Nginx for at least one of the routes, since they need different certificates. (That said, since Nebula already authenticates and encrypts your traffic, HTTPS is probably unnecessary on that route.)
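For the dual-route setup, one way to do it is two Nginx server blocks, one per hostname, each proxying to the same Nextcloud backend. Everything here is a hypothetical sketch — the hostnames, certificate paths, and backend port are made up, so substitute your own:

```nginx
# Public route, with its own certificate (paths are placeholders):
server {
    listen 443 ssl;
    server_name cloud.example.com;
    ssl_certificate     /etc/letsencrypt/live/cloud.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/cloud.example.com/privkey.pem;
    location / {
        proxy_pass http://127.0.0.1:8080;   # Nextcloud backend
        proxy_set_header Host $host;
    }
}

# Nebula route -- plain HTTP is arguably fine here,
# since Nebula already authenticates and encrypts the tunnel:
server {
    listen 80;
    server_name nextcloud.nebula.internal;  # hypothetical overlay hostname
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
    }
}
```

Nginx picks the right block (and thus the right certificate) based on the hostname the client asks for, which is what lets the two routes coexist on one box.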
If you’re low on hardware, look into the Petals or Kobold Horde frameworks. Both share models in a peer-to-peer fashion, as far as I know.
Petals, at least, lets you create private networks, so you could host part of a model on your 24/7 server, part on your laptop’s CPU, and the rest on your laptop’s GPU, for example.
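Roughly how a private Petals swarm gets started, as a sketch from memory — the model name is just an example and the flags should be checked against the Petals docs before relying on them:

```shell
# Sketch only -- verify the exact flags in the Petals documentation.
# On the 24/7 server: host a slice of the model and start a new private swarm
python -m petals.cli.run_server bigscience/bloom-7b1 --new_swarm

# On the laptop: join that swarm via the peer address the server prints
# on startup (the multiaddr below is a placeholder, not a real address)
python -m petals.cli.run_server bigscience/bloom-7b1 \
    --initial_peers "/ip4/<server-ip>/tcp/<port>/p2p/<peer-id>"
```

Each machine then serves whichever transformer blocks fit in its memory, and clients route requests across the swarm.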
I haven’t looked into specific apps, but I’ve been wanting to try various trained models, and figured that self-hosting JupyterHub and pulling models from Hugging Face would be a quick and flexible way to do it.
I am using a normal desktop case with an external USB-C 8-bay JBOD drive enclosure from Mediasonic, and mdadm to combine the drives in RAID-6. I know I’m not getting the performance I could with native SATA, but it can still saturate my 1 Gbps network, so it’s good enough for serving video, audio, and some other web-based apps.
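For anyone sizing a similar array: RAID-6 spends two drives’ worth of space on parity, so usable capacity is (n − 2) drives, and it survives any two simultaneous drive failures. And 1 Gbps works out to only 125 MB/s, which is why the USB enclosure isn’t the bottleneck. A quick back-of-envelope (the 4 TB drive size is made up for illustration):

```shell
# Usable capacity of an 8-bay RAID-6 array (drive size is an assumption)
drives=8
size_tb=4                              # hypothetical 4 TB drives
usable=$(( (drives - 2) * size_tb ))   # RAID-6 reserves two drives for parity
echo "${usable} TB usable"

# A 1 Gbps link tops out at roughly 1000/8 megabytes per second
echo "$(( 1000 / 8 )) MB/s"
```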