I had the same problem as OP. My solution was to port forward to my server but then block connections from all IP addresses except my work’s, which I added to an allowlist (rough sketch below).
It’s working well so far, but I think the Cloudflare tunnel is the better option.
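If anyone wants to copy the idea, this is roughly what the allowlist looks like with ufw (just a sketch, assuming the forwarded port is 443 and using 203.0.113.10 as a stand-in for the work IP; adjust both to your setup):

```
# deny all inbound by default, then allow only the work IP on the forwarded port
sudo ufw default deny incoming
sudo ufw allow from 203.0.113.10 to any port 443 proto tcp
sudo ufw enable
```

If you also manage the box over SSH, add a similar allow rule for port 22 before enabling, or you’ll lock yourself out.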
I know it’s not technically “self” hosted, but I’d get a cheap yearly VPS somewhere and run a webserver off of that. For me it’s worth the peace of mind to keep my network a temple instead of a bus terminal. I paid $13 USD for the year for mine.
I believe Oracle is still offering to slice off a bit of compute for free that should accomplish OP’s goal. I’ve used it to test a Jellyfin host among other things and for the price it can’t be beat!
I’ve been running a script every 60 seconds for 2 months now as a cron job and it still hasn’t been able to create a VM in their US datacenter. I just have a log full of “insufficient host capacity” errors.
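In case anyone wants to replicate the retry loop, mine is nothing fancy; a rough sketch (the script name, paths, and instance.json are placeholders, and it assumes the OCI CLI is already configured; if I remember the flag right, it accepts a saved launch request via --from-json):

```
#!/bin/sh
# try-launch-arm.sh: attempt to launch the free-tier ARM instance from a saved request
# instance.json is a placeholder for the launch request JSON exported from the console/CLI
if oci compute instance launch --from-json file://instance.json; then
  echo "$(date): launched!"
else
  echo "$(date): still no capacity"
fi
```

plus a crontab entry like `* * * * * /home/me/try-launch-arm.sh >> /home/me/oci-launch.log 2>&1` to fire it every minute.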
A VPS makes sense insofar as keeping things thoroughly isolated from my own systems, but the overhead of maintaining a box that’s directly connected to the Internet like that isn’t something I’m keen on and I’m not convinced I’d have the expertise to do it right from the outset.
The Oracle Cloud VPS only has SSH key authentication enabled by default. You can also set it to only allow SSH from your home IP in the virtual firewall before the machine is ever spun up.
Their current free ARM offering is 1 machine with 4 cores and 24 GB RAM for life. You can also add another 2 AMD machines with 1 core and 1 GB RAM each and still be in their free tier.
If you’re going to set it up and take advantage of the ARM machine, make sure you pick a home location for your account that has multiple availability zones. San Fran right now only has 1 zone, so if the shared ARM instances are all used up, you’ll have to wait a few days and try again. Phoenix I think has 3, so you can try with another zone right away.
I guess I’m extremely paranoid then, my home IP doesn’t change much and I just expose the port only to it from Oracle’s site. I rarely touch mine though.
Changing the port is security by obscurity, and it doesn’t take botnets much time to scan the entire IPv4 space on all ports. See, for example, the constantly updated list available on Shodan.
Disable password login and use certificates as you’ve suggested already, add fail2ban to block random drive-bys, and you’re off to the races.
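Concretely, that boils down to something like this (a sketch; paths are the Debian/Ubuntu defaults, tweak the retry/ban times to taste):

```
# /etc/ssh/sshd_config (excerpt): keys only, no passwords
PasswordAuthentication no
PubkeyAuthentication yes
PermitRootLogin prohibit-password

# /etc/fail2ban/jail.local: ban IPs that keep failing SSH auth
# (the m/h time suffixes need fail2ban 0.10+; older versions want plain seconds)
[sshd]
enabled  = true
maxretry = 5
findtime = 10m
bantime  = 1h
```

Restart sshd and fail2ban afterwards, and confirm your key login works in a second session before you close the first one.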
I just restrict SSH to an internal VPN IP on all my servers (ZeroTier). 100% impossible to even try logging into them unless you’ve managed to crack into my network first.
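In practice that can be as simple as binding sshd to the ZeroTier address (10.147.20.5 is just an example; use whatever IP ZeroTier assigned the host, and note sshd needs that interface up when it starts):

```
# /etc/ssh/sshd_config: only accept SSH on the ZeroTier interface
ListenAddress 10.147.20.5
```

or, equivalently, a firewall rule that only allows port 22 in on the zt* interface.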
+1 for VPS, the IONOS ones are $2/mo and have unlimited bandwidth at 400 Mbps. That’s basically the cost of electricity for a home server, with orders of magnitude better reliability.
They don’t pull a Proton and close off basic stuff like IMAP and SMTP as a way to force you onto their official apps.
I especially love the feature where you can bounce emails based on domains, keywords or TLDs. My spam folder is finally empty. IMHO bounce back spam is much better, as the spammers get a response that the address is invalid and hopefully stop wasting their limited computing resources on that address.
Zoho is not open source, but Proton is “fake” open source that is mostly used for marketing: they opened only the UI, which communicates over a proprietary protocol with a proprietary server - useless. They also reject or ignore any pull requests on GitHub.
I started with Mail Basic (10 euro yearly for 10 GB), but because I switched from “secondary email that forwards to Gmail” to “primary email that imports from Gmail”, I had to move to the more expensive plan.
I mean, that’s going to be a risk you take with any hosted service. I currently self-host my contacts and calendar, but I have no interest in hosting my own email again.
I don’t self host my email either. I got my registrar, DNS and email separate from each other so if any of them goes bad I can switch with minimum fuss.
But that makes it all the more important to be able to download all your mail from your provider.
Proton currently has two proprietary things you can use to download your mail: a “bridge” PC app that pretends to speak IMAP, and a download tool. The bridge will be discontinued after they launch their proprietary PC mail app, so that leaves just the proprietary download tool, which only exports to .eml format.
That’s a very broad question that depends a lot on your usage. My needs may be different from yours.
I’m currently using Migadu because:
Unlimited domains, mailboxes, accounts and aliases for a flat fee. I’m managing accounts for myself as well as family and extended family members and it comes out much cheaper this way than services that ask $5-10/account.
Very nice management interface with all the bells and whistles but with reasonable defaults and easy to use.
The company is based in Switzerland and the mail is hosted in the EU (France).
Standard email service with everything you’d expect (the regular protocols, spam protection, webmail, full compatibility with clients etc.)
They don’t pull a Proton and close off basic stuff like IMAP and SMTP as a way to force you onto their official apps.
The reason Proton cannot do IMAP/SMTP is that they cannot read your emails which is required for both. That’s a feature, not a bug.
PM works with any app as long as the app implements their custom protocol for which there are at least two FOSS implementations as a reference.
proton is a “fake” open source that is mostly used for marketing: they opened only the UI, which communicates with a proprietary protocol to a proprietary server - useless
While I’d also prefer their back-end to be OSS, it’s not nearly as critical as the clients.
As a user, it doesn’t make a difference. I’m paying for an opaque service either way.
All the interesting stuff (E2EE, zero-access storage) happens in the clients anyway. The BE is fairly uninteresting; it’s a mail server + zero-access encryption + Proton account handling. If you really wanted to build a mail service similar to Proton, you could build that yourself and probably would have to anyway.
I think the opposite, instead: the backend is the really interesting part, and the only way we could be sure that “they cannot read the emails” (they arrive in the clear, are saved with reversible encryption, and they have a key for it; if you use their services to commit crimes they will cooperate with law enforcement agencies like everyone else).
IMAP/SMTP could be toggled with a warning, if that’s really their concern. As of now I have the feeling it’s instead blocked to keep users inside (no IMAP = no easy migration to somewhere else) or to limit usage (no SMTP = no sending mass email).
The backend is the really interesting part, and the only way we could be sure that “they cannot read the emails”
While I’d still prefer it, OSS can’t really help with that because what’s really required here is remote attestation.
That is an unsolved problem to my knowledge; there is no way to know which software they’re actually running. Even if they published the source code, they could trivially apply a patch in their deployment that stores all incoming email somewhere and you’d be none the wiser.
Even if they published source code and could somehow prove to you that they’re running a version derived from it, you would still not be safe from surveillance, as one could simply MITM all connections. See e.g. notes.valdikss.org.ru/jabber.ru-mitm/.
That’s likely one of the reasons they do everything they can to make PGP accessible to every user.
IMAP/SMTP could be toggled with a warning, if that’s really their concern
It’s simply not how their service works. They’d have to build most of their service a second time, but unencrypted.
It’s like asking Signal to build in support for IRC; it makes no sense for them to do that, no malicious intent needed.
no IMAP = no easy migration to somewhere else
You have IMAP access via the bridge. That’s what it’s for.
Zoho and PM have two entirely different reasons for existence. If you don’t want E2EE (assuming the other sender is on PM), then by all means, use Zoho. And IMAP isn’t E2EE-compatible in the slightest; what they’re charging for is the decryption bridge that makes it work with an IMAP client. They had to come up with that, it’s not just a switch you flip at PM’s end that makes IMAP work.
Proton is in the process of removing their PC bridge in favor of a custom app. After they’re done you won’t be able to migrate your email away from their service anymore.
Which is ironic, when people are trying to flee from Google. Out of the fire and into the frying pan…
It’s in restricted beta currently, only available to a small category of users, and still lacking features. They say it will launch early 2024 but it looks more like mid-year to me (at best).
Point is, it won’t happen very soon, but it will happen.
For YouTube, it’s probably not possible to avoid its content, but you can try alternative front ends like Piped (and its wonderful Android client, LibreTube, if you’re on Android).
For Gmail, not sure if this works for you, but I set the vacation feature to reply to every email I receive, notifying the sender of my new address. I switched to Vivaldi Webmail (Proton doesn’t let you use 3rd-party clients without a subscription plan, btw; I’d switched to Vivaldi first, so not a major thing for me), but Skiff (paid) looks good, Kagi Search is planning an email service, Tutanota has an email service, and I guess you could self-host. While you transition, use a client that lets you have a unified inbox (K-9 works on Android) and just have both logged in.
The only thing I have done so far is use IMAP into Thunderbird… All of this is valuable. I want to use a self-hosted solution, but at the moment it’s all a surface-level thought. Thanks a lot for this comment!!
I’ll let folks with more security experience dive into your specific question, but another option is to host your website on something like GitHub Pages (using a static site generator like Jekyll) and point Cloudflare at it. That way you don’t need anything pointed at your local network, you get the uptime of GitHub, and you still benefit from your own domain name.
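If it helps, the wiring is only two pieces (blog.example.com and username are placeholders for your domain and GitHub account):

```
# DNS record at Cloudflare:
#   blog.example.com  CNAME  username.github.io
# plus a CNAME file in the repository root so GitHub Pages serves your domain:
echo "blog.example.com" > CNAME
```

GitHub Pages picks up the CNAME file and serves the site on your domain, and Cloudflare just sits in front of it.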
That’s what I’m doing with my own blog and it’s been great. Github provides the service for free but if they ever charge for it I’ll just start hosting it locally.
That’s what I’m doing! I used it to make a “blog” of all the things I had to learn to switch to Linux for my home drives and daily gaming rig. Complete with copy buttons on the code blocks so I can do a complete reformat in minutes!
Or take github out of the equation and directly use cloudflare pages. It has its own pros and cons, but for a simple static blog it’ll be more than enough, and takes out the CNAME hassle.
I could never get the AIO setup to work well for some reason. It was also a couple versions behind it seemed.
I…uh…know it’s not popular on the fedi, but I use the Nextcloud snap package and it’s been rock solid. It’s always up to date and they have a backup/export feature too.
People talk a lot of smack about snap, but I installed the Nextcloud snap 5 years ago to check out Nextcloud and see if I liked it. I did, and the snap was so easy that it stuck around for 5 years. I didn’t do anything except update the underlying OS. It is really well maintained.
I just migrated off of it to get a little more flexibility, but I have nothing but good things to say about it.
I couldn’t make things easy for myself when I migrated, because I wanted to use Postgres while the snap uses MySQL/MariaDB, and I wanted S3 storage instead of the filesystem.
In the end I just pulled down all the user files and exported the calendars and contacts manually, then imported them on the new instance.
There are some blog posts on migrating db types, but my install is very minimal and I just didn’t want the headache.
If you don’t want to change the database type, then you can just dump the db from the snap, backup the user file directory, then restore into the new database and rsync up all the files.
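If I remember the snap’s helper commands right, that simple path looks roughly like this (newhost and the target paths are placeholders; the data directory is the snap’s default):

```
# on the old (snap) instance: dump the DB and copy the data directory
sudo nextcloud.mysqldump > nextcloud-db.sql
sudo rsync -a /var/snap/nextcloud/common/nextcloud/data/ newhost:/srv/nextcloud/data/

# on the new instance: restore nextcloud-db.sql into the new database, fix file ownership,
# then (from the Nextcloud install directory) tell Nextcloud the files changed on disk:
sudo -u www-data php occ maintenance:data-fingerprint
```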
What do I use this for? Do I install it on my NAS or my gaming pc?
My best guess is this is a self hosted network storage for games and other computers run the games from there? Or do they download the game from there? Is it a way to store game saves? Does it have any use for emulators like yuzu?
Sorry for all the questions, I’m only asking because the software looks really interesting but I just can’t figure out its uses.
Seems like the intro clears some things up: gamevau.lt/docs/intro. It looks like you install the server component on your NAS/server etc. and store your game files/binaries/installers there. Then you can download client applications and pull from that location to install on your gaming PC or whatever.
It’ll work fine. A NAS is just a PC. Try Unraid if you want a user friendly UI. It costs money but it’s only a one off payment for a lifetime license, and they have a free trial.
+1 for unRAID. I did the same when I got tired of Netflix increasing prices while dropping content. Also got annoyed with my cable because it’s expensive and good content is rare.
Bought a 12th-gen i5 desktop on sale and 4 x 10 TB drives and installed unRAID on a USB key.
Easiest thing I’ve done in years and it’s 100x better than Netflix and 1000x better than cable.
That’s what I was thinking, but less… fire hazard? I’ve seen some of those that are just crazy. Idk, I mostly need a board that can handle it. Idk, just dreaming of a new project with spare stuff hanging around.
I would guess they’re a fire hazard because of the overclocking they do. They’re either a long term (heh) project and they’re immaculate, or they know they need to squeeze every bit of value and abuse the fuck out of those GPUs. I think you can tell if a rig is dangerous so you should be ok
Try MySQL instead of MariaDB; recent MySQL versions have some performance tweaks that aren’t present in MariaDB.
Also, tune your MySQL (or MariaDB) server. Make sure all tables use InnoDB. Enable the slow query log and analyze slow queries (there may be missing indices). If the same queries run repeatedly, increase the query cache size (MariaDB and older MySQL only; MySQL 8 removed the query cache). A sample config is below.
The easy approach is to run MySQLTuner after the MySQL or MariaDB server has been up for at least a week, and go through its suggestions.
There shouldn’t be a significant difference in performance between PostgreSQL and MySQL/MariaDB if both have been optimized. Out-of-the-box config isn’t ideal for a production system.
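For reference, the handful of settings that usually matter look something like this (illustrative values only; size the buffer pool to your RAM and let MySQLTuner refine the rest):

```
# /etc/mysql/conf.d/tuning.cnf (illustrative values)
[mysqld]
innodb_buffer_pool_size = 1G     # biggest single win; fit most of the working set in RAM
slow_query_log          = 1
slow_query_log_file     = /var/log/mysql/slow.log
long_query_time         = 1      # log anything slower than 1s, then hunt for missing indices
# query_cache_size      = 64M    # MariaDB / MySQL 5.7 only; removed in MySQL 8
```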
You totally can, but since it will be on all day with 4 HDDs, look into what wattage you want to live with. There are some small NUCs or Pi-based NAS builds with low wattage. There is OpenMediaVault or FreeNAS/TrueNAS software to install.
Note that there is a reliability drawback to spinning hard disks up and down repeatedly. Maybe unintuitively, HDDs that spin constantly can live much longer than those that spend 90% of their time spun down.
This might not be relevant if you use only SSDs, and might never affect you, but it should be mentioned.
You could totally turn it on as needed; WakeOnLan is good for that (example below). But typically when people run a NAS it is for streaming audio and video, file sync, and backups, and maybe Docker running other services, so the NAS is usually on 24/7 so it is available on demand. But it doesn’t have to be 100% uptime if you don’t want it to be.

For example, I have two OpenMediaVaults, one on a Pi and one on an old Iomega NAS. The Pi is always on with an attached drive, and serves Samba shares and DLNA/DAAP shares. It has Docker running Syncthing, a CUPS print server, Trilium Notes, and Home Assistant, so it makes sense for it to be on all day, especially because my wife’s system backs up to it daily and automatically. The converted Iomega NAS is mainly a backup machine since it is old and not as performant (it only has 100 Mbit network speed). So that gets turned on to do a bulk backup and not much else.
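For the turn-it-on-as-needed case, WoL really is just one command from another machine on the LAN (the MAC address is a placeholder, and WoL has to be enabled in the NIC/BIOS):

```
# wake the backup NAS, run the job, then shut it down again
wakeonlan 00:11:22:33:44:55
# ... wait for boot, run the bulk backup ...
ssh backup-nas 'sudo poweroff'
```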
Only complaints I have with Nextcloud are that it’s slow and updates suck over the web interface. But apart from that it has been reliable. I’m not running it through Docker. In fact, my installation is so old that the database tables still have an oc_ prefix.
You might want to try migrating your Nextcloud instance to Postgres instead of MySQL/MariaDB. Many people say they get a big performance boost. I’m going to try it myself next weekend to see if it’s true.
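For anyone else planning the same move, Nextcloud ships a converter; the invocation is roughly this (the user/host/db names are examples, and you need the PHP pgsql module plus an empty target database created first):

```
# run from the Nextcloud install directory; converts the existing install to PostgreSQL
sudo -u www-data php occ db:convert-type --all-apps pgsql nc_user 127.0.0.1 nextcloud_db
```

It prompts for the database password and updates config.php when it’s done.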
Mine is a snap install that started 3 years ago on VirtualBox and was ported over to Proxmox. It has never broken, updates automatically, and generally seems to work just fine.
It doesn’t load instantly, but it doesn’t drag by any means.