Over the last couple of years I've been making an effort to move to as many open source solutions as I can. I like the idea of owning all of my data and storing it on my own machines.
I've stopped storing all my photos on iCloud and my files on Dropbox, and instead I'm storing them all on a Synology NAS that sits in my office. I've moved from Notion to an open source alternative called Affine. I'm using Home Assistant to manage my smart home devices and automations instead of something like Google Home.
The list could go on for a while, and maybe it will in another post. But I wanted to write this post to help anyone who's interested in self-hosting get started.
What Do You Need?
All you need to get started is a computer and an internet connection!
You can download open source software onto your computer and use it right there. However, this generally isn't the most practical way of hosting applications, because you're only able to access them on that machine. And if you make a service accessible network-wide, you need to make sure that machine is always up and running.
I would recommend getting a mini Linux PC (any small, low-power machine will do) and hosting all your applications on that. This would be a machine whose only purpose is running your self-hosted applications.
Setting Up Your Homelab
Heads up! Much of this process will be done in the terminal. None of the commands are super complex, but consider this fair warning before you continue.
Once you have a PC that you want to run your homelab on, you can get started setting things up. The first thing you'll need to do is install Docker. Docker is an application that lets you create isolated containers that your applications can run in. This is where pretty much all of your self-hosted applications will run.
If you're looking for a more in-depth explanation of Docker, NetworkChuck does a great job.
Installing Your First Application
Once Docker is installed you can install your first application. In this example we're going to install AdGuard, but most applications will follow the same steps.
The first thing you'll need to do is create the directory (folder) where the application will live. I like to create a directory for all my applications and then create a directory for each individual application inside that.
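Assuming a top-level folder called apps in your home directory (the names here are just my convention), that looks like this:

```shell
# One parent directory for every app, plus a
# subdirectory for the app we're installing.
mkdir -p ~/apps/adguard

# Move into it; the rest of the setup happens from here.
cd ~/apps/adguard
```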
Next, you'll need to create a docker-compose.yml for the application you're installing. You can generally find an example of this on the application's website or GitHub repository.
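For AdGuard Home, a minimal docker-compose.yml looks something like this (a sketch based on the example in the project's repository; the image name, ports, and volume paths will differ for other applications):

```yaml
services:
  adguardhome:
    image: adguard/adguardhome
    container_name: adguardhome
    restart: unless-stopped
    ports:
      - "53:53/tcp"     # DNS
      - "53:53/udp"     # DNS
      - "3000:3000/tcp" # web UI and first-run setup wizard
    volumes:
      - ./work:/opt/adguardhome/work
      - ./conf:/opt/adguardhome/conf
```

The restart: unless-stopped line tells Docker to bring the container back up automatically after a reboot, which is exactly what you want for a machine that runs unattended.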
Then you're ready to create the container. If you're not logged in as the root user, you may need to run the command using sudo.
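From inside the application's directory:

```shell
# Pull the image (on the first run) and start the container
# in the background.
sudo docker compose up -d
```

The -d flag runs the container detached, so it keeps running after you close the terminal.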
You should now be able to view your application's UI in your browser by entering your device's local IP address and adding the port specified in the docker-compose.yml. In my case, I can do so by visiting 192.168.86.57:3000.
External Access
If you would like to be able to access the application outside of your home network, you will need to give it its own domain. The easiest way to do that is by setting up Cloudflare Tunnels. I might do a separate post about setting up Cloudflare Tunnels at some point, but in the meantime, here's another video from NetworkChuck.
I would just recommend doing one thing differently from the video above. Instead of running the Docker command that Cloudflare provides to start your tunnel, I would add a service to the docker-compose.yml file we created earlier. This service will be in charge of starting the tunnel.
I prefer this method because we can set it up so that it always restarts anytime the machine gets restarted for any reason.
We just need to grab the value after --token in the command that Cloudflare gives us.
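The command Cloudflare provides will look something like this, with <YOUR_TUNNEL_TOKEN> standing in for the long token string. Copy everything after --token:

```shell
docker run cloudflare/cloudflared:latest tunnel --no-autoupdate run --token <YOUR_TUNNEL_TOKEN>
```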
And then we can update the docker-compose.yml so it includes our new tunnel.
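Assuming the AdGuard Home setup from earlier (abbreviated below), the tunnel can be added as a second service. cloudflared picks the token up from the TUNNEL_TOKEN environment variable, so paste your token in place of <YOUR_TUNNEL_TOKEN>:

```yaml
services:
  adguardhome:
    # ...unchanged from before...
  cloudflared:
    image: cloudflare/cloudflared:latest
    container_name: cloudflared
    restart: always # come back up whenever the machine restarts
    command: tunnel --no-autoupdate run
    environment:
      - TUNNEL_TOKEN=<YOUR_TUNNEL_TOKEN>
```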
Then we can run docker compose up -d again to pick up the new service and start the tunnel.
Once you finish setting up the Cloudflare tunnel you should be able to access your self-hosted application from anywhere!