Intro
The inception of this journey happened when I wanted to put our decade-old PC to use. It was a Dell Inspiron 660s, a small-form-factor PC that was quite capable back in 2013. After spending some time on Reddit and watching a couple of videos on YouTube, I came across this interesting idea of a homelab. It basically means you host a server at home that locally runs the different services you want to use. There are a few clear benefits.
You have full control of your data, for maximum privacy.
- In the last year or so, Google and Samsung introduced a tool called Magic Eraser. It is very handy if you want to remove an unwanted object from your picture. It turns out that if you cover your face with your hands and then erase your hands using the tool, the AI will reconstruct the photo eerily close to the real deal. At that point, you ask yourself: is the claim “We do not train our models on your photos” really true? When your data never leaves your server, you are in charge of everything. This is truly powerful and the most important benefit of this setup.
No more running out of space issues.
- Remember that time you took a picture and got a warning that you were running low on space? 👀 iPhone users. That is going to be a thing of the past. Theoretically, you can have unlimited storage in this setup. Realistically, think of how many pictures and videos you can store in 1 TB of space.
The feel-good factor.
- Once everything is set up and you cancel your, spoiler alert, Google Photos subscription, you get this happy feeling that you did something cool, plus the bragging rights to talk about self-hosting.
However, there are also some points to consider before you begin this journey.
If you want your services up and running 24/7, then you have to leave your server on 24/7.
- Energy consumption and the cost of running the server are real considerations here. Later, I will also talk about how to build an efficient system.
You have to be comfortable with the shell environment.
- You could have a GUI, but that requires more resources.
Backup > Backup > Backup
- Being in control of your data also brings the responsibility of preserving that data in the long run. A disk failure can happen anytime, without warning, so make sure to back up your data regularly. This is still better than forgetting a recurring payment for Google One or iCloud and losing all those photos forever.
You have to invest your time and energy.
- With the advent of LLMs it definitely gets easier to find the correct information, but in the end you have to find the time to execute these things, fix bugs, pull the latest updates and do general maintenance of your infrastructure. IMHO, it is not a set-it-and-forget-it kind of project. You have to be involved. This will be your baby now. But I see this as a positive challenge.
As you saw, the cons outweigh the pros 4:3. I leave it to the reader to decide whether they want to make the transition or not. With that out of the way, let’s talk tech.
Coming up with a name
I named this server Rohini, after one of the stars (nakshatras) found in the texts of ancient Indian astronomy. Giving the server a name makes future references easier and helps in maintaining the infrastructure.
Choosing the OS
There were a couple of options here: Ubuntu Server, Oracle VirtualBox on top of another OS, ESXi, or Proxmox. I chose Proxmox. It provides kernel-level virtualization (KVM virtual machines and LXC containers), is community-driven, and is also used in enterprise environments. The documentation is really rich, and you can find what you are looking for fairly easily. I went into this with no prior experience with virtualization software but managed fairly well. That is the advantage of community-driven development: you get a lot of support.
Hardware
Here’s a list of what I am running Proxmox on.
- Intel i3-2120 @ 3.4 GHz
- 4 GB DDR3 RAM (single stick)
- 500 GB Crucial BX500 SSD
- 500 GB decade-old Seagate HDD
- Proprietary Dell mATX motherboard and a 120 W PSU
There’s a second RAM slot, but for some reason I never got it to work. As soon as I plugged a second stick into that slot, the system would not POST and the motherboard gave a beep-beep-beep error code.
CD/DVD Writer
Remember the good old days when we used to burn our favorite songs onto CDs? I had to resort to this to install Proxmox on the system: the BIOS is so old that I could not boot from a USB stick, it did not want to update, and it did not support UEFI for some stupid proprietary reason. Eventually, I got Proxmox up and running. The plan was simple: use the HDD for backups and the SSD for live storage.
Running services
All the services I have at the moment run in their own individual LXC containers. The first service I installed was:
Tailscale
Tailscale is a WireGuard-based VPN service that forms a mesh network between your devices, falling back to Tailscale’s relay servers only when needed. This was of utmost importance, since the server is hosted at my parents’ place in India while I live in Germany: I have to be able to connect to my server even when I am 6,000 km away. In this regard, Tailscale works like magic. The best part is that you do not have to forward any port on your router. This is nice because I am still a little skeptical about opening a port on my router to the public internet, since I lack the proper knowledge and experience.
Tailscale enables end-to-end encrypted, point-to-point connections between devices without routing traffic through a central VPN server, which reduces latency. A central coordination server handles authenticating the devices in the tailnet and exchanging their public keys; the relay servers only carry traffic when a direct connection cannot be established.
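As a rough sketch of what this looks like in practice (the hostname `rohini` is my machine name; yours will differ):

```shell
# On the server (inside its LXC, after installing Tailscale):
# log the machine into the tailnet. --ssh optionally enables SSH over
# Tailscale without exposing port 22 to the internet.
tailscale up --ssh

# On a client 6,000 km away: once both devices are in the same tailnet,
# the server is reachable via its MagicDNS name or its 100.x.y.z address.
ping rohini
ssh root@rohini
```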
Immich
Immich is a self-hosted, drop-in replacement for Google Photos, and the transition was pretty smooth. I got the service up and running using the Proxmox Helper Scripts, and I have used this method for the other services as well. Note: keep in mind that the helper-scripts project has seen a change of administration, and the outlook, at the moment, is that it is not what it used to be.
Essentially, Immich runs inside a Docker container, which in turn runs inside an LXC container. I know, this is like a Russian nesting doll, and I plan to fix it in the future. Getting the photos out of Google Photos was time-consuming but easy. There is a service called Google Takeout which you can use to download all the data you have given to Google; you won’t believe how much that can amount to. I was already on the Google One 100 GB plan, and after I requested the data it took Google some hours to generate the download links. Once you have the data, you can use Immich-go, another open-source tool, to manage the transfer from Google Photos to Immich. The setup was self-explanatory and worked like a charm. It took me 4.5 hours on a 250 Mbps up/down connection to upload the data to the SSD.
Note: I could have done this much better by copying the data to an external drive, offloading it to the SSD on the server, and then telling Immich to use that library as an external source. But I learnt this later. So, moving on.
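For reference, the Immich-go invocation looks roughly like this. Treat the server URL, the API key, and even the flag names as placeholders: the CLI has changed between releases, so check `immich-go --help` for your version.

```shell
# Hedged sketch: upload Google Takeout archives to an Immich server.
# Server address and API key below are placeholders for my setup.
immich-go -server=http://rohini:2283 -key=YOUR_IMMICH_API_KEY \
  upload -google-photos takeout-*.zip
```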
Today, Immich is working really great. My photos now get backed up to the server directly from the mobile client, and I take periodic backups to ensure I do not lose anything in case of a failure. The experience was so great that I got my father and my wife on this service as well.
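For those periodic backups, Proxmox’s built-in vzdump does the job. A minimal sketch, assuming the Immich container has ID 101 and the HDD is configured as a storage named backup-hdd (both are placeholders for my setup):

```shell
# One-off snapshot backup of LXC 101 to the backup storage,
# compressed with zstd.
vzdump 101 --storage backup-hdd --mode snapshot --compress zstd
```

Recurring jobs can also be scheduled under Datacenter → Backup in the Proxmox web UI.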
Nextcloud
Nextcloud, similar to Immich, is a self-hosted replacement for Google Drive. Again, I used the Proxmox Helper Scripts to install the service. I don’t use it as much as I would like to, since I have not yet imported all of my data from Google Drive to Nextcloud. Something for the future.
Homepage
Homepage is a simple dashboard that gathers all your services in one place. It is highly configurable and can also talk to the other services via API calls.
Pihole
Pihole is my ad-blocker and DNS resolver of choice. This is crucial for the infrastructure, and who does not like network-wide ad-blocking? The real benefit comes when you configure Pihole as the global DNS resolver of your tailnet: whenever you are connected to the tailnet, you can block those pesky ads wherever you are.
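The tailnet side of this is mostly configured in the Tailscale admin console rather than on the command line: you add the Pihole’s Tailscale IP as a global nameserver there and enable “Override local DNS”. A sketch of the remaining client-side step:

```shell
# Make sure the client accepts the DNS settings pushed by the tailnet.
# --accept-dns=true is the default on most platforms; shown for clarity.
tailscale up --accept-dns=true
```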
Caddy
Caddy is a great reverse proxy written in Go that automatically obtains and renews the TLS certificates for your sites. It is known for its ease of use and variety of plugins. It became a crucial service when I wanted to use my purchased domain for reverse-proxying the services and to obtain TLS certificates with an ACME DNS challenge via the Hetzner API. This could be a blog topic of its own. Long story short, after setting this up, I could reach, e.g., Nextcloud at cloud.vaibhavnath.in.
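A minimal sketch of what that can look like, assuming Caddy was built with the Hetzner DNS plugin (github.com/caddy-dns/hetzner) and with the internal IP and port of the Nextcloud container as placeholders:

```shell
# Write a minimal Caddyfile: TLS via ACME DNS challenge against Hetzner,
# then reverse-proxy the domain to the Nextcloud LXC.
cat > /etc/caddy/Caddyfile <<'EOF'
cloud.vaibhavnath.in {
    tls {
        dns hetzner {env.HETZNER_API_TOKEN}
    }
    reverse_proxy 192.168.1.50:8080
}
EOF
systemctl reload caddy
```

The DNS challenge means Caddy proves domain ownership through Hetzner’s DNS API instead of needing port 80/443 reachable from the internet, which fits nicely with the no-port-forwarding setup above.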
Outro
This was just an introduction to an even bigger project. I will continue this series where I extend my homelab with two more nodes and a NAS. I will also write about automatic backups of entire LXC containers using BorgBackup, and about getting a Hetzner Storage Box to achieve a true 3-2-1 backup policy. So stay tuned.