Pull Backup with Restic to a QNAP NAS S3 store
Harald Hoyer · November 06, 2023 · #restic #NixOS #backup

Although restic is a really nice backup solution, it lacks the ability to pull the files to back up from a remote server the way rsync can over ssh. Pulling keeps every secret off the remote machine, so an attacker who breaks into it gains access to nothing.
This blog post shows some workarounds and sends the backup to a QNAP NAS S3 store over an encrypted TLS connection.
To pull the files, rsync is used. On the remote server, a user backup is created with the following entry in .ssh/authorized_keys:
restrict,command="/run/wrappers/bin/rrsync -ro /" ssh-rsa AAAA[…]<a public ssh-key>
Replace /run/wrappers/bin/rrsync with the path to an s-bit wrapper, executable only by the user backup, which sets the capability cap_dac_read_search=+ep and finally execs rrsync. When called with -ro, rrsync only allows files to be read.
This enables the trusted internal machine to rsync all files from the remote machine. The local mirror can now be backed up with restic from the internal machine. No secret on the remote machine gives an attacker access to anything.
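To convince yourself that the remote side really is read-only, a quick manual test from the internal machine helps. This is only a sketch: remote.example.org and the mirror path are placeholders for your own host and directory.

# remote.example.org and the paths are placeholders
# pulling a directory into the local mirror should succeed
rsync -e ssh -ax backup@remote.example.org:/etc/ /home/backup/mirror/etc/
# pushing anything back should be rejected by rrsync -ro
rsync -e ssh -ax /tmp/testfile backup@remote.example.org:/tmp/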
If you don’t have the space for a local mirror, you may use unpfs, which uses the 9p filesystem to mount the remote server. With ssh port forwarding (again combined with a capability wrapper and restrict), this can be done over an encrypted channel. The mount point can then be backed up with restic as usual. This is much slower than rsync over ssh, but it might be the only option (although disk space is cheap nowadays).
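As a rough sketch of that variant, assume unpfs listens on a local port on the remote machine and is reached through an ssh tunnel; the host name, port, and mount point are placeholders, and the exact unpfs invocation may differ for your version:

# on the remote machine (started via the capability wrapper), something like:
#   unpfs 'tcp!127.0.0.1!5640' /
# on the internal machine, forward the port and mount the export
# (host and port are placeholders):
ssh -N -L 5640:127.0.0.1:5640 backup@remote.example.org &
mount -t 9p -o trans=tcp,port=5640,version=9p2000.L 127.0.0.1 /mnt/remote
# back up /mnt/remote with restic as usual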
The rrsync solution for NixOS looks like this for the remote machine:
{ pkgs, ... }: {
  # …
  users.users.backup = {
    shell = pkgs.bash;
    isNormalUser = true;
    openssh.authorizedKeys.keys = [
      "restrict,command=\"/run/wrappers/bin/rrsync -ro /\" ssh-rsa AAAAB[…]"
    ];
  };

  security.wrappers.rrsync = {
    source = "${pkgs.rrsync.out}/bin/rrsync";
    owner = "backup";
    group = "users";
    permissions = "u=rwx,g=,o=";
    capabilities = "cap_dac_read_search=+ep";
  };
  # …
}
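After a rebuild, a quick sanity check on the remote machine shows whether the wrapper carries the capability and is usable only by the backup user (the exact getcap output format depends on your libcap version):

getcap /run/wrappers/bin/rrsync   # expect the output to mention cap_dac_read_search
ls -l /run/wrappers/bin/rrsync    # expect it owned by backup and executable by backup only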
and the restic solution for the local machine:
{ pkgs, ... }: {
  # …
  services.restic.backups.xxx = {
    repository = "s3:xxxxxxx.myqnapcloud.com:8081/backup";
    environmentFile = "/var/lib/secrets/backup-s3";
    passwordFile = "/var/lib/secrets/backup-pw";

    timerConfig = {
      OnCalendar = "daily";
      FixedRandomDelay = true;
      RandomizedDelaySec = "1h";
      Persistent = true;
    };

    paths = [ "/home/backup/xxx" ];

    pruneOpts = [
      "-g host,paths"
      "--keep-daily 7"
      "--keep-weekly 4"
      "--keep-monthly 3"
      "--keep-yearly 1"
    ];

    backupPrepareCommand = ''
      HOME=/root ${pkgs.rsync}/bin/rsync -e "${pkgs.openssh}/bin/ssh" --no-specials --no-devices --numeric-ids --delete-after --partial -axz [email protected]:/{etc,var,home,root} /home/backup/xxx
    '';
  };
  # …
}
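The two secret files are not shown here. As a hedged example, the environment file would carry the S3 credentials restic reads from the environment, and the password file the repository password; all values below are placeholders:

# /var/lib/secrets/backup-s3 (placeholder values)
AWS_ACCESS_KEY_ID=<QuObjects access key>
AWS_SECRET_ACCESS_KEY=<QuObjects secret key>

# /var/lib/secrets/backup-pw (placeholder)
some-long-random-repository-password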
For the above rsync, I was lazy and used /root/.ssh/config to specify the ssh configuration for the remote machine, such as the ssh key.
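A minimal sketch of such an entry, with the host name and key path as placeholders:

Host <remote machine>
    User backup
    IdentityFile /root/.ssh/backup_key
    IdentitiesOnly yes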
As you can see, I am using my QNAP NAS as an S3 store. This is possible for free with the QNAP QuObjects app. You can get a Let's Encrypt certificate for the QNAP, enable virtual web servers on it, and run the QuObjects server under the external xxxxxxx.myqnapcloud.com name.
That way, the backup channel to the QNAP is encrypted, even though it all happens inside your trusted home zone.
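To check that snapshots actually arrive on the NAS, you can query the repository by hand from the internal machine; this sketch reuses the secret files from the configuration above:

# load the S3 credentials, then list the snapshots in the repository
set -a; . /var/lib/secrets/backup-s3; set +a
restic -r s3:xxxxxxx.myqnapcloud.com:8081/backup --password-file /var/lib/secrets/backup-pw snapshots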