kieppie (New Member, joined Oct 3, 2021)

Hi folks,
(misplaced my earlier login, so starting anew)

Been using various flavours of Linux for ages, primarily server-side, but coming back to the desktop for a primary workhorse after a significant hiatus.

Rolling with Fedora (34) this time, as I find it better suited to BAU work than to 'fun', which is what I need.

Something I've noticed is that there doesn't seem to be a good backup mechanism baked into the OS, which is quite a departure from the W10 mechanisms provided through 'File History' & OneDrive. Though I'm not a big fan of pushing sensitive data to a cloud I don't own, manage or control, I've experienced first-hand how this has pulled critical projects from the fire, and the way it facilitates a near-seamless experience across desktops is not trivial.

My setups are:
  • Desktops - Fedora, Ubuntu, etc; each for different purposes; 90% of endpoints are W10p64 desktops, because that's what users/clients/family use
  • Servers - baremetal, VM/container hosts & cloud
  • IoT/embedded, mobiles
  • NAS: FreeNAS/TrueNAS, Synology (paired redundancy) & offsite backup; running ZFS & btrfs RAID-5s, making daily snapshots so that I can roll back to an arbitrary day in the last month/year. These hosts also pull my online data via rclone.
  • Simple but segmented network with VPNs; no VLANs or AD/DS
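
For context, the daily snapshot-plus-rclone routine on the NAS looks roughly like this. This is only a sketch, assuming btrfs (ZFS is analogous); the pool path, remote name and retention window are all assumptions:

```shell
#!/bin/sh
# Sketch of a daily NAS job: read-only btrfs snapshot + rclone pull.
# Paths and the remote name ("gdrive") are assumptions.
snapshot_and_pull() {
    pool=/srv/tank                 # btrfs mount point (assumed)
    snapdir="$pool/.snapshots"
    stamp=$(date +%Y-%m-%d)

    # Read-only snapshot of the home subvolume, named by date
    btrfs subvolume snapshot -r "$pool/home" "$snapdir/home-$stamp"

    # Prune snapshots older than ~30 days (mtime-based, so approximate)
    find "$snapdir" -maxdepth 1 -name 'home-*' -mtime +30 \
        -exec btrfs subvolume delete {} \;

    # Pull cloud data down so it lands inside the snapshot cycle
    rclone sync gdrive: "$pool/cloud-mirror"
}
# invoked from /etc/cron.daily or a systemd timer, e.g.:
# snapshot_and_pull
```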
What is really valuable is the way that W10 natively integrates this: users' data backs up to the NAS on the smb://home share, and they can simply right-click on a file/dir of their choice & "Recover a previous version" - which corresponds to the daily snapshots - with little or no help or intervention on my part.
Because these are server-side generated snapshots, the same snapshots are available to my various POSIX machines over the protocol of choice - NFS, iSCSI, SSHFS, etc - but this functionality is not native to the OS AFAICT.
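
(For anyone wanting to replicate the 'Previous Versions' side of this: Samba does it with the shadow_copy2 VFS module. A minimal sketch for ZFS-style snapshot directories - the share path and snapshot name format are assumptions and must match what the NAS actually creates:)

```ini
# smb.conf fragment (sketch; path & snapshot naming are assumptions)
[home]
    path = /srv/tank/home
    vfs objects = shadow_copy2
    # ZFS exposes snapshots under a hidden .zfs/snapshot directory
    shadow:snapdir = .zfs/snapshot
    # must match the snapshot naming scheme, e.g. "daily-2021-10-03"
    shadow:format = daily-%Y-%m-%d
    shadow:sort = desc
```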

Historically I've managed backups via rsync/rclone - typically pulling data (starting with /home & /etc) so that I don't have to install or configure an agent on every client/endpoint - but this is inelegant, does not make effective use of the resources at my disposal, and introduces complexity that could otherwise be avoided.
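
For reference, the pull-style rsync I'm describing is roughly the following. A sketch only - the hostnames, paths and the sudo-over-SSH arrangement are assumptions; each client needs nothing beyond an SSH key:

```shell
#!/bin/sh
# Sketch of an agentless pull backup over SSH (names are assumptions).
pull_backup() {
    client="$1"                       # e.g. "alice-desktop"
    dest="/backups/$client"
    mkdir -p "$dest"
    # -aHAX preserves hard links, ACLs & xattrs; --delete mirrors removals
    rsync -aHAX --delete \
        --rsync-path="sudo rsync" \
        "backup@$client:/home/" "$dest/home/"
    rsync -aHAX --delete \
        --rsync-path="sudo rsync" \
        "backup@$client:/etc/"  "$dest/etc/"
}
# e.g.: for c in alice-desktop bob-laptop; do pull_backup "$c"; done
```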

This is my ask: what is a more elegant & efficient backup solution?

I have a few considerations I'm taking into account:
  • needs to be Open Source & (near-)native: in the stable repos and/or in the Snap/Flatpak stores.
  • Support for FS-based snapshots (btrfs, ZFS, LVM) as well as syncing that (incremental) data to the network (if/when my disk dies, my backups die with it unless archived elsewhere)
  • needs to be active/current - i.e. the git repo shows development/maintenance activity in the last ~year, not stale
  • needs a usable/functional GUI - desktop, web or TUI all acceptable. I've set this as an arbitrary bar for where I invest time & effort, as I've found it to be a fair indicator of the quality of a user-facing app stack; especially if/when I don't want to do everything for everyone.
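
On the 'backups die with the disk' point: btrfs send/receive (ZFS has the analogous zfs send) can ship snapshots incrementally to another host. A rough sketch - hostname, paths and snapshot names are assumptions:

```shell
#!/bin/sh
# Sketch: incremental snapshot replication to a remote btrfs host.
# Hostname, paths and snapshot names are assumptions.
replicate_snapshot() {
    prev="$1"   # e.g. /srv/tank/.snapshots/home-2021-10-02
    new="$2"    # e.g. /srv/tank/.snapshots/home-2021-10-03
    # Send only the delta between two read-only snapshots
    btrfs send -p "$prev" "$new" | \
        ssh backup@offsite-nas btrfs receive /srv/backup/.snapshots
}
# First run has no parent:
#   btrfs send "$new" | ssh backup@offsite-nas btrfs receive ...
```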
Some of the solutions I've found may be OK fits, but have some shortcomings:
  • rclone (baseline reference)
    • preferred over rsnapshot & rdiff-backup
  • Duplicati (solid choice, but no FS snapshots)
  • Snapper (fair candidate)
  • Back In Time (stale?)
  • TimeShift (doesn't seem to support networking)
  • DejaDup (not something I would consider 'elegant')
  • Bacula ('open core', which just sits weird with me...)
  • Kopia (intriguing...)
  • BackupPC (no FS snapshotting AFAICT)
I know this is a tall order, but if I'm going to invest time & effort into adopting a strategy or solution more widely, I'd prefer it to meet my needs, so that ongoing deployment & use is uniform, regular & predictable, without having to contend with numerous exceptions & edge-cases.

I'd appreciate folks' insights & comments on this, please.
 


