Share your backup scripts / software ideas and strategies



I use Timeshift and Pika-backup to back up everything to an external SSD.
 
I use Timeshift and Pika-backup to back up everything to an external SSD.
I used Timeshift before. If I remember correctly, it stores the backups somewhere under /usr/share or thereabouts, which is awkward when you have separate root and home partitions. Also, if you try to rsync /, I remember a loop appearing. Is that still the case?
 
I use Timeshift on my desktop as a go-back-in-time solution in case an update breaks something. I don't do data backup on my desktop, since I save all my files to my NFS share, which I have mounted on my desktop system. I use borgbackup to back up all the files on my NFS server so that if I need to restore a file I am able to. Not perfect, but better than no backups.
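For reference, a minimal borgbackup invocation along these lines might look like the following; the repository path, archive naming, and the /srv/nfs export path are all assumptions, not the poster's actual setup:

```shell
set -eu
# Hypothetical sketch of a nightly borg run; REPO and /srv/nfs are
# placeholders, not the poster's real layout.
REPO=/mnt/backup/borg-repo
ARCHIVE="nfs-$(date +%Y-%m-%d)"

# Build the command first so it can be reviewed before running it for
# real; an actual setup needs `borg init --encryption=repokey $REPO` once.
cmd="borg create --stats --compression zstd $REPO::$ARCHIVE /srv/nfs"
echo "$cmd"

# Restoring a single file later would be along the lines of:
#   borg extract $REPO::$ARCHIVE path/to/file
```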
 
I'd use Clonezilla and then burn the compressed image onto DVD right after it's done.
 
I'd use Clonezilla and then burn the compressed image onto DVD right after it's done.
Does that mean that you don't have backups? ;) Also, your plan has a flaw: most hard drives / SSDs hold more than 4.7 GB.
 
The Clonezilla image IS the backup. If the compressed image is too large for a DVD-R, then use Blu-ray or a double-layer disc, or simply put it on an external HDD.

It's the concept of getting the backup onto a medium that cannot be changed or corrupted as easily as some USB drives seem to be.
 
What @forester uses are OS backups using Clonezilla, but I think @tinfoil-hat is asking: don't you do data/file backups?
 
I only do incremental backups (using Clonezilla) when I am, say, getting a Debian Sid installation where I want it to be but worry about corrupting the system when I install relatively untested software. Timeshift may be handier, but I do not use it.

Am retired, but when on the job we had tape backups.
 
What @forester uses are OS backups using Clonezilla, but I think @tinfoil-hat is asking: don't you do data/file backups?
He wrote "I'd use clonezilla..." so I was assuming he didn't do backups. I also thought he was suggesting I burn my data onto a single DVD. But let's not get hung up on the little things.
 
With the main desktop PC, 'rsync' is used to incrementally back up the files and directories in the data partition to three external 2.5in USB drives (grandfather, father, son). An individually switched USB 3.0 hub powers up the appropriate drive at backup time.

I've less concern about the OS partition, it's just saved occasionally as a 'rescuezilla' compressed image file. Rescuezilla is booted from a USB stick and can roll back a previously saved OS image in minutes.

The same methods are adopted with the Samba media server. My preference is to acquire as many 2.5in backup drives as necessary for switched triple backups. It takes a long time to rip a CD and DVD collection, so I only want to do it once.
 
With the main desktop PC, 'rsync' is used to incrementally back up the files and directories in the data partition to three external 2.5in USB drives (grandfather, father, son). An individually switched USB 3.0 hub powers up the appropriate drive at backup time.
I am curious. How exactly did you do this?
 
"Little things?" Backup is a little thing?
You asked. Let me ask -- do you have a clue about the CLI?
 
"Little things?" Backup is a little thing?
You asked. Let me ask -- do you have a clue about the CLI?
What I was trying to say is not to fight over details when we are misunderstanding each other. I was trying to avoid a bad mood. No reason to get personal. Do I have a clue about the CLI? Yes, indeed! I worked as a Linux admin at a major cancer research facility. You jump very quickly to unfounded conclusions.
 
After about 30 years with Linux and a decade of digital photography, /home is about 1.4 TB on a 2 TB SSD. I keep everything critical to backup in /home. Even /usr/local is a bind mount of /home/local. My digiKam SQL database is on fast NVMe, but each day it is tarred up into a folder on /home. All I need to do is make sure /home is safe.
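The /usr/local bind mount mentioned above is typically done with a one-line /etc/fstab entry; a sketch (the source path matches the post, the rest is the standard fstab form):

```
# /etc/fstab - bind /home/local onto /usr/local so it rides along
# with the /home backups
/home/local  /usr/local  none  bind  0  0
```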

I do daily decremental backups of /home to an onboard 3 TB hard drive (which is only mounted and spinning when doing backups):

Code:
nice ionice -c 3 rsync -a  --one-file-system --no-compress --delete --force --delete-excluded --exclude-from /usr/local/etc/rsync-exclude.conf -b -HAX --sparse --stats --backup-dir=/mnt/backup/home/past/$ymdhms /home/ /mnt/backup/home/current/

This is fast and simple, and it gives me the ability to find old copies of files in /mnt/backup/home/past/. There is a slight risk of atomicity issues; I mainly deal with that by timing the rsyncs to likely down times.

Once a week I run a similar backup to one of four external USB 4 TB drives. The main difference is that these external backups are on encrypted filesystems. I have to be careful about how fast I feed the USB drives because they are SMR hard drives. To avoid them slowing to a crawl, I throttle the rsync with --bwlimit=50M. These external drives get stored away from the house. They're about the size of a pack of cards, and go in a bubble-wrap postal bag, then a dry-sack, then a waterproof box.

I do similar for the OS partitions, but with no --backup-dir folder, just an rsync to a separate partition. If there has been a kernel update, the OS backup automatically stops running for a couple of days - just in case there are any driver issues. The OS backup is a nice-to-have; I can easily get back into action by installing the OS from scratch.
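The post doesn't show how the "pause after a kernel update" check works; one plausible sketch is to look at the age of the kernel images (the 2-day window and the /boot vmlinuz test are assumptions):

```shell
set -eu
# Succeeds if any kernel image in the given directory changed within
# the last 2 days (an assumed way to detect a recent kernel update).
recent_kernel() {
    [ -n "$(find "$1" -maxdepth 1 -name 'vmlinuz*' -mtime -2 2>/dev/null)" ]
}

if recent_kernel /boot; then
    echo "kernel updated recently - skipping OS backup for now"
else
    echo "OS backup would run here (rsync of the root partition)"
fi
```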
 
I am curious. How exactly did you do this?
I use one of the commands below as needed with Xubuntu. I need to wait for the USB disk to automount before executing, and ensure the final forward slashes are present.

rsync -rtxl --delete --progress /media/<user>/SSD-P2/ /media/<user>/USB-BACKUP1/
rsync -rtxl --delete --progress /media/<user>/SSD-P2/ /media/<user>/USB-BACKUP2/
rsync -rtxl --delete --progress /media/<user>/SSD-P2/ /media/<user>/USB-BACKUP3/

If I remember correctly, I also set ownership and permissions once in the media directory for the USB drives, such as:
chown -Rf <user>:<user> /media/<user>
chmod -Rf 755 /media/<user>
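The "wait for the automount" step above can be turned into a guard so the copy never lands on the empty mount-point directory; the helper name and usage are illustrative, not the poster's actual script:

```shell
set -eu
# Refuse to run the backup until the USB drive is actually mounted at
# the destination; mountpoint(1) fails if nothing is mounted there.
backup_to() {
    src=$1; dest=$2
    if mountpoint -q "$dest"; then
        rsync -rtxl --delete --progress "$src/" "$dest/"
    else
        echo "skipping: $dest is not mounted" >&2
        return 1
    fi
}

# Example usage (paths follow the commands above):
#   backup_to /media/$USER/SSD-P2 /media/$USER/USB-BACKUP1
```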
 
For me it's Timeshift, set to take one snapshot daily, and Foxclone to create an image of my drive, which includes everything and is stored on an external HDD.


Here's an example... On my spare 500GB SSD I have Mint Cinnamon 20.3 and wanted to try Cinnamon 21.1... so I created an image with Foxclone... then installed 21.1, which was a shoddy ISO with many problems.


So I just put the 20.3 image back on the SSD with nothing lost... then I downloaded 21.1 from a good link and installed it, and it runs just fine... Foxclone also has a verifying tool, so you know the image is good.
 
... fine...Foxclone also has a verifying tool...so you know the image is good.
Verification is not part of my daily backup routine because, for 1.4 TB, it takes too long. Instead, I occasionally run rsync --dry-run --verbose --checksum to see if anything unexpected has changed. Comparing the /home SSD against the internal backup drive that way takes about 5.5 hours.

With external USB drives it's even slower. With the external drives I sometimes do a bit of sampling. I'll check something old, important and static, such as a year of photos and see if they are still the same.

I have considered switching to btrfs for increased chances of bit-rot detection, but I still favour ext4 because it seems to be more robustly recoverable in the event of bad things happening.

In the old days when I looked after a few systems, I always did a verify pass on the tapes, but 5.5 hours of verification each day on the desktop is a bit much.
 
I have a 500GB SSD running Mint Cinnamon 21.1... used space on the drive is 157 GB.

Foxclone took 24 mins to create an image and 18 mins to verify it... times have changed.
 
