Command to archive a lot of files quickly and securely

postcd (Guest)
What command do I need to use to create one file out of around 500,000 files totaling around 50 GB of data, in such a way that no one without the key or password can extract its contents?

The command must be very fast because I want to do these backups daily. I don't care if it's uncompressed; it just must be fast, resource-light, and secure.
 


I would use a combination of tar and gpg.
Code:
tar cvf /path/to/tar/file/filename.tar /path/to/files/* && gpg -c /path/to/tar/file/filename.tar

There are two problems though. One is the lack of compression. Two is that tar is single-threaded. If you wish to compress (you said it was not needed, but...), try using pigz, a multi-threaded gzip.
http://www.zlib.net/pigz/
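For example, tar can write to stdout and pipe through pigz so the compression runs on all cores (a sketch, reusing the hypothetical paths from above and assuming pigz is installed):
Code:
tar cf - /path/to/files/ | pigz > /path/to/tar/file/filename.tar.gz && gpg -c /path/to/tar/file/filename.tar.gz
The output is ordinary gzip format, so it can be decompressed later with pigz or plain gzip.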

You may also want to look into the parallel command, which (tries to) run commands with a certain number of jobs, i.e. attempts to run them in parallel:
Code:
parallel -j 3 -- "tar cvf /path/to/tar/file/filename.tar /path/to/files/*" && gpg -c /path/to/tar/file/filename.tar
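Note that wrapping a single tar invocation in parallel like that still runs just the one job. One way parallel could genuinely spread the work is to make one archive per top-level directory (a sketch only, assuming the 500,000 files are split across subdirectories, and it gives up the single-file requirement):
Code:
# hypothetical layout: /path/to/files/<subdir>/... ; creates one encrypted archive per subdir
find /path/to/files -mindepth 1 -maxdepth 1 -type d | parallel -j 3 'tar cf /path/to/tar/file/{/}.tar {} && gpg -c /path/to/tar/file/{/}.tar'
Here {} is the directory path and {/} its basename, which are standard GNU parallel placeholders.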
 
What does that gpg part of your command do?
I somehow generated some key (at least I hope so), but I don't know where that key is; it only shows up via the "gpg --list-keys" command.

Your command does things verbosely, and I did not find out how to make it silent.
 
You need to add "> /dev/null 2>&1" at the end of your command.
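As for the gpg part: gpg -c (short for --symmetric) encrypts the file with a passphrase only, so the key pair you see in gpg --list-keys is not actually used for it. A minimal round trip, reusing the filename from the earlier example, looks roughly like this:
Code:
gpg -c filename.tar                        # asks for a passphrase, writes filename.tar.gpg
gpg -o filename.tar -d filename.tar.gpg    # decrypts with the same passphrase
tar xf filename.tar                        # then unpack as usual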
 
I have:
time tar cvf /backup/b.tar /backup/incremental && gpg -c /backup/b.tar >/dev/null 2>&1

But it returns loads of output
 
Code:
"tar c /backup/incremental /backup/b.tar && gpg -c /backup/b.tar" >/dev/null 2>&1
the time command will always generate output. gpg always has to ask for a password. All encryption systems require some kind of output.
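If the goal is a quiet daily cron job, one possible approach (a sketch, assuming GnuPG 2.1 or newer and that you are willing to keep the passphrase in a root-only file, which then becomes the thing to protect) is to drop the v flag from tar and run gpg in batch mode:
Code:
# /root/.backup-pass is a hypothetical passphrase file, readable only by root
tar cf /backup/b.tar /backup/incremental && gpg --batch --yes --pinentry-mode loopback --passphrase-file /root/.backup-pass -c /backup/b.tar >/dev/null 2>&1
You would probably also want to delete the unencrypted /backup/b.tar afterwards.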
 
