Hello,
My hosting control panel can back up hosting files in these ways:
uncompressed
compressed (tar.gz)
or incremental non-compressed
There is a huge amount of files, totalling around 30 GB of data.
I wish to keep a local backup and also make an external copy (ftp, scp). Can you tell me what the most effective way is (least server resource usage)? Mainly I am concerned about minimal disk I/O and CPU.
Right now I am tarring the incremental backup files (12 GB of files) and it takes around 2 hours to back them up, so probably not good? So what is the way to have a local backup + external copy with the least server resources?
Wouldn’t rsync with the --bwlimit option (limit bandwidth to [n] KBps) work?
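Something like this, for example (the paths, the host and the 2000 KBps cap are just placeholders to illustrate the option):

nice -n 19 rsync -a --bwlimit=2000 /home/user/site/ backupuser@backuphost:/backups/site/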
From a disk I/O perspective, you’re not going to win - at SOME point, you will have to read that 30 GB. If you’re not worried about network bandwidth or remote storage capacity, just rsync it uncompressed, run at low priority (a high nice value) to keep CPU usage under wraps.
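Roughly like this (paths and host are placeholders; nice 19 is the lowest CPU priority, and ionice -c3 additionally puts the disk I/O in the idle class if your kernel's scheduler supports it):

nice -n 19 ionice -c3 rsync -a /home/user/site/ backupuser@backuphost:/backups/site/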
If you need to lower the remote storage usage, you’re going to have to tar it - you can run tar at low priority (a high nice value) to keep CPU usage under control, and do it “out-of-hours” to minimize impact.
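A rough sketch, assuming the files live under /home/user/site and the archive goes to /backups (both placeholders):

nice -n 19 tar -czf /backups/site-$(date +%F).tar.gz /home/user/site

To run it “out-of-hours” you could wrap that in a script and call it from cron, e.g. a crontab line like 0 3 * * * /usr/local/bin/backup-site.sh to kick it off at 3am.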
You could look at different I/O schedulers, and find one that suits that workload a little better (one that practices the equivalent of “pre-emptive multitasking” for I/O).
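If you want to experiment, the active scheduler is exposed under /sys (sda here is just an example device, and the scheduler names available depend on your kernel):

cat /sys/block/sda/queue/scheduler
# shows e.g.: noop deadline [cfq]
echo deadline > /sys/block/sda/queue/scheduler
# as root; takes effect immediately for that device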
Thx, what if I need to protect the data from unauthorized reading/use? Then I can’t use plain copying/rsync, right? What are my best options then, while keeping I/O and CPU usage low?
In order to encrypt it (your only real option is encryption, other methods could be broken too easily), you’re going to take a CPU hit, there’s no way around it. You could get a CPU with the AES-NI instruction set, then use an app that will utilise it, but ultimately encryption is a resource-intensive process. All you can do is run the encryption at low priority (a high nice value) so its CPU usage is throttled - this will just make the encryption take longer, though.
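One common pattern is to tar and encrypt in a single pipe, then copy the encrypted archive off-site; a rough sketch, where the paths, host and passphrase handling are all placeholders you’d adapt:

nice -n 19 tar -czf - /home/user/site | gpg --symmetric --cipher-algo AES256 -o /backups/site.tar.gz.gpg
scp /backups/site.tar.gz.gpg backupuser@backuphost:/backups/

gpg will prompt for a passphrase interactively; for unattended runs you’d use --batch --passphrase-file (keeping that file’s permissions tight). To restore: gpg -d site.tar.gz.gpg | tar -xzf -. Note that scp only protects the data in transit; the gpg step is what keeps it unreadable at rest on the remote side.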
Thx, can anyone please share what such an scp or ftp backup command with encryption should look like?