Backup strategies for Digital Data Hypochondriac


I don’t know if the expression “Digital Data Hypochondriac” is clear enough. These are people who are constantly worried about losing all their digital data. Every time a component of their digital ecosystem doesn’t work as expected, they get scared.

I’m one of them. In 2000, when I was 15, my huge 14 GB hard disk contained almost all of my digital life. One day the HD controller burned out (literally, because of a short circuit) and I lost everything. EVERYTHING 🙁


Delete a huge amount of files without killing your server

Many high-traffic web applications take advantage of caching systems. HTML caching is easy and powerful. IMHO the best solution is serving the cached pages with something like nginx or Varnish. Many people use custom solutions (for example a WordPress plugin) which produce an HTML snapshot of each page and save it to disk.
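As a minimal sketch of that setup (the root path, port, and location name here are my assumptions, not from the original post), an nginx config could try the on-disk snapshot first and fall back to the application:

# Hypothetical example: serve the saved HTML snapshot if it exists,
# otherwise pass the request to the application backend.
location / {
    root /var/cache/html;
    try_files $uri/index.html @app;
}

location @app {
    proxy_pass http://127.0.0.1:8080;
}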

If your website is quite large, in a flash you end up with a huge number of small files stored on your disk. Cleaning the cache is not as easy as you would hope.

The first solution is to use:

rm -Rf [path]

But this is dangerous, because the IO wait is terrible (even with an SSD) and the load average of your machine rises up to 50 in a minute.
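If you want to watch it happen (my suggestion, not part of the original tip), standard tools show the damage while the deletion runs:

watch -n 5 uptime   # load averages, refreshed every 5 seconds
vmstat 5            # the "wa" column is CPU time spent waiting on IO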

find can help you:

find [path] -type f -print -delete

Unfortunately, after a while, the load average rises again. The rise is slower, but it is still dangerous.

The solution is to use ionice. It enables you to lower the IO priority of your process and avoid the load average spike. Class 3 is the “idle” class: the process gets disk time only when no other process needs it.

ionice -c 3 find [path] -type f -print -delete
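A related trick (my addition, with a made-up PID): ionice can also inspect or change the IO class of a process that is already running, so a long deletion doesn’t have to be restarted:

ionice -p 12345        # show the current IO class of PID 12345
ionice -c 3 -p 12345   # move PID 12345 to the idle class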

Thanks to @dani_viga for the tip! He saved my day 🙂

More about ionice