Delete a huge amount of files without killing your server

Many high-traffic web applications take advantage of caching systems. HTML caching is easy and powerful. IMHO the best solution is to serve it using something like nginx or Varnish. Many people use custom solutions (for example a WordPress plugin) which produce an HTML snapshot of the page and save it to disk.

If your website is quite large, in a flash you get a huge number of small files stored on your disk. Cleaning the cache is not as easy as you would hope.

The first solution is to use:

rm -Rf [path]

But this is dangerous because the I/O wait is terrible (even with an SSD) and the load average of your machine rises up to 50 in a minute.

find can help you:

find [path] -type f -print -delete

Unfortunately, after a while, the load average rises again. The rise is slower, but it is still dangerous.

The solution is to use ionice. It enables you to lower the I/O priority of your process and avoid the load average spike. Class 3 is the "idle" scheduling class: the process only gets disk time when no other process is doing I/O.

ionice -c 3 find [path] -type f -print -delete
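The one-liner above can be wrapped in a small reusable script. This is a minimal sketch, not the author's exact setup: the function name `clean_cache` is hypothetical, and the fallback branch (for systems without util-linux's ionice) is my addition for portability.

```shell
#!/bin/sh
# clean_cache: delete every file under a cache directory at idle I/O
# priority, so the deletion never competes with normal traffic for disk.
# (Hypothetical helper name; adapt the path handling to your setup.)
clean_cache() {
    dir="$1"
    [ -d "$dir" ] || return 0   # nothing to do if the directory is gone

    if command -v ionice >/dev/null 2>&1; then
        # -c 3 = "idle" class: find only gets disk time when no other
        # process is doing I/O, so the load average stays flat.
        ionice -c 3 find "$dir" -type f -delete
    else
        # Fallback for systems without ionice (e.g. macOS): plain find.
        find "$dir" -type f -delete
    fi
}
```

Usage: `clean_cache /path/to/html-cache`. Note that `-delete` only removes the files; empty subdirectories are left behind, which is usually fine for a cache that will be repopulated.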

Thanks to @dani_viga for the tip! He saved my day 🙂

More about ionice