This is my first post in this category, since the original purpose of this blog was to expose the secrets I know. But I cannot know everything, and sometimes I have questions of my own that take hours of searching. I hope the answer will turn up in the comments, since I already get visitors from search engines.
Here is what I need: I have a folder containing a huge number of files (I don't actually know how many, because even ls | wc -l overloads my server). I believe there are several million temporary files that are no longer needed, and I have to delete them.
Every operation I have tried overloads the server: it gets slower and slower, and once I even had to reboot it. Here is what I tried:
rm -f * – does not work at all, as the number of files is far too high for the shell to expand
ls | grep . | xargs rm – deletes some files but drives up the server load
find . -type f | while read -r; do rm -v "$REPLY"; sleep 0.2; done – same result as the previous attempt
rm -rf folder/; mkdir folder; – no visible result
rm -f <various patterns combined> – seems to work and the load is lower, but the server still cannot run at the same time. It also requires a lot of patterns and is not a beautiful solution :)
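One thing I did not try above is letting find delete files as it walks the tree, so the list of millions of names is never expanded by the shell or piped anywhere. A minimal sketch, demonstrated on a throwaway directory so it is safe to run as-is (assumes a find that supports -delete, as GNU and BSD find do; on Linux, ionice -c3 in front would lower the I/O priority as well):

```shell
#!/bin/sh
# Sketch: find removes each file as it is found, so no giant
# argument list is ever built, unlike `rm -f *`.
# `nice -n 19` keeps the deletion from starving the server.

dir=$(mktemp -d)                      # throwaway demo directory
for i in 1 2 3 4 5; do : > "$dir/tmp_$i"; done

# -xdev keeps the walk on one filesystem; -type f leaves subfolders alone.
nice -n 19 find "$dir" -xdev -type f -delete

find "$dir" -type f | wc -l           # no regular files remain
rmdir "$dir"
```

On a real server you would point find at the problem folder instead of the temp directory, and keep -xdev so it cannot wander across mount points.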
If you have any ideas on how to do this, I will greatly appreciate your help. Waiting for your comments…
Found The Solution that works without server overload: I started the deletion from mc, and it seems to be working great. I don't know how long it will take to delete all the files I have, but now I can work while the deletion runs. So I think the problem is solved.