When downloading files from the web I usually have a predefined folder where all the downloads from my different browsers are stored. It doesn't take long until that folder grows to a remarkable size, and sorting out the files that are still needed becomes a cumbersome and time-consuming process. Years ago I therefore created a Java console app which takes care of removing files beyond a certain age. Since I own a Mac, this has become even easier.
For instance, to delete all files from a folder which haven't been accessed for 90 days or more, all you have to do is create an Automator script that executes the following Unix shell command:
find /Users/Juri/Downloads/* -type f -atime +90 -exec rm -f {} \;
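If you'd rather check first which files would be affected, you can run the same expression with -print instead of the -exec part. This is just a quick sketch, assuming your downloads folder lives at ~/Downloads:

# Preview: list files that haven't been accessed for more than 90 days, without deleting anything
find "$HOME/Downloads" -type f -atime +90 -print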
Note the usage of the -atime parameter. According to the man page:

-atime n
    File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
This parameter ensures that files you frequently touch (open or modify) will not be deleted from the folder. Using -mtime instead would be the wrong choice here: it only considers the last modification time, so files you still open regularly but never change would get deleted as well.
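If you want to verify the two timestamps of a particular file yourself, the BSD stat that ships with macOS can print them; a small sketch (somefile.pdf is just a placeholder name):

# Show last-access and last-modification time of a file (BSD/macOS stat)
stat -f "accessed: %Sa, modified: %Sm" somefile.pdf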
In addition, after the overall cleanup of the files you should get rid of empty folders, which is done by invoking
find /Users/Juri/Downloads/* -type d -depth -empty -exec rmdir {} \;
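Alternatively, BSD find on macOS also has a -delete primary that can do the same in one go; it implies depth-first traversal, and -mindepth 1 keeps the downloads folder itself from being removed should it ever be empty. A sketch, again assuming ~/Downloads:

# Remove empty subfolders; -delete only succeeds on directories that are already empty
find "$HOME/Downloads" -mindepth 1 -type d -empty -delete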
Now you can combine everything in an Automator script, launch it at login time, and voilà: from now on you don't have to care about cleaning your downloads folder any more :)
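For reference, here is a minimal sketch of what the combined "Run Shell Script" action could look like; the ~/Downloads path and the 90-day threshold are just the example values from above, so adjust them to your setup:

#!/bin/sh
# Combined cleanup: remove stale files, then drop empty subfolders
DOWNLOADS="$HOME/Downloads"

# Remove files that haven't been accessed for more than 90 days
find "$DOWNLOADS" -type f -atime +90 -exec rm -f {} \;

# Remove subfolders that are now empty (keep the downloads folder itself)
find "$DOWNLOADS" -mindepth 1 -type d -depth -empty -exec rmdir {} \;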
(P.S.: For future reference you may also like to redirect log entries into a file in order to know which files have been deleted.)
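One way to do that, sketched here with an arbitrary log file location, is to add -print before the -exec so that every deleted path is echoed, and then redirect the output:

# Append the name of every deleted file to a log file
find "$HOME/Downloads" -type f -atime +90 -print -exec rm -f {} \; >> "$HOME/downloads-cleanup.log" 2>&1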
Questions? Thoughts? Hit me up on Twitter.