76

On my website I use PHP sessions. Session information is stored in files in my ./session path. After a few months I discovered that these session files are never deleted; by now there are 145,000 of them in this directory.

How should these be cleaned up? Do I have to do it programmatically, or is there a setting I can use somewhere that would make this cleanup happen automatically?

EDIT Forgot to mention: this site runs at a provider, so I don't have access to a command line. I do have FTP access, but the session files belong to another user (the one the web server process runs as, I guess). From the first answers I got, I think it's not just a setting on the server or in PHP, so I guess I'll have to implement something for it in PHP and call that periodically from a browser (maybe from a cron job running on my own machine at home).
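
Something like this rough, untested sketch is what I have in mind (the path and lifetime are just placeholders):

<?php
// Rough sketch: delete session files that haven't been modified for
// more than 24 minutes. The path and lifetime are placeholders.
$dir = __DIR__ . '/session';
$maxLifetime = 1440; // seconds, same as the default session.gc_maxlifetime

foreach (glob($dir . '/sess_*') ?: [] as $file) {
    if (is_file($file) && filemtime($file) + $maxLifetime < time()) {
        @unlink($file); // may fail if the file belongs to a different user
    }
}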

Jack

10 Answers

66

To handle sessions properly, take a look at http://php.net/manual/en/session.configuration.php.

There you'll find these variables:

  • session.gc_probability
  • session.gc_divisor
  • session.gc_maxlifetime

The first two control the probability that the garbage collector (GC) runs on any given page request; session.gc_maxlifetime determines how old session data may be before it is considered garbage.

You could set these with ini_set() at the beginning of your script (or via your .htaccess file), so you can be reasonably sure they will get deleted at some point.
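
For example, a minimal sketch placed before session_start() (the values here are only illustrative, not a recommendation):

<?php
// Run the GC on roughly 1% of requests (gc_probability / gc_divisor = 1/100)
// and treat sessions idle for more than 1440 seconds (24 minutes) as garbage.
ini_set('session.gc_probability', '1');
ini_set('session.gc_divisor', '100');
ini_set('session.gc_maxlifetime', '1440');

session_start();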

T30
Seb
  • NOTE: If you are using the subdirectory option for storing session files (see session.save_path above), then garbage collection does *not* happen automatically. You will need to do your own garbage collection through a shell script, cron entry, or some other method. For example, the following is the equivalent of setting session.gc_maxlifetime to 1440 (1440 seconds = 24 minutes): cd /path/to/sessions; find -cmin +24 | xargs rm – Jehy Oct 21 '14 at 12:28
  • @Jehy It looks like PHP may now perform garbage collection even when you have a custom session.save_path. – Darrell Brogdon Jun 23 '16 at 21:13
  • @DarrellBrogdon Do you have a resource or more information on PHP's automatic garbage collection when using a custom session.save_path? It would be nice if that were the case. – chocolata Oct 27 '16 at 14:44
  • @maartenmachiels The PHP documentation and php.ini file actually provide a lot of information. You have to look closely, however, and really understand the calculation the GC performs to determine when and what to collect. – Darrell Brogdon Oct 27 '16 at 15:53
38

Debian/Ubuntu handles this with a cron job defined in /etc/cron.d/php5:

# /etc/cron.d/php5: crontab fragment for php5
#  This purges session files older than X, where X is defined in seconds
#  as the largest value of session.gc_maxlifetime from all your php.ini
#  files, or 24 minutes if not defined.  See /usr/lib/php5/maxlifetime

# Look for and purge old sessions every 30 minutes
09,39 *     * * *     root   [ -d /var/lib/php5 ] && find /var/lib/php5/ -type f -cmin +$(/usr/lib/php5/maxlifetime) -print0 | xargs -r -0 rm

The maxlifetime script simply returns the number of minutes a session should be kept alive by checking php.ini. It looks like this:

#!/bin/sh -e

max=1440

for ini in /etc/php5/*/php.ini; do
        cur=$(sed -n -e 's/^[[:space:]]*session.gc_maxlifetime[[:space:]]*=[[:space:]]*\([0-9]\+\).*$/\1/p' $ini 2>/dev/null || true);
        [ -z "$cur" ] && cur=0
        [ "$cur" -gt "$max" ] && max=$cur
done

echo $(($max/60))

exit 0
Paul Dixon
  • Thanks for your answer Paul. As I said in the latest version of my question, it's not an option to run it from the command line. I'll see if I can convince my provider to put something like this in the cron. – Jack Mar 19 '09 at 11:45
  • Thanks to this, I found that someone must have cut & pasted this into ineffectiveness: the `&&` in the command had become `&amp;&amp;`. The server hadn't collected the garbage for over 2 years. No wonder it smelled so bad around here! – Josiah Nov 03 '14 at 21:55
  • So this completely bypasses anything that is set via ini_set? No wonder setting gc in the script does nothing. – styks Mar 12 '15 at 20:36
  • The cron entry only cleans up the session files in /var/lib/php5. The OP is asking about cleaning session files in his custom session directory. So you need to create a script that goes through all your custom directories and does the same as what @paul-dixon wrote here for the cron entry. – Wouter Thielen Mar 23 '15 at 03:11
  • I'm curious why Debian/Ubuntu does this? In our case, we had a custom session.save_path so the cron.d script wasn't seeing any session files. We had so many stale session files build up that we ran out of inodes! Thankfully, setting session.gc_probability back to 1 (Ubuntu has it set to 0 by default) rectified the problem. – Darrell Brogdon Jun 23 '16 at 21:17
  • Debian/Ubuntu does this because only root can modify/delete files in /var/lib/php5 for security reasons. Personally, I much prefer session cleanup via cron job compared to some randomly called garbage collector - it is more consistent and comprehensible. – iquito Jun 14 '19 at 10:15
  • Any idea why the `-print0 | xargs rm` pair is used instead of a simple `-delete`? I just enabled a similar job on a PHP session directory containing 6 million files. Anything other than `find` would crash. – Jari Turkia Mar 08 '22 at 09:09
36

In case someone wants to do this with a cron job, please keep in mind that this:

find .session/ -atime +7  -exec rm {} \;

is really slow when there are a lot of files.

Consider using this instead:

find .session/ -atime +7 | xargs -r rm

In case you have spaces in your file names, use this:

find .session/ -atime +7 -print0 | xargs -0 -r rm

xargs fills up the command line with files to be deleted, so rm is invoked far fewer times than with -exec rm {} \;, which calls rm once for each file.

Just my two cents

BlitZ
Andi
  • Also keep in mind that you can use `-exec $cmd {} +` rather than `-exec $cmd {} \;`. It passes all the files found to a single command invocation rather than running the command once for each file found. This is very similar to the xargs approach `Andi` has recommended. Read more about it here: https://unix.stackexchange.com/questions/195939/what-is-meaning-of-in-finds-exec-command – domdambrogia Apr 27 '17 at 20:38
  • In my case, in a Docker container (without cron initially), find… | xargs was too slow; I had to run `find .session/sess_[a-d] -atime +7 | xargs -r rm` and so on… – bcag2 Dec 14 '20 at 08:12
  • What about the `-delete` option of `find`? I like to use just this command, which is much safer, e.g.: `find /var/lib/php/session2/ -atime +30 -delete`. Also have a look at: https://unix.stackexchange.com/questions/167823/finds-exec-rm-vs-delete – malisokan Aug 31 '23 at 12:36
9

cd to the sessions directory and then:

1) View sessions older than 40 min: find . -amin +40 -exec stat -c "%n %y" {} \;

2) Remove sessions older than 40 min: find . -amin +40 -exec rm {} \;

David Lefkon
  • Thanks, this was helpful. I combined "Step 2" with Andi's code to make this, which was faster for me (I was at 100%): `find . -amin +40 | xargs -r rm` – Tom Walker Jul 13 '18 at 21:28
5

You can create a script /etc/cron.hourly/php and put this in it:

#!/bin/bash

max=24
tmpdir=/tmp

nice find ${tmpdir} -type f -name 'sess_*' -mmin +${max} -delete

Then make the script executable (chmod +x).

Now, every hour, all session files that were last modified more than 24 minutes ago will be deleted.

Daniel Milde
5
# Every 30 minutes, not on the hour
# Grabs maxlifetime directly from `php -i`
# Doesn't care if /var/lib/php5 exists; errors go to /dev/null

09,39 * * * *   find /var/lib/php5/ -type f -cmin +$(echo "`php -i|grep -i session.gc_maxlifetime|cut -d' ' -f3` / 60" | bc) -exec rm -f {} \; >/dev/null 2>&1

The breakdown:

  • Only files: find /var/lib/php5/ -type f
  • Older than X minutes: -cmin +X
  • Get the PHP setting: $(echo "`php -i|grep -i session.gc_maxlifetime
  • Do the math: |cut -d' ' -f3` / 60" | bc)
  • Remove matching files: -exec rm -f {} \;

Wesley Bland
Dr. Tyrell
5

Use cron with find to delete files older than a given threshold. For example, to delete files that haven't been accessed for at least a week:

find .session/ -atime +7  -exec rm {} \;
vartec
  • Thanks for your answer. As I said in the latest version of my question, it's not an option to run it from the command line. I'll see if I can convince my provider to put something like this in the cron. – Jack Mar 19 '09 at 11:44
2

My best guess is that you are on a shared server and the session files are mixed in with those of all the other users, so you can't, and shouldn't, delete them. What you can do, if you are worried about scaling and/or your users' session privacy, is move the sessions to the database.

Start writing that session data to the database and you've gone a long way towards scaling your app across multiple servers when the time comes.
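
As a rough, illustrative sketch (not production code), a custom save handler could look something like this; the `sessions` table layout and the connection details are just assumptions:

<?php
// Assumed table:
//   CREATE TABLE sessions (id VARCHAR(128) PRIMARY KEY, data TEXT, last_access INT);
class PdoSessionHandler implements SessionHandlerInterface
{
    public function __construct(private PDO $pdo) {}

    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }

    public function read(string $id): string
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute([$id]);
        return (string) $stmt->fetchColumn();
    }

    public function write(string $id, string $data): bool
    {
        $stmt = $this->pdo->prepare(
            'REPLACE INTO sessions (id, data, last_access) VALUES (?, ?, ?)');
        return $stmt->execute([$id, $data, time()]);
    }

    public function destroy(string $id): bool
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE id = ?')->execute([$id]);
    }

    public function gc(int $max_lifetime): int|false
    {
        // Expired sessions become a single DELETE instead of thousands of stale files.
        $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE last_access < ?');
        $stmt->execute([time() - $max_lifetime]);
        return $stmt->rowCount();
    }
}

// Hypothetical DSN and credentials; adjust for your own setup.
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
session_set_save_handler(new PdoSessionHandler($pdo), true);
session_start();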

Apart from that, I would not worry much about the 145,000 files.

Frankie
  • Thanks Frankie, good idea about moving it to the database, will keep that in mind. – Jack Mar 19 '09 at 11:46
  • Just beware, saving your session in the database is significantly slower. If your site is not super busy you probably won't notice, but on a busy site it will be noticeably slower and can bog things down. – Vincent Jul 26 '23 at 20:01
  • @Vincent as your site scales, you'll be forced to use a multitude of servers. You could pin sessions to servers, but eventually you'll have to shut down users mid-session to reboot a server. Besides, if you use Redis or another memory-based DB, access will be an order of magnitude faster than on disk. Nonetheless, do keep an eye on DB queries; when the DB starts slowing things down, it's generally a very unoptimized query. – Frankie Jul 28 '23 at 01:34
0

Use the cron entry below:

39 20     * * *     root   [ -x /usr/lib/php5/maxlifetime ] && [ -d /var/lib/php5 ] && find /var/lib/php5/ -depth -mindepth 1 -maxdepth 1 -type f -cmin +$(/usr/lib/php5/maxlifetime) -print0 | xargs -r -0 rm
slartidan
0
find /var/lib/php/sessions/ -atime +7 -delete

This performs better because it doesn't have to spawn an external process for each and every matched file.

Kasyful Anwar