For downscaling, area-averaging (see Mark's answer) is close to the best you'll get.
The main other contender is a Gaussian filter with a slightly larger radius. This increases blurring a little, which could be seen as a disadvantage, but it makes the blurring more uniform rather than dependent on the alignment of the pixels mod 2.
In case it's not immediately clear what I mean, consider the pixel patterns 0,0,2,2,0,0 and 0,0,0,2,2,0. With area-averaging, they'd downscale to 0,2,0 and 0,1,1, respectively - that is, one will be sharp and bright while the other will be blurred and dim. Using a longer filter, both will be blurred, but they'll appear more similar, which presumably matters to human observers.
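To make that concrete, here's a minimal numpy sketch of both approaches on those exact patterns. The 3-tap binomial kernel `[0.25, 0.5, 0.25]` stands in for a small gaussian, and the function names are just illustrative:

```python
import numpy as np

def area_average_2x(x):
    """2x downscale by averaging adjacent pixel pairs (area averaging)."""
    return x.reshape(-1, 2).mean(axis=1)

def blur_then_decimate_2x(x, kernel=(0.25, 0.5, 0.25)):
    """2x downscale with a slightly wider, gaussian-like filter:
    convolve, then keep every second sample."""
    blurred = np.convolve(x, kernel, mode="same")
    return blurred[::2]

a = np.array([0, 0, 2, 2, 0, 0], dtype=float)
b = np.array([0, 0, 0, 2, 2, 0], dtype=float)

print(area_average_2x(a))       # [0. 2. 0.]   -> sharp and bright
print(area_average_2x(b))       # [0. 1. 1.]   -> blurred and dim
print(blur_then_decimate_2x(a)) # [0. 1.5 0.5] -> same shape, just shifted...
print(blur_then_decimate_2x(b)) # [0. 0.5 1.5] -> ...so the two look alike
```

With the wider filter both outputs have the same peak and spread; only the phase differs, which is much less visible than the sharp/dim discrepancy area-averaging produces.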
Another issue to consider is gamma. Unless the gamma curve is linear, two pixels of encoded intensity k will have much less total (linear-light) intensity than a single pixel of encoded intensity 2*k. If your filter performs sufficient blurring it might not matter much, but with the plain area-average filter it can be a major issue. The only work-around I know is to apply and reverse the gamma curve before and after scaling...
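Here's a minimal sketch of that work-around, assuming a simple power-law gamma of 2.2 (real sRGB uses a piecewise curve, so treat the numbers as illustrative):

```python
import numpy as np

GAMMA = 2.2  # assumed power-law exponent; pick whatever matches your pipeline

def to_linear(encoded):
    """Decode gamma-encoded values (in [0, 1]) to linear light."""
    return encoded ** GAMMA

def to_encoded(linear):
    """Re-encode linear-light values back to gamma space."""
    return linear ** (1.0 / GAMMA)

def gamma_correct_average_2x(encoded):
    """Area-average 2x downscale done in linear light:
    decode, average adjacent pairs, re-encode."""
    linear = to_linear(encoded)
    averaged = linear.reshape(-1, 2).mean(axis=1)
    return to_encoded(averaged)

# Averaging a full-brightness pixel with a black one:
x = np.array([1.0, 0.0])
print(x.mean())                     # 0.5   (naive average in gamma space)
print(gamma_correct_average_2x(x))  # ~0.73 (average in linear light: brighter)
```

The naive average comes out noticeably too dark, which is exactly the lost-intensity effect described above.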