If hardware is not a limiting factor, what's the fastest way to downsize a large number of high-res JPEG images? For example, say I have a folder of 20,000 JPEGs that vary in aspect ratio but are all fairly large (near 4K resolution), and I'd like to resize every image to 512x512.
I've tried Pillow-SIMD (built against libjpeg-turbo) with Python's multiprocessing on a machine with a pretty beefy CPU and a V100 GPU (which I don't believe is being utilized), and it still takes something like 90 minutes to complete the job.
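For reference, a simplified sketch of my current approach (folder paths, file pattern, and resampling filter are placeholders, not my exact script):

```python
import os
from glob import glob
from multiprocessing import Pool

from PIL import Image  # Pillow-SIMD installed, compiled against libjpeg-turbo

SRC_DIR = "images"      # placeholder input folder
DST_DIR = "images_512"  # placeholder output folder
TARGET = (512, 512)

def resize_one(path):
    # Decode one JPEG, downsize it to 512x512, and re-encode it.
    with Image.open(path) as im:
        im = im.convert("RGB").resize(TARGET, Image.LANCZOS)
        im.save(os.path.join(DST_DIR, os.path.basename(path)), quality=90)

if __name__ == "__main__":
    os.makedirs(DST_DIR, exist_ok=True)
    paths = glob(os.path.join(SRC_DIR, "*.jpg"))
    # One worker process per CPU core by default.
    with Pool() as pool:
        pool.map(resize_one, paths, chunksize=64)
```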
Does anyone know of an image downsizing method that can take advantage of a powerful GPU or has some other significant speed optimizations? Or is this really the current state-of-the-art for image downsizing speed?