#!/bin/bash
data_dir=./all

# process every file in the data directory, one at a time
for file_name in "$data_dir"/*
do
    echo "$file_name"
    python process.py "$file_name"
done
This script processes the files in a directory sequentially, one per iteration of the 'for' loop. Is it possible to start multiple instances of process.py so that several files are processed concurrently? I'd like to do this from a shell script.
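Below is a rough sketch of the direction I have in mind: backgrounding each call with '&' and throttling the number of simultaneous jobs. The max_jobs value of 4 is just an arbitrary number I picked, and 'wait -n' assumes bash 4.3 or newer. I'm not sure whether this is the right approach:

#!/bin/bash
data_dir=./all
max_jobs=4                              # arbitrary concurrency limit, for illustration only

for file_name in "$data_dir"/*
do
    echo "$file_name"
    python process.py "$file_name" &    # start this job in the background

    # if max_jobs instances are already running, wait for one of them to exit
    while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]
    do
        wait -n
    done
done

wait                                    # wait for the remaining background jobs to finish

I have also seen 'xargs -P' and GNU parallel mentioned for this kind of thing, but I'm not sure which approach is preferred.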