All other things being equal, there should be no difference between running several sequential jobs either manually or through a batch script.
Since you have multiple cores/CPUs on your machine, you should gain from running multiple jobs in parallel. The gain depends on file system throughput and how much time FSL spends in CPU vs. disk. Perhaps you can divide your list of jobs into 2 or 3 groups and run the groups in parallel (each group running several sequential jobs).
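As a minimal sketch of that grouping idea (the subject names and the "run_job" function are hypothetical stand-ins for whatever FSL command you actually invoke), each parenthesized subshell runs its jobs sequentially while the groups themselves run side by side:

```shell
#!/bin/sh
# Hypothetical sketch of the "2 or 3 groups" idea: each ( ... ) subshell
# runs its jobs one after another, and the two subshells run in parallel.
# "run_job" is a placeholder for the real FSL command.
run_job() { echo "running $1"; sleep 1; }

( run_job subj01; run_job subj02; run_job subj03 ) &   # group 1
( run_job subj04; run_job subj05; run_job subj06 ) &   # group 2
wait   # block until both groups have finished
```

With two groups of three one-second jobs, the whole thing takes about three seconds instead of six.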
There are other, more complicated ways to script a maximum of N simultaneous jobs, but these are easy to get wrong, leaving jobs running out of control. For example, I use the following template:

#!/bin/bash
MAXPROCS=2
PIDS=()
# here is the main loop
for i in XXX YYY ZZZ etc. ; do
    # do any set up for the background process here
    # ...
    # now, before running anything in the background, check whether
    # we need to wait for a previous process to finish
    if [ ${#PIDS[@]} -ge ${MAXPROCS} ] ; then
        wait "${PIDS[0]}"        # block until the oldest job exits
        unset 'PIDS[0]'
        PIDS=("${PIDS[@]}")      # re-index so PIDS[0] is the next-oldest job
    fi
    # at this point, we know we can run another background process
    run_this_in_background.sh &
    # DON'T FORGET to add the background process' ID to the process list
    PIDS+=("$!")
done
# wait for the last MAXPROCS jobs, which are still running when the loop ends
wait
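On newer bash (4.3 and later), the same throttling can be done without the PID-array bookkeeping by using "wait -n", which blocks until any one background job exits. This is a hedged sketch of that variant, with a hypothetical "fake_job" standing in for the real work:

```shell
#!/bin/bash
# Sketch of the same bounded-parallelism idea using bash's "wait -n"
# (bash 4.3+): it blocks until any one background job finishes, so we
# only need a running-job counter. "fake_job" is a stand-in for the
# real command.
MAXPROCS=2
running=0

fake_job() { sleep 0.2; echo "job $1 done"; }

for i in a b c d e ; do
    # if we're at the limit, wait for any one job to finish
    if [ "$running" -ge "$MAXPROCS" ] ; then
        wait -n
        running=$((running - 1))
    fi
    fake_job "$i" &
    running=$((running + 1))
done
wait   # collect the jobs still running when the loop ends
```

The trade-off is portability: "wait -n" is bash-specific, while the PID-array template above works in any bash back to version 3.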
The "top" command is useful to see what processes are taking up time. It (along with "ps -ef") is also potentially useful to see if there are actually multiple jobs running when you thought you were only running one.