I have the nested loops below. How can I parallelize the outer loop, distributing it across 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?
```python
for r in range(4):
    for k in range( r*nAnalysis/4, (r+1)*nAnalysis/4 ):
        # - Write Abaqus INP file - #
        writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])
        # - Delete LCK file to Enable Another Analysis - #
        delFile(aPath[k]+"/"+inpFiles[k]+".lck")
        # - Run Analysis - #
        runABQfile(inpFiles[k],aPath[k])
```
I tried using `multiprocessing.Pool`, but it never gets into the function:
```python
def parRunABQfiles(nA,nP,r,ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_):
    from os import path
    from auxFunctions import writeABQfile, runABQfile
    print("I am Here")
    for k in range( r*nA/nP, (r+1)*nA/nP ):
        # - Write Abaqus INP file - #
        writeABQfile(ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_)
        # - Delete LCK file to Enable Another Analysis - #
        delFile(aPath_+"/"+inpFiles[k]+".lck")
        # - Run Analysis - #
        runABQfile(inpFiles_,aPath_)
        # - Make Sure Analysis is not Bypassed - #
        while os.path.isfile(aPath_+"/"+inpFiles[k]+".lck") == True:
            sleep(0.1)
    return k

results = zip(*pool.map(parRunABQfiles, range(0, 4, 1)))
```
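For reference, my understanding is that `pool.map` passes exactly one item of the iterable as the sole argument to the worker, so a function with this many parameters would need the extra arguments bound beforehand (e.g. with `functools.partial`). A minimal sketch with a dummy worker (names here are illustrative, not my real code):

```python
from multiprocessing import Pool
from functools import partial

def worker(extra, r):
    # pool.map supplies only r; 'extra' must be bound in advance via partial
    return (extra, r)

if __name__ == "__main__":
    with Pool(4) as pool:
        # each of the 4 processes receives one value of r from range(4)
        results = pool.map(partial(worker, "bound"), range(4))
    print(results)  # [('bound', 0), ('bound', 1), ('bound', 2), ('bound', 3)]
```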
The `runABQfile` function is just a `subprocess.call` to a shell script that runs Abaqus:
```python
def runABQfile(inpFile,path):
    import subprocess
    import os
    prcStr1 = ('sbatch '+path+'/runJob.sh')
    process = subprocess.call(prcStr1, stdin=None, stdout=None, stderr=None, shell=True )
    return
```
No errors show up, so I am not sure why it never gets in there. I know it doesn't because `writeABQfile` never writes the input file. The question again is:
How can I parallelize the outer loop, distributing it across 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?
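To illustrate the behavior I am after, here is a minimal self-contained sketch with a placeholder worker standing in for the write/run steps (`run_batch` and the `nAnalysis` value are made up for the example). Since `pool.map` blocks until every batch has returned, the line after it only runs once all 4 batches are done, which is the synchronization I want:

```python
from multiprocessing import Pool

nAnalysis = 8  # placeholder count, divisible by 4

def run_batch(r):
    # stand-in for the real per-batch loop (writeABQfile / runABQfile)
    ks = range(r * nAnalysis // 4, (r + 1) * nAnalysis // 4)
    return list(ks)

if __name__ == "__main__":
    with Pool(4) as pool:
        # map blocks here until all 4 batches have completed
        batches = pool.map(run_batch, range(4))
    print("all batches finished:", batches)
```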