Python multiprocessing blocks indefinitely in waiter.acquire()

2024/9/19 17:09:12

Can someone explain why this code blocks and cannot complete?

I've followed a couple of multiprocessing examples and written some very similar code that does not get blocked, but I obviously cannot see the difference between that working code and the code below. Everything sets up fine, I think. It gets all the way to .get(), but none of the processes ever finish.

The problem is that python3 blocks indefinitely in waiter.acquire(), which you can tell by interrupting it and reading the backtrace.

$ python3 ./try415.py
^CTraceback (most recent call last):
  File "./try415.py", line 43, in <module>
    ps = [ res.get() for res in proclist ]
  File "./try415.py", line 43, in <listcomp>
    ps = [ res.get() for res in proclist ]
  File "/usr/lib64/python3.6/multiprocessing/pool.py", line 638, in get
    self.wait(timeout)
  File "/usr/lib64/python3.6/multiprocessing/pool.py", line 635, in wait
    self._event.wait(timeout)
  File "/usr/lib64/python3.6/threading.py", line 551, in wait
    signaled = self._cond.wait(timeout)
  File "/usr/lib64/python3.6/threading.py", line 295, in wait
    waiter.acquire()
KeyboardInterrupt

Here's the code

from multiprocessing import Pool
from scipy import optimize
import numpy as np

def func(t, a, b, c):
    return 0.5*a*t**2 + b*t + c

def funcwrap(t, params):
    return func(t, *params)

def fitWithErr(procid, yFitValues, simga, func, p0, args, bounds):
    np.random.seed() # force new seed
    randomDelta = np.random.normal(0., sigma, len(yFitValues))
    randomdataY = yFitValues + randomDelta
    errfunc = lambda p, x, y: func(p, x) - y
    optResult = optimize.least_squares(errfunc, p0, args=args, bounds=bounds)
    return optResult.x

def fit_bootstrap(function, datax, datay, p0, bounds, aprioriUnc):
    errfunc = lambda p, x, y: function(x, p) - y
    optResult = optimize.least_squares(errfunc, x0=p0, args=(datax, datay), bounds=bounds)
    pfit = optResult.x
    residuals = optResult.fun
    fity = function(datax, pfit)
    numParallelProcesses = 2**2 # should be equal to number of ALUs
    numTrials = 2**2 # this many random data sets are generated and fitted
    trialParameterList = list()
    for i in range(0, numTrials):
        trialParameterList.append( [i, fity, aprioriUnc, function, p0, (datax, datay), bounds] )
    with Pool(processes=numParallelProcesses) as pool:
        proclist = [ pool.apply_async(fitWithErr, args) for args in trialParameterList ]
    ps = [ res.get() for res in proclist ]
    ps = np.array(ps)
    mean_pfit = np.mean(ps, 0)
    return mean_pfit

if __name__ == '__main__':
    x = np.linspace(0, 3, 2000)
    p0 = [-9.81, 1., 0.]
    y = funcwrap(x, p0)
    bounds = [ (-20, -1., -1E-6), (20, 3, 1E-6) ]
    fit_bootstrap(funcwrap, x, y, p0, bounds=bounds, aprioriUnc=0.1)
Answer

Sorry for giving the wrong answer earlier; it was irresponsible of me not to verify it. Here is my corrected answer.

with Pool(processes=numParallelProcesses) as pool:

This line is the problem: leaving the with block calls the pool's __exit__ method, which calls terminate(), not close(). Here is the __exit__ body:

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.terminate()

All of the worker processes are terminated before the submitted tasks ever run. Then this code executes:

ps = [ res.get() for res in proclist ]

There is no timeout argument in these get() calls. Here is the body of get():

def get(self, timeout=None):
    self.wait(timeout)
    if not self.ready():
        raise TimeoutError
    if self._success:
        return self._value
    else:
        raise self._value

With no timeout, wait() blocks forever, waiting for results that will never arrive because the workers have already been terminated. That is why it hangs.
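Here is a minimal, standalone sketch (not from the original post; slow_square is just an illustrative function) that reproduces the behaviour: the pool is terminated before the result is collected, so get() with a timeout raises TimeoutError, and without a timeout it would block forever, exactly like the traceback above.

from multiprocessing import Pool, TimeoutError
import time

def slow_square(x):
    time.sleep(1)               # give terminate() a chance to kill the worker first
    return x * x

if __name__ == '__main__':
    pool = Pool(processes=2)
    res = pool.apply_async(slow_square, (3,))
    pool.terminate()            # what the with block's __exit__ does: workers are killed, the result never arrives
    try:
        print(res.get(timeout=5))   # with no timeout this call would block forever
    except TimeoutError:
        print('result never arrived because the pool was terminated')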

You need to change

with Pool(processes=numParallelProcesses) as pool:
    proclist = [ pool.apply_async(fitWithErr, args) for args in trialParameterList ]

to:

pool=Pool(processes=numParallelProcesses)
proclist = [ pool.apply_async(fitWithErr, args) for args in trialParameterList ]
pool.close()
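
Alternatively (my own sketch, not part of the original fix), you can keep the with statement and simply collect the results before the block exits, so terminate() only runs after every result has arrived:

with Pool(processes=numParallelProcesses) as pool:
    proclist = [ pool.apply_async(fitWithErr, args) for args in trialParameterList ]
    ps = [ res.get() for res in proclist ]   # collected inside the block, before __exit__ calls terminate()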
