python 2.7 - Multiprocessing, how to run processes in parallel without creating zombies? -


I'd like to run processes in parallel, so I commented out `p.join()` in the `__main__` section. What are the consequences of not having `.join()`, or better yet, should I be using a different approach for parallel multiprocessing?

    import multiprocessing

    def worker(num):
        x = 0
        for i in range(10000):
            x += 1
        print x, num

    if __name__ == '__main__':
        for i in range(4):
            p = multiprocessing.Process(target=worker, args=(i,))
            p.start()
            # p.join()

Join the processes after starting all of them, not inside the loop (joining inside the loop would make them run one at a time).

    if __name__ == '__main__':
        procs = []
        for i in range(4):
            p = multiprocessing.Process(target=worker, args=(i,))
            p.start()
            procs.append(p)
        for p in procs:
            p.join()
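As for the zombie question: if you cannot block on `join()` because the parent has other work to do, `multiprocessing.active_children()` joins any children that have already exited as a side effect, so finished workers do not linger as zombies. A minimal sketch (the trivial `worker` body here is just a stand-in for real work):

```python
import multiprocessing
import time

def worker(num):
    # placeholder for real work
    x = 0
    for i in range(10000):
        x += 1

if __name__ == '__main__':
    for i in range(4):
        multiprocessing.Process(target=worker, args=(i,)).start()
    # active_children() reaps (joins) any children that have already
    # exited, so we can poll it instead of blocking on join()
    while multiprocessing.active_children():
        time.sleep(0.1)
```

This polls rather than blocks; for most scripts, simply keeping the process handles and calling `join()` as above is the cleaner choice.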

If you run multiple similar tasks, you can use a `multiprocessing.Pool` instead of managing the processes yourself.

    if __name__ == '__main__':
        pool = multiprocessing.Pool()
        pool.map(worker, range(4))
        pool.close()
        pool.join()
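A further advantage of the pool approach is that `Pool.map` collects the workers' return values, so you can return results instead of printing them. A minimal sketch, using a hypothetical `square` worker in place of the original one:

```python
import multiprocessing

def square(num):
    # hypothetical worker that returns a result instead of printing
    return num * num

def run_pool():
    pool = multiprocessing.Pool()
    try:
        # map blocks until all workers finish and returns their
        # results in input order
        return pool.map(square, range(4))
    finally:
        pool.close()
        pool.join()

if __name__ == '__main__':
    print(run_pool())  # [0, 1, 4, 9]
```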
