Python multiprocessing.queues.SimpleQueue() Examples
The following are 4 code examples of multiprocessing.queues.SimpleQueue(), taken from open-source projects. You may also want to check out the other functions and classes of the multiprocessing.queues module.
Example #1
Source File: __init__.py From jawfish with MIT License

def SimpleQueue():
    '''
    Returns a queue object
    '''
    from multiprocessing.queues import SimpleQueue
    return SimpleQueue()
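For reference, a minimal sketch of using the queue this factory returns: SimpleQueue wraps a pipe, so put() in one process pairs with get() in another. The _producer name below is hypothetical, not part of the example above.

```python
from multiprocessing import get_context

def _producer(q):
    # Runs in the child process: send one value to the parent.
    q.put('hello from child')

if __name__ == '__main__':
    ctx = get_context()
    q = ctx.SimpleQueue()  # an instance of multiprocessing.queues.SimpleQueue
    proc = ctx.Process(target=_producer, args=(q,))
    proc.start()
    print(q.get())  # prints 'hello from child'
    proc.join()
```

Note that SimpleQueue deliberately has a small API (put, get, empty, close); it has no size bound, no timeouts, and no task-tracking like multiprocessing.Queue.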
Example #2
Source File: parallel.py From lkpy with MIT License

def put(self, obj):
    bytes = pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL)
    # follow SimpleQueue, need to deal with _wlock being None
    if self._wlock is None:
        self._writer.send_bytes(bytes)
    else:
        with self._wlock:
            self._writer.send_bytes(bytes)
Example #3
Source File: parallel.py From lkpy with MIT License

def SimpleQueue(self):
    return FastQ(ctx=self.get_context())
Example #4
Source File: parallel.py From lkpy with MIT License

def run_sp(func, *args, **kwargs):
    """
    Run a function in a subprocess and return its value.

    This is for achieving subprocess isolation, not parallelism.  The
    subprocess is configured so things like logging work correctly, and is
    initialized with a derived random seed.
    """
    ctx = LKContext.INSTANCE
    rq = ctx.SimpleQueue()
    seed = derive_seed(none_on_old_numpy=True)
    worker_args = (log_queue(), seed, rq, func, args, kwargs)
    _log.debug('spawning subprocess to run %s', func)
    proc = ctx.Process(target=_sp_worker, args=worker_args)
    proc.start()
    _log.debug('waiting for process %s to return', proc)
    success, payload = rq.get()
    _log.debug('received success=%s', success)
    _log.debug('waiting for process %s to exit', proc)
    proc.join()
    if proc.exitcode:
        _log.error('subprocess failed with code %d', proc.exitcode)
        raise RuntimeError('subprocess failed with code ' + str(proc.exitcode))
    if success:
        return payload
    else:
        _log.error('subprocess raised exception: %s', payload)
        raise ChildProcessError('error in child process', payload)
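run_sp illustrates a common isolation pattern: spawn one worker, have it put a (success, payload) pair on a SimpleQueue, and re-raise in the parent on failure. Below is a standard-library-only sketch of the same shape; run_sp_sketch and _sp_worker_sketch are hypothetical names, and lkpy's real version also forwards a log queue and a derived random seed, which are elided here. The 'fork' start method is assumed for simplicity (POSIX only).

```python
import multiprocessing as mp

def _sp_worker_sketch(rq, func, args, kwargs):
    # Child side: run func and report a (success, payload) pair to the parent.
    try:
        rq.put((True, func(*args, **kwargs)))
    except Exception as e:
        rq.put((False, repr(e)))

def run_sp_sketch(func, *args, **kwargs):
    # Parent side: run func in a subprocess for isolation, not parallelism.
    ctx = mp.get_context('fork')  # lkpy uses its own LKContext instead
    rq = ctx.SimpleQueue()
    proc = ctx.Process(target=_sp_worker_sketch, args=(rq, func, args, kwargs))
    proc.start()
    success, payload = rq.get()   # blocks until the child reports back
    proc.join()
    if proc.exitcode:
        raise RuntimeError('subprocess failed with code %d' % proc.exitcode)
    if success:
        return payload
    raise ChildProcessError('error in child process', payload)
```

Reading from the queue before join() matters: joining first can deadlock if the payload is large enough to fill the pipe buffer, since the child blocks in put() until the parent drains it.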