I suggest using a multiprocessing.pool.Pool to limit the number of processes that run concurrently (since you have indicated there might be a very large number of them). If you use its apply_async() method to "submit" tasks to the pool, you can use the optional callback argument to specify a function that will be called whenever one of the worker processes finishes. That callback gives you a way to terminate further processing of the other tasks submitted to the pool.
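By default, Pool() creates one worker process per CPU core. If you want an explicit cap instead, pass it to the constructor; the limit of 4 below is just an illustrative value:

import multiprocessing as mp

pool = mp.Pool(processes=4)  # At most 4 worker processes run at a time.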
import multiprocessing as mp
import requests

def worker(i):
    """Fetch one URL; return the id that worked, or None on failure."""
    try:
        response = requests.get(f'https://ru.hexlet.io/{i}')
        if response.status_code == 200:
            return i
    except Exception as exc:
        print(f'{i!r} caused {exc}')
    return None

if __name__ == '__main__':
    def notify(i):
        """Called when a Pool worker process finishes execution."""
        if i is not None:
            print(f'{i!r} worked')
            pool.terminate()  # Stops worker processes immediately.

    pool = mp.Pool()
    for i in [1, 2, 3, 4, 5, 6, 'courses', 8, 9, 10]:
        pool.apply_async(worker, (i,), callback=notify)
    pool.close()   # No more tasks will be submitted.
    pool.join()    # Wait for the pool to shut down.

    print('fini')
Output:
'courses' worked
fini
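Note that pool.terminate() stops the worker processes immediately without completing outstanding work, which is why only the one successful result is printed before 'fini'.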