The documentation for the multiprocessing module shows how to pass a queue to a process started with multiprocessing.Process. But how can I share a queue with asynchronous worker processes started with apply_async? I don't need dynamic joining or anything else, just a way for the workers to (repeatedly) report their results back to base.
import multiprocessing

def worker(name, que):
    que.put("%d is done" % name)

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=3)
    q = multiprocessing.Queue()
    workers = pool.apply_async(worker, (33, q))
This fails with:
RuntimeError: Queue objects should only be shared between processes through inheritance.
I understand what this means, and I understand the advice to inherit rather than require pickling/unpickling (and all the special Windows restrictions). But how do I pass the queue in a way that works? I can't find an example, and I've tried several alternatives that failed in various ways. Help please?
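One commonly used way around this restriction, sketched below, is to create the queue through a multiprocessing.Manager. A Manager-backed Queue is a proxy object, so it can be pickled and handed to pool workers via apply_async, unlike a plain multiprocessing.Queue. This is a sketch of the general technique, not the only possible fix:

```python
import multiprocessing

def worker(name, que):
    que.put("%d is done" % name)

if __name__ == "__main__":
    # A Manager-backed Queue is a picklable proxy, so it can be
    # passed to pool workers, unlike a plain multiprocessing.Queue.
    manager = multiprocessing.Manager()
    q = manager.Queue()
    with multiprocessing.Pool(processes=3) as pool:
        result = pool.apply_async(worker, (33, q))
        result.get()   # wait for the worker (re-raises its errors)
    print(q.get())     # -> 33 is done
```

The worker can call q.put() as many times as it likes, so repeated result reporting works the same way.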
Sharing a result queue among several processes
Finding the index of an item in a list
Given a list ["foo", "bar", "baz"] and an item in the list "bar", how do I get its index (1) in Python?
>>> ["foo", "bar", "baz"].index("bar")
1
Reference: Data Structures > More on Lists
Note that while this is perhaps the cleanest way to answer the question as asked, index is a rather weak component of the list API, and I can't remember the last time I used it in anger. It's been pointed out to me in the comments that because this answer is heavily referenced, it should be made more complete. Some caveats about list.index follow. It is probably worth initially taking a look at the documentation for it:
list.index(x[, start[, end]])
Return zero-based index in the list of the first item whose value is equal to x. Raises a ValueError if there is no such item.
The optional arguments start and end are interpreted as in the slice notation and are used to limit the search to a particular subsequence of the list. The returned index is computed relative to the beginning of the full sequence rather than the start argument.
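That last point is easy to miss, so here is a quick illustration (the variable name is just for the example):

```python
letters = ["a", "b", "c", "b"]

# The search starts at index 2, but the result is still an index
# into the full list, not into the slice being searched.
print(letters.index("b", 2))   # -> 3
print(letters.index("b"))      # -> 1 (first match overall)
```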
Linear time-complexity in list length
An index call checks every element of the list in order, until it finds a match. If your list is long, and you don't know roughly where in the list the item occurs, this search could become a bottleneck. In that case, you should consider a different data structure. Note that if you know roughly where to find the match, you can give index a hint. For instance, in this snippet, l.index(999_999, 999_990, 1_000_000) is roughly five orders of magnitude faster than straight l.index(999_999), because the former only has to search 10 entries, while the latter searches a million:
>>> import timeit
>>> timeit.timeit("l.index(999_999)", setup="l = list(range(0, 1_000_000))", number=1000)
9.356267921015387
>>> timeit.timeit("l.index(999_999, 999_990, 1_000_000)", setup="l = list(range(0, 1_000_000))", number=1000)
0.0004404920036904514
Only returns the index of the first match to its argument
A call to index searches through the list in order until it finds a match, and stops there. If you expect to need indices of more matches, you should use a list comprehension or generator expression.
>>> [1, 1].index(1)
0
>>> [i for i, e in enumerate([1, 2, 1]) if e == 1]
[0, 2]
>>> g = (i for i, e in enumerate([1, 2, 1]) if e == 1)
>>> next(g)
0
>>> next(g)
2
Most places where I once would have used index, I now use a list comprehension or generator expression because they're more generalizable. So if you're considering reaching for index, take a look at these excellent Python features.
Throws if element not present in list
A call to index results in a ValueError if the item's not present.
>>> [1, 1].index(2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: 2 is not in list
If the item might not be present in the list, you should either:
- Check for it first with item in my_list (clean, readable approach), or
- Wrap the index call in a try/except block which catches ValueError (probably faster, at least when the list to search is long and the item is usually present).
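The try/except option might be sketched like this (find_index is a hypothetical helper name, not part of the list API):

```python
def find_index(items, target):
    # try/except scans the list only once; checking `target in items`
    # and then calling index() would scan it twice in the found case.
    try:
        return items.index(target)
    except ValueError:
        return -1  # sentinel for "not present"

print(find_index(["foo", "bar", "baz"], "bar"))  # -> 1
print(find_index(["foo", "bar", "baz"], "qux"))  # -> -1
```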
One thing that is really helpful in learning Python is to use the interactive help function:
>>> help(["foo", "bar", "baz"])
Help on list object:

class list(object)
 ...
 |
 |  index(...)
 |      L.index(value, [start, [stop]]) -> integer -- return first index of value
 |
which will often lead you to the method you are looking for.
The majority of answers explain how to find a single index, but their methods do not return multiple indexes if the item is in the list multiple times. Use
for i, j in enumerate(["foo", "bar", "baz"]):
    if j == "bar":
        print(i)
The index() function only returns the first occurrence, while looping with enumerate() lets you find every occurrence.
As a list comprehension:
[i for i, j in enumerate(["foo", "bar", "baz"]) if j == "bar"]
Here's also another small solution with itertools.count() (which is pretty much the same approach as enumerate):
from itertools import izip as zip, count  # izip for maximum efficiency (Python 2 only; in Python 3 the built-in zip is already lazy)
[i for i, j in zip(count(), ["foo", "bar", "baz"]) if j == "bar"]
This is more efficient for larger lists than using enumerate():
$ python -m timeit -s "from itertools import izip as zip, count" "[i for i, j in zip(count(), ['foo', 'bar', 'baz']*500) if j == 'bar']"
10000 loops, best of 3: 174 usec per loop
$ python -m timeit "[i for i, j in enumerate(['foo', 'bar', 'baz']*500) if j == 'bar']"
10000 loops, best of 3: 196 usec per loop
Why is it string.join(list) instead of list.join(string)?
This has always confused me. It seems like this would be nicer:
my_list = ["Hello", "world"]
print(my_list.join("-"))  # Produces: "Hello-world"
my_list = ["Hello", "world"]
print("-".join(my_list))  # Produces: "Hello-world"
Is there a specific reason it is like this?
It's because any iterable can be joined (e.g., list, tuple, dict, set), but its contents and the "joiner" must be strings.
"_".join(["welcome", "to", "stack", "overflow"])
"_".join(("welcome", "to", "stack", "overflow"))
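For instance, joining a dict joins its keys (assuming Python 3.7+, where dicts preserve insertion order), and a generator expression of strings works too:

```python
d = {"welcome": 1, "to": 2, "stack": 3, "overflow": 4}
print("_".join(d))  # iterates the keys -> welcome_to_stack_overflow

# A generator of strings is also fine:
print("_".join(w.upper() for w in ["stack", "overflow"]))  # -> STACK_OVERFLOW
```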
Using something other than strings will raise the following error:
TypeError: sequence item 0: expected str instance, int found
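The usual fix, sketched here, is to convert each element to a string first:

```python
nums = [1, 2, 3]
# "-".join(nums) raises TypeError: sequence item 0: expected str
# instance, int found. Convert each element explicitly instead:
print("-".join(str(n) for n in nums))  # -> 1-2-3
```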
This was discussed in the String methods... finally thread in the Python-Dev archive, and was accepted by Guido. This thread began in June 1999, and str.join was included in Python 1.6, which was released in September 2000 (and supported Unicode). Python 2.0 (which supported str methods including join) was released in October 2000.
There were four options proposed in this thread: str.join(seq), seq.join(str), seq.reduce(str), and join as a built-in function.
- Guido wanted to support not only lists and tuples, but all sequences/iterables.
- seq.reduce(str) is difficult for newcomers.
- seq.join(str) introduces an unexpected dependency from sequences to str/unicode.
- join() as a built-in function would support only specific data types. So using a built-in namespace is not good. If join() supports many datatypes, creating an optimized implementation would be difficult, and if implemented using the __add__ method it would be O(n²).
- The separator string (sep) should not be omitted. Explicit is better than implicit.
Here are some additional thoughts (my own, and my friend's):
- Unicode support was coming, but it was not final. At that time, UTF-8 was the most likely candidate to replace UCS2/4. To calculate the total buffer length of UTF-8 strings, one needs to know the character coding rule.
- At that time, Python had already decided on a common sequence interface rule where a user could create a sequence-like (iterable) class. But Python didn't support extending built-in types until 2.2. At that time it was difficult to provide a basic iterable class (which is mentioned in another comment).
Guido's decision is recorded in a historical mail, deciding on str.join(seq):
Funny, but it does seem right! Barry, go for it...
--- Guido van Rossum
So why is the join() method in the string class, instead of the list class? I agree it looks funny.
Historical note. When I first learned Python, I expected join to be a method of a list, which would take the delimiter as an argument. Lots of people feel the same way, and there's a story behind the join method. Prior to Python 1.6, strings didn't have all these useful methods. There was a separate string module which contained all the string functions; each function took a string as its first argument. The functions were deemed important enough to put onto the strings themselves, which made sense for functions like lower, upper, and split. But many hard-core Python programmers objected to the new join method, arguing that it should be a method of the list instead, or that it shouldn't move at all but simply stay a part of the old string module (which still has lots of useful stuff in it). I use the new join method exclusively, but you will see code written either way, and if it really bothers you, you can use the old string.join function instead.
--- Mark Pilgrim, Dive into Python