diff --git a/doc/index.rst b/doc/index.rst
index f759833..e1ddb79 100644
--- a/doc/index.rst
+++ b/doc/index.rst
@@ -292,7 +292,7 @@ An instantiated pool object is an iterable derived from :class:`Stream` and
 represents the output values. The returned iterator behaves as follow: their
 :func:`next` calls return as soon as a next output value is available, or
 raise :exc:`StopIteration` if there is no more output. A pool object can also be
-futher piped.
+further piped.
 
 If an input `value` causes an :exc:`Exception` to be raised in the worker
 thread/process, the tuple `(value, exception)` is put into the pool's
diff --git a/stream.py b/stream.py
index 76d04c9..0f8a379 100644
--- a/stream.py
+++ b/stream.py
@@ -48,12 +48,12 @@
 If the order of processing does not matter, an ThreadPool or ProcessPool
 can be used. They both utilize a number of workers in other theads or
 processes to work on items pulled from the input stream. Their output
-are simply iterables respresented by the pool objects which can be used in
+are simply iterables represented by the pool objects which can be used in
 pipelines. Alternatively, an Executor can perform fine-grained, concurrent
 job control over a thread/process pool.
 
 Multiple streams can be piped to a single PCollector or QCollector, which
-will gather generated items whenever they are avaiable. PCollectors
+will gather generated items whenever they are available. PCollectors
 can collect from ForkedFeeder's or ProcessPool's (via system pipes) and
 QCollector's can collect from ThreadedFeeder's and ThreadPool's (via queues).
 PSorter and QSorter are also collectors, but given multiples sorted input
@@ -201,7 +201,7 @@ def __repr__(self):
 
 
 class take(Stream):
-    """Take the firts n items of the input stream, return a Stream.
+    """Take the first n items of the input stream, return a Stream.
 
     >>> seq(1, 2) >> take(10)
     Stream([1, 3, 5, 7, 9, 11, 13, 15, 17, 19])
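
Not part of the patch: a minimal sketch reviewers could use to sanity-check the corrected docstring. It assumes only the names visible in the hunks above (seq, take, the >> pipe, and the fact that a Stream/pool object is iterable); the variable name first_ten is illustrative.

    # Reviewer sketch, not part of the patch. Uses only names visible in the
    # hunks above: seq, take, the >> pipe, and Stream being iterable.
    from stream import seq, take

    # seq(1, 2) yields 1, 3, 5, ...; take(10) keeps its first ten items,
    # matching the corrected doctest.
    first_ten = seq(1, 2) >> take(10)
    print(first_ten)        # Stream([1, 3, 5, 7, 9, 11, 13, 15, 17, 19])

    # Because the result is an iterable Stream, it can be consumed directly
    # (or piped further, as the doc/index.rst paragraph above describes).
    print(list(first_ten))  # [1, 3, 5, 7, 9, 11, 13, 15, 17, 19]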