Resetting Generator Object in Python

Resetting generator object in Python

Another option is to use the itertools.tee() function to create a second version of your generator:

import itertools

y = FunctionWithYield()
y, y_backup = itertools.tee(y)
for x in y:
    print(x)
for x in y_backup:
    print(x)

This can be beneficial from a memory-usage point of view if the original iteration might not process all of the items.

Can iterators be reset in Python?

I see many answers suggesting itertools.tee, but that's ignoring one crucial warning in the docs for it:

This itertool may require significant auxiliary storage (depending on how much temporary data needs to be stored). In general, if one iterator uses most or all of the data before another iterator starts, it is faster to use list() instead of tee().

Basically, tee is designed for those situations where two (or more) clones of one iterator, while "getting out of sync" with each other, don't do so by much -- rather, they stay in the same "vicinity" (a few items behind or ahead of each other). It is not suitable for the OP's problem of "redo from the start".

L = list(DictReader(...)) on the other hand is perfectly suitable, as long as the list of dicts can fit comfortably in memory. A new "iterator from the start" (very lightweight and low-overhead) can be made at any time with iter(L), and used in part or in whole without affecting new or existing ones; other access patterns are also easily available.
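A minimal sketch of that approach, assuming a hypothetical people.csv file (the filename and fields are placeholders, not from the original question):

import csv

# Read everything once into a list of dicts (assumes it fits comfortably in memory).
with open("people.csv", newline="") as f:
    L = list(csv.DictReader(f))

# Fresh, independent "iterators from the start" can be made at any time.
first_pass = iter(L)
second_pass = iter(L)
print(next(first_pass))   # first row only
print(list(second_pass))  # all rows, unaffected by first_pass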

As several answers rightly remarked, in the specific case of csv you can also .seek(0) the underlying file object (a rather special case). I'm not sure that's documented and guaranteed, though it does currently work; it would probably be worth considering only for truly huge csv files, in which the list I recommend as the general approach would have too large a memory footprint.
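A minimal sketch of the seek(0) variant, again assuming the same hypothetical people.csv; recreating the DictReader after the seek keeps the header line from being returned as a data row:

import csv

f = open("people.csv", newline="")
reader = csv.DictReader(f)
for row in reader:
    pass  # first pass over the file

# Rewind the underlying file object and build a fresh reader.
f.seek(0)
reader = csv.DictReader(f)
for row in reader:
    pass  # second pass, starting again from the first data row
f.close()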

How to make a repeating generator in Python

Not directly. Part of the flexibility that allows generators to be used for implementing co-routines, resource management, etc., is that they are always one-shot. Once run, a generator cannot be re-run; you would have to create a new generator object.
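For instance, a minimal illustration of the one-shot behaviour (count_up is a hypothetical example generator, not from the question):

def count_up(n):
    yield from range(n)

g = count_up(3)
print(list(g))  # [0, 1, 2]
print(list(g))  # [] -- the generator is exhausted; a second pass yields nothing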

However, you can create your own class which overrides __iter__(). It will act like a reusable generator:

def multigen(gen_func):
    class _multigen(object):
        def __init__(self, *args, **kwargs):
            self.__args = args
            self.__kwargs = kwargs
        def __iter__(self):
            # Each iteration creates a fresh generator from the stored arguments.
            return gen_func(*self.__args, **self.__kwargs)
    return _multigen

@multigen
def myxrange(n):
    i = 0
    while i < n:
        yield i
        i += 1

m = myxrange(5)
print(list(m))  # [0, 1, 2, 3, 4]
print(list(m))  # [0, 1, 2, 3, 4] -- reusable, since each pass gets a new generator

Generator is empty after initial use

The problem is that you are trying to reuse a generator object. A generator cannot be reused once it has been exhausted; create a new one (or a fresh generator expression) instead:

# A generator expression is exhausted after one full iteration.
Aircraft_typelst = ['Boeing', 'Airbus', 'MiJ', 'goes']
upper_case_name = (name.upper() for name in Aircraft_typelst)
print(list(upper_case_name))

# Instead of reusing it, build the next step on a fresh generator expression:
reverse = (rev[::-1] for rev in (name.upper() for name in Aircraft_typelst))
print(list(reverse))

Output

['BOEING', 'AIRBUS', 'MIJ', 'GOES']
['GNIEOB', 'SUBRIA', 'JIM', 'SEOG']

Prevent TensorFlow Dataset from resetting the generator on multiple model.predict calls

I found the answer: store the generator in a variable and pass a lambda that returns it. Because the lambda always returns the same generator object, the dataset does not reset the generator between calls.

cur_gen = gen_predict(img_no)
dataset_pred = tf.data.Dataset.from_generator(
    lambda: cur_gen,
    ((tf.float32, tf.float32),),
    output_shapes=((tf.TensorShape([23, 23, 23]), tf.TensorShape([6, 1])),))
dataset_pred = dataset_pred.batch(BS)
for i in range(num_batches):
    temp_pred = np.array(model.predict(dataset_pred, batch_size=BS, steps=1))
    ## aggregate the temp_pred result ##

Python: Why does this generator not continue from yield?

You need to reuse the same generator object instead of creating a new one for each call; each next() then continues from the last yield:

primes = prime_generator()
print(next(primes))
print(next(primes))
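
For completeness, a minimal sketch of what prime_generator might look like (the name is from the question, but the trial-division implementation here is an assumption); each next() call resumes execution right after the previous yield:

def prime_generator():
    # Hypothetical implementation: yield primes one at a time by trial division.
    candidate = 2
    while True:
        if all(candidate % p for p in range(2, int(candidate ** 0.5) + 1)):
            yield candidate  # pauses here; the next next() resumes after this line
        candidate += 1

primes = prime_generator()
print(next(primes))  # 2
print(next(primes))  # 3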

