Using an Iterator to Divide an Array into Parts with Unequal Size


The reason this is prohibited is covered well in your other question, Are iterators past the "one past-the-end" iterator undefined behavior?, so I'll just address improved solutions.

For random-access iterators (which you must have if you are using <), there's no need whatsoever for the expensive modulo operation.

The salient points are that:

  • it + stride fails when it nears the end
  • end() - stride fails if the container contains too few elements
  • end() - it is always legal

From there, it's simple algebraic manipulation to change it + stride < end() into a legal form (subtract it from both sides).

The final result, which I have used many times:

for( auto it = c.cbegin(), end = c.cend(); end - it >= stride; it += stride )

The compiler is free to optimize that back into a comparison against a precomputed end - stride * sizeof(*it) if the memory model is flat; the limitations on C++ behavior don't apply to the primitive operations into which the compiler translates C++.

You may of course use std::distance(it, end) if you prefer to use the named functions instead of operators, but that will only be efficient for random-access iterators.
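
For instance, a minimal sketch of the same loop spelled with the named functions (the container c and the chunk size stride are assumptions carried over from the loop above):

#include <cstddef>   // std::ptrdiff_t
#include <iterator>  // std::distance

// Equivalent chunking loop using std::distance; only efficient when the
// iterators are random-access, since std::distance is O(n) otherwise.
for (auto it = c.cbegin(), end = c.cend();
     std::distance(it, end) >= static_cast<std::ptrdiff_t>(stride);
     it += stride)
{
    // process the full chunk [it, it + stride)
}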

For use with forward iterators, you should use something that combines the increment and termination conditions like

// Wrapper whose implicit conversion from size_t makes this overload the
// less-preferred one when both overloads are viable.
struct less_preferred { size_t value; less_preferred(size_t v) : value(v) {} };

template<typename Iterator>
bool try_advance( Iterator& it, less_preferred step, Iterator end )
{
    while (step.value--) {
        if (it == end) return false;
        ++it;
    }
    return true;
}

With this additional overload, you'll get efficient behavior for random-access iterators:

template<typename RandomIterator>
auto try_advance( RandomIterator& it, size_t stride, RandomIterator end )
    -> decltype(end - it < stride) // SFINAE: only valid when end - it exists
{
    if (end - it < stride) return false;
    it += stride;
    return true;
}
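
As a usage sketch (the std::list, the stride of 3, and the printing below are my own assumptions, not part of the original answer), the same call now works for both iterator categories; with a std::list the forward-iterator overload is selected because end - it is not available:

#include <cstddef>
#include <iostream>
#include <list>

// Assumes the two try_advance overloads above are in scope.
int main()
{
    std::list<int> c{1, 2, 3, 4, 5, 6, 7, 8};
    const std::size_t stride = 3;

    auto first = c.cbegin();
    auto next  = first;
    while (try_advance(next, stride, c.cend())) {
        // [first, next) is a full chunk of `stride` elements.
        for (auto it = first; it != next; ++it)
            std::cout << *it << ' ';
        std::cout << '\n';
        first = next;
    }
    // Whatever remains in [first, c.cend()) is the final, shorter chunk.
    for (auto it = first; it != c.cend(); ++it)
        std::cout << *it << ' ';
    std::cout << '\n';   // prints 1 2 3, then 4 5 6, then 7 8
}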

How do I chunk an array with different sizes?

Using a while loop here and adding an extra parameter, say chunksizes, you can then cycle through the chunk sizes with the modulo operator on each iteration.

I've also switched to a while loop, as I think it fits more naturally here than a for loop.

For example:

const test = [1,2,3,4,5,6,7,8,9,10,11,12,13];
const chunk = (cards, chunksizes) => {
  const chunkArray = [];
  let cc = 0, i = 0;
  while (i < cards.length) {
    const csize = chunksizes[cc];
    chunkArray.push(cards.slice(i, i + csize));
    cc = (cc + 1) % chunksizes.length;
    i += csize;
  }
  return chunkArray;
};
console.log(chunk(test, [4,1]));

How do I split a list into equally-sized chunks?

Here's a generator that yields evenly-sized chunks:

def chunks(lst, n):
    """Yield successive n-sized chunks from lst."""
    for i in range(0, len(lst), n):
        yield lst[i:i + n]

import pprint
pprint.pprint(list(chunks(list(range(10, 75)), 10)))
[[10, 11, 12, 13, 14, 15, 16, 17, 18, 19],
 [20, 21, 22, 23, 24, 25, 26, 27, 28, 29],
 [30, 31, 32, 33, 34, 35, 36, 37, 38, 39],
 [40, 41, 42, 43, 44, 45, 46, 47, 48, 49],
 [50, 51, 52, 53, 54, 55, 56, 57, 58, 59],
 [60, 61, 62, 63, 64, 65, 66, 67, 68, 69],
 [70, 71, 72, 73, 74]]

For Python 2, using xrange instead of range:

def chunks(lst, n):
    """Yield successive n-sized chunks from lst."""
    for i in xrange(0, len(lst), n):
        yield lst[i:i + n]

Below is a list comprehension one-liner. The method above is preferable, though, since using named functions makes code easier to understand. For Python 3:

[lst[i:i + n] for i in range(0, len(lst), n)]

For Python 2:

[lst[i:i + n] for i in xrange(0, len(lst), n)]

How to divide a Python list into sublists of unequal length?

Yet another solution:

list1 = [1,2,1]
list2 = ["1.1.1.1","1.1.1.2","1.1.1.3","1.1.1.4"]

chunks = []
count = 0
for size in list1:
    chunks.append([list2[i + count] for i in range(size)])
    count += size
print(chunks)

# [['1.1.1.1'], ['1.1.1.2', '1.1.1.3'], ['1.1.1.4']]

Splitting a list into uneven groups?

This solution keeps track of how many items have been written so far. Note that if the numbers in second_list add up to more than the length of mylist, the later chunks simply come out shorter or empty, since slicing past the end of a list does not raise an error.

total = 0
listChunks = []
for j in range(len(second_list)):
    chunk_mylist = mylist[total:total + second_list[j]]
    listChunks.append(chunk_mylist)
    total += second_list[j]

After running this, listChunks is a list of sublists whose lengths match the numbers in second_list. For example, if mylist = [1, 2, 3, 4, 5, 6] and second_list = [1, 2, 3], then listChunks ends up as [[1], [2, 3], [4, 5, 6]].

How to split an iterable into constant-size chunks

This is probably more efficient (faster)

def batch(iterable, n=1):
    # Note: despite the name, this needs a sequence (len() and slicing),
    # not an arbitrary iterable.
    l = len(iterable)
    for ndx in range(0, l, n):
        yield iterable[ndx:min(ndx + n, l)]

for x in batch(list(range(0, 10)), 3):
    print(x)

Example using list

data = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10] # list of data 

for x in batch(data, 3):
    print(x)

# Output

[0, 1, 2]
[3, 4, 5]
[6, 7, 8]
[9, 10]

Because batch is a generator, it avoids building the whole list of chunks up front.

Python: split array into sub-arrays of equivalent rank

For X sublists:

One possibility would be:

def get_sublists(original_list, number_of_sub_list_wanted):
    sublists = list()
    for sub_list_count in range(number_of_sub_list_wanted):
        sublists.append(original_list[sub_list_count::number_of_sub_list_wanted])
    return sublists

You can then unpack the sub-lists stored in sublists.

For example:

a = [5,4,3,2,1,0]
x1, x2 = get_sublists(a, 2)

will give you the expected output.

This is the trivial solution. There is probably something more Pythonic in itertools or another library.

If you don't understand how this code works, take a look at the documentation on slicing.


