Why Is the Time Complexity of Python's list.append() Method O(1)?

Why is the time complexity of Python's list.append() method O(1)?

If you look at the footnote in the document you linked, you can see that they include a caveat:

These operations rely on the "Amortized" part of "Amortized Worst
Case". Individual actions may take surprisingly long, depending on the
history of the container.

Using amortized analysis, even if we occasionally have to perform expensive operations, we can get an upper bound on the 'average' cost of operations when we consider them as a sequence, instead of individually.

So any individual operation could be very expensive - O(n) or worse - but since we know these expensive operations are rare, we can guarantee that a sequence of n operations can be done in O(n) time overall.
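A quick way to see this in practice is to watch a list's allocated size as it grows. Note this relies on CPython's over-allocation strategy, which is an implementation detail, not a language guarantee:

```python
import sys

seq = []
sizes = []
for i in range(64):
    seq.append(i)
    sizes.append(sys.getsizeof(seq))  # shallow size, including spare capacity

# The allocated size stays flat for long stretches, then jumps: most
# appends are cheap O(1) writes into spare capacity, and only the
# occasional append triggers an O(n) reallocate-and-copy.
resizes = sum(1 for a, b in zip(sizes, sizes[1:]) if b > a)
print(f"{len(seq)} appends triggered only {resizes} reallocations")
```

Because the capacity grows geometrically, the number of reallocations is far smaller than the number of appends, which is exactly what makes the average cost O(1).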

Is a.insert(0, x) an O(n) function? Is a.append an O(1) function in Python?

Inserting at index 0 of a list requires shifting every other element along, which makes it an O(n) operation. If you use a collections.deque instead, prepending is O(1).

append is an amortized O(1) operation, since it simply writes the item to the end of the list and no shifting is done. Occasionally the underlying array needs to grow, so an individual append is not always O(1).
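The contrast between the two containers can be shown directly. deque is a doubly-linked structure of fixed-size blocks with direct access to both ends, so prepending never shifts existing elements:

```python
from collections import deque

# list: insert at index 0 shifts every element one slot right - O(n)
lst = [1, 2, 3]
lst.insert(0, 0)
print(lst)  # [0, 1, 2, 3]

# deque: appendleft writes to the left end without shifting - O(1)
d = deque([1, 2, 3])
d.appendleft(0)
print(d)  # deque([0, 1, 2, 3])
```

The trade-off is that a deque gives up O(1) indexed access in the middle, so it only pays off when you mostly work at the ends.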

Time complexity of appending a list to a list

"Appending" a list to a list is still O(1): you are just adding a reference to the other list. "Extending" with a list of size m costs O(m), since every one of its m elements must be traversed and appended individually.

seq = [1, 2, 3]
new_seq = [4, 5, 6]

# Appending is O(1)
seq.append(new_seq)
print(seq) # [1, 2, 3, [4, 5, 6]]

seq = [1, 2, 3]
new_seq = [4, 5, 6]

# Extending is O(m), m = len(new_seq)
seq.extend(new_seq)

print(seq) # [1, 2, 3, 4, 5, 6]

list_ = []
list_.append([_ for _ in range(0,n)])

You take O(n) time to create the list [_ for _ in range(0,n)], then O(1) time to append a reference to that list to list_. That is O(n) in total for the line list_.append([_ for _ in range(0,n)]).
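Since append stores a reference rather than a copy, this is also observable: mutating the inner list afterwards is visible through the outer one. A small sketch:

```python
inner = [4, 5, 6]
outer = [1, 2, 3]
outer.append(inner)  # O(1): stores a reference, not a copy

# Mutating `inner` shows up inside `outer`, proving no copy was made
inner.append(7)
print(outer)  # [1, 2, 3, [4, 5, 6, 7]]
print(outer[3] is inner)  # True
```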

Time complexity for adding elements to a list vs. a set in Python

Both operations are O(1) amortized time-complexity.

Appending elements to a list has a lower coefficient because it does not need to hash the element first, nor does it need to check/handle hash collisions.

In the case of adding x to a set, Python must compute hash(x) first, because storing elements by their hashes is what gives sets fast O(1) membership checks (compared to O(n) membership checks for lists).
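One consequence of this hashing requirement is easy to demonstrate: list.append accepts any object by reference, while set.add rejects unhashable objects such as lists.

```python
lst = []
lst.append([1, 2])  # fine: a list stores any object by reference

s = set()
s.add("x")          # fine: str is hashable

try:
    s.add([1, 2])   # lists are mutable and therefore unhashable
except TypeError as e:
    print("set.add failed:", e)
```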


