Take the following example, which is from an article by Bjarne Stroustrup that I learned about from Yakk a while back on these forums. Suppose I have a C++ vector and a C++ list, and I perform the following operation on each. First, I generate a bunch of random numbers; for each one, I find the place in the sequence where it belongs in sorted order and insert it there. Finding that place has to be a linear scan for the list, so to keep the comparison fair I do a linear scan of the vector as well rather than a binary search. Then, in the same order I generated the numbers, I remove them from the sequence, again doing a linear scan to find each one before removing it.

In both cases, the "find the right place" step is linear in the current size of the sequence. For the vector, the insert/delete operation itself is also linear in the size of the sequence (elements after the insertion point must be shifted). For the list, the insert/delete operation is constant time.

So, the question is: at around what number of elements do you expect that the additional time spent for the linear insert/delete in the vector will cause the vector to fall behind in performance relative to the list? Answer in the spoiler.

**Spoiler:**

(I should also say that I carried out this test myself, using a few different configurations. My results agree with Stroustrup's, and seem to be pretty robust to changes in the exact process, including variants in which all of the list's nodes end up allocated in sorted order, so that the node layout in memory matches the traversal order.)