
If we have std::set and std::vector which can grow and shrink dynamically, why do we need linked lists?

N.B. I really don't understand why there are so many down-votes. Down-voters, please leave comments.

user366312
  • C++ offers many [containers](http://www.cplusplus.com/reference/stl/) with various complexity trade-offs. So `std::set` can test membership in logarithmic time, etc. (and you could implement your own more sophisticated data structures). Linked lists have constant-time append operations. – Basile Starynkevitch Jun 27 '15 at 21:47
  • The top-rated answer in this question https://stackoverflow.com/questions/393556/when-to-use-a-linked-list-over-an-array-array-list should answer this. – Saul Jun 29 '15 at 11:22
  • I would say that the downvotes are coming from the fact that this question has been asked and answered many times. See: [In which scenario do I use a particular STL Container?](http://stackoverflow.com/questions/471432/in-which-scenario-do-i-use-a-particular-stl-container) – NathanOliver Jun 30 '15 at 16:34
  • This answer is simple and satisfactory, I think: http://stackoverflow.com/a/393571/3025112 – omerfarukdogan Jul 04 '15 at 15:56

5 Answers


why do we need linked lists?

A linked list has O(1) insert, removal, concatenation, etc. The list grows big? So what — the system keeps running normally. Just plan on not needing random access.

A vector grows in (typically) doubling chunks, so the cost of a single reallocation becomes fairly big, and fairly quickly. It is, admittedly, a rare time hit.

In the following output from an older Dell (g++ 4.9.2 on Ubuntu 15.04), the last column is the time, in milliseconds, since the previous capacity change. (That last column was hastily added to re-use the example for you ;)

     element         vector    bytes stack use          bytes heap use   ms since last
       count       capacity   'sizeof(vector)'  (count * element size)  capacity change
          0              0                24                       0         0
          1              1                24                       4         0
          2              2                24                       8         0
          3              4                24                      12         0
          5              8                24                      20         0
          9             16                24                      36         0
         17             32                24                      68         0
         33             64                24                     132         0
         65            128                24                     260         1
        129            256                24                     516         0
        257            512                24                   1,028         0
        513          1,024                24                   2,052         0
      1,025          2,048                24                   4,100         0
      2,049          4,096                24                   8,196         0
      4,097          8,192                24                  16,388         1
      8,193         16,384                24                  32,772         2
     16,385         32,768                24                  65,540         3
     32,769         65,536                24                 131,076         8
     65,537        131,072                24                 262,148        12
    131,073        262,144                24                 524,292        28
    262,145        524,288                24               1,048,580        50
    524,289      1,048,576                24               2,097,156        38
  1,048,577      2,097,152                24               4,194,308        76
  2,097,153      4,194,304                24               8,388,612       149
  4,194,305      8,388,608                24              16,777,220       300
  8,388,609     16,777,216                24              33,554,436       601
 16,777,217     33,554,432                24              67,108,868      1215
 33,554,433     67,108,864                24             134,217,732      2475
 67,108,865    134,217,728                24             268,435,460      4982
134,217,729    268,435,456                24             536,870,916     10247
268,435,457    536,870,912                24           1,073,741,828     21387

Is this enough justification? I'm not sure.

I use vectors for my lists (which almost never grow to gigabyte size, and I add a call to reserve when I find out the max size), for my FIFOs (processing files and dirs created millions of strings), and generally ignore std::list, because how I use vectors is fast enough (oops, can't believe I just said that out loud).

The last entry in this table is when the vector capacity was bumped to 512 Mi elements (from half that). The allocation, data copy, and deallocation (with no-op dtors of the small objects), but with NO more push_back()s, resulted in 1 GB on the heap, and took 21 seconds.

(I think I no longer wish to have 8 Gig on this old machine.)


edit - 7/1/2015

I reran this code to regenerate the last 2 lines of the table and capture wall clock info.

134,217,729    268,435,456                24             536,870,916      9769
268,435,457    536,870,912                24           1,073,741,828     20983

real 0m41.147s
user 0m39.928s
sys  0m1.164s

The wall clock time reports 41 seconds.

It took 21 seconds to produce the last output line (measured by a realtime clock in ms), and 20 seconds to produce the first 29 lines ... isn't that just exponential growth!


edit - 7/3 2015 - thought some comments about set vs linked-list are needed for the OP.

If we have std::set [and ...] which can grow and shrink dynamically, why do we need linked lists?

From cppreference.com:

std::set is an associative container that contains a sorted set of unique objects of type Key. Sorting is done using the key comparison function Compare. Search, removal, and insertion operations have logarithmic complexity. Sets are usually implemented as red-black trees.

On the one hand, a set (which I have rarely used) is somewhat like a linked list in that there are nodes (when implemented as a red-black tree), and each node is allocated independently. When the set grows big, the system keeps running relatively normally (consistently?), at least until the nodes wade into the swap (I said swap, not swamp).

IMHO, a set (rb-tree) has no 'work backlog'. Instead, the tree re-balances with a small per-insert/erase effort. (I've not been able to tease out evidence of this with one small experiment ... sorry, no table to look at.)

When you review the red-black tree you will find that an rb-tree node carries no payload beyond the key (and a color).

I can imagine a linked list with only the key and color in the data (like the set). Software is remarkably flexible. But my imagination now fails me.

I very much appreciate the idea of an associative container. I found std::map to be remarkably useful, and it has affected how I think about several kinds of software challenges.

But I have not investigated how the map accomplishes its magic. More to study.

2785528

Besides the constant insert/remove/splice time, linked lists provide another useful property: iterators are not invalidated when the list is altered (except, of course, iterators that point to deleted elements). Thus, if you have a linked list, you can keep an iterator around as a pointer into the list.

Though you can get a similar effect with a vector by storing the index of an element, using a list and storing an iterator has the advantage that, no matter what happens to the list, as long as the element still exists, that iterator will always point to the same element. If you used a vector, you'd need to fix up the index if an element earlier in the vector was removed.

celticminstrel

Although all the answers given as yet are correct, let me try to explain it in words suitable for a C++ beginner.

The advantage of a std::vector (or an array in general) is that you can access any of its elements via an index in constant time. One of its weaknesses, however, is that it is relatively costly to insert an element near the beginning or middle: all the elements following the inserted one have to be shifted to their new positions, and if there is no spare capacity, more storage has to be allocated and the entire array copied to a new location.

In contrast, accessing the 1000th element of a list is relatively slow, as we have to follow 999 links to the next element of the list. However, inserting a new element at that position is relatively easy and fast. We just have to allocate space for the new element and modify a couple of pointers.

You see, vectors have their specific strengths and so do lists.

nv3

Because in certain situations we need O(1) insert/erase performance (given an iterator to the position). We can't achieve that with set or vector.

+========+=========+=========+=========+===============+
|        | Insert  | Erase   | Lookup  | Iterator      |
+========+=========+=========+=========+===============+
| vector | O(N)    | O(N)    | O(1)    | Random Access |
+--------+---------+---------+---------+---------------+
| set    | O(logN) | O(logN) | O(logN) | Sequential    |
+--------+---------+---------+---------+---------------+
| list   | O(1)    | O(1)    | O(N)    | Sequential    |
+--------+---------+---------+---------+---------------+
Kazuki Sakamoto