
So, my code is shown below. I've been implementing a very simple HashMap; the HashEntry class is also defined simply.

Now I'm not super experienced with C++, but let's say `new`, `delete`, and `malloc`/`free`/`realloc`/etc. all cause my system to crash and the code needs to be adapted. I'm really not sure how to handle the constructor and destructor without those tools.

I know that with some things, a string for example, I can initialize like this: `string program(sizeVariable)`, but I don't really see how to do that with this line: `table = new HashEntry*[TABLE_SIZE];` I'm also lost as to how to handle the loss of `delete` in the destructor.

Any answers or advice appreciated; also forgive the formatting, it doesn't look that ugly in my IDE.

class HashMap {
    private:
        HashEntry **table;
    public:
        HashMap()
        {
            // allocate the bucket array and null every slot
            table = new HashEntry*[TABLE_SIZE];
            for (int i = 0; i < TABLE_SIZE; i++)
                table[i] = NULL;
        }
    ...
        ~HashMap()
        {
            // delete each entry, then the bucket array itself
            for (int i = 0; i < TABLE_SIZE; i++)
                if (table[i] != NULL)
                    delete table[i];
            delete[] table;
        }
};
Patrick Johnston
  • Related, why you're not using a `std::vector` of smart pointers and decent exception handling is worth pondering. – WhozCraig Dec 18 '17 at 19:19
  • The posted code looks ok to me. However, you should check out [The Rule of Three](https://stackoverflow.com/questions/4172722/what-is-the-rule-of-three). – R Sahu Dec 18 '17 at 19:21
  • @RSahu: how can that look good to you? Manual memory management in C++ is a code smell, especially for something as simple as a 2D array. Also, the rule of zero is the recommendation. If you are doing manual memory management, you need to start with the rule of five, not the rule of three. – Vittorio Romeo Dec 18 '17 at 19:22
  • @VittorioRomeo, good not in the programming-practice sense, but in the correctness sense. – R Sahu Dec 18 '17 at 19:22
  • OT: delete handles nullptr correctly, you don't need to implement a test yourself. –  Dec 18 '17 at 19:29
  • Also you should just value initialize on allocation instead of having the loop set to null – M.M Dec 18 '17 at 19:35
  • This is a theoretical restriction, those commands don’t actually crash my compiler or anything. There are also other odd specifications to this exercise that I haven’t listed that dictate why I wrote it the way I did, but the feedback is good nonetheless. – Patrick Johnston Dec 18 '17 at 19:54
  • For that *specific* code, if this is some sort of static exercise, then that seemingly fixed hashmap could simply be non-dynamic. I.e. a member variable, `HashEntry* table[TABLE_SIZE];` would eliminate the allocation and deallocation. You would still need to perform the null fill and proper loop cleanup, however. And the `delete table[i];` would seem unavoidable without a static pool of `HashEntry` from which to reap (such as linear probing and delta for collision bleed-over). – WhozCraig Dec 18 '17 at 20:55

1 Answer

> new, delete and malloc/free/realloc/etc all

You should almost never use any of those in C++. Manual memory management is pretty much dead for day-to-day C++ development. There are many better alternatives (a sketch using them follows the list):

  • Smart pointers such as unique_ptr and shared_ptr.

  • Containers such as std::vector or std::array.

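For the table in the question, a minimal sketch using those tools might look like the following; `TABLE_SIZE`, the fields of `HashEntry`, and the `put` helper are assumptions made for illustration, not your actual code:

```cpp
#include <array>
#include <cstddef>
#include <memory>

constexpr std::size_t TABLE_SIZE = 128;  // assumed value, not from the question

struct HashEntry {                       // assumed layout; the question only says it is "defined simply"
    int key{};
    int value{};
};

class HashMap {
private:
    // Each slot owns its entry; empty slots are simply null. No new/delete anywhere.
    std::array<std::unique_ptr<HashEntry>, TABLE_SIZE> table{};

public:
    void put(int key, int value) {
        const std::size_t index = static_cast<std::size_t>(key) % TABLE_SIZE;
        table[index] = std::make_unique<HashEntry>(HashEntry{key, value});  // collisions ignored for brevity
    }
    // No user-written destructor: each unique_ptr frees its entry automatically.
};
```

This also follows the rule of zero mentioned in the comments: the compiler-generated destructor and move operations do the right thing, and copying is disabled automatically because `unique_ptr` is move-only.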
You need to read a good Modern C++ introductory book. Start with "A Tour of C++".


If you are doing manual resource management (...why?), you also want to follow the rule of five.
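As a rough illustration only (a toy owning class, not your HashMap), the five special members that rule covers might look like this:

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>

// Illustrative owning class; the name and member are invented for this sketch.
class Buffer {
    std::size_t size = 0;
    int *data = nullptr;

public:
    explicit Buffer(std::size_t n) : size(n), data(new int[n]()) {}

    ~Buffer() { delete[] data; }                          // 1. destructor

    Buffer(const Buffer &other)                           // 2. copy constructor
        : size(other.size), data(new int[other.size]()) {
        std::copy(other.data, other.data + size, data);
    }

    Buffer &operator=(const Buffer &other) {              // 3. copy assignment (copy-and-swap)
        Buffer tmp(other);
        std::swap(size, tmp.size);
        std::swap(data, tmp.data);
        return *this;
    }

    Buffer(Buffer &&other) noexcept                       // 4. move constructor
        : size(std::exchange(other.size, 0)),
          data(std::exchange(other.data, nullptr)) {}

    Buffer &operator=(Buffer &&other) noexcept {          // 5. move assignment
        std::swap(size, other.size);
        std::swap(data, other.data);
        return *this;
    }
};
```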

Vittorio Romeo