1

This code does not throw an exception:

int* a = new int[100];
try {
    a[101] = 1;   // out of bounds: valid indices are 0..99
}
catch (...) {
    // never called
}

However, if the index is much larger (a[11111111] = 1), the program crashes.

Is it possible to establish more stringent rules for handling such errors? (MSDN VC++)

Thank you!

pinkfloydx33

4 Answers

4

Unfortunately, C++ is designed around the idea that programmers make no mistakes, ever. Don't laugh... I'm serious.

In C++ there are no "runtime error angels" that make sure the code isn't doing anything stupid, like accessing an array element out of bounds or deallocating the same pointer twice. The main assumption is that programmers would never write code that does that, and that therefore wasting CPU time on checks (for example, array bounds on every access) is not a good idea.

There are exceptions thrown by the standard library, but only in cases where it's impossible for the programmer to know in advance whether the operation can succeed (like trying to allocate more memory than is available).

In other cases the assumption is that the code is correct and no checking is necessary. As a mere facility, however, a few parts of the standard library use the exception machinery to report failure, like std::vector::at or dynamic_cast.

Instead of "run-time error angels" C++ has "undefined behavior daemons" (a.k.a. "nasal daemons") that, instead of pointing you to the problem, can do anything they want if you ever violate any of the rules. Being malignant daemons, the behavior they love most is to keep silent in the dark and let the program run anyway, producing reasonable results... unless you're running the program in front of a vast audience that includes your parents and your boss (in that case they will make the screen look as embarrassing for you as possible).

The C++ dream is that moving all validity checks to compile time will save programs from run-time errors. Despite being an obvious delusion, this is what the language was designed around; the only way to program in C++ is to avoid all errors (a target that is not easy to hit, partly because the language is incredibly complex).

To add insult to injury, Microsoft decided that when a program does something so bad that even the operating system itself can see it's an obvious error, the reporting machinery should still be based on exception throwing. This allows writing programs that won't quit with a crash even if they access memory that is not within their address space. The ability to catch and even swallow (!) these errors is something that should never, ever be used, unless your definition of a "robust program" is "a program that is hard to kill".

This is what your catch(...) is doing...

6502
  • Writing correct, standard-compliant code in C++ catches almost all logic errors at compile time. This is hardly "unfortunate"; it's infinitely superior to runtime-checked languages like Java. It means your bugs are found by you, not your customers. – Richard Hodges Nov 01 '15 at 11:18
  • @RichardHodges: Code that uses index access can also perform out-of-bounds access. Using iterators is no better, as the language is also very annoying about iterator invalidation. Java is **both** compile-time checked and run-time checked, and I don't understand why you're making a dichotomy. Compile-time checks are good, but run-time checks are good too (except in performance-critical parts, where there should be a way to turn them off). Java is really not my language... but for other reasons; certainly not because it **also** has run-time checks. – 6502 Nov 01 '15 at 12:46
  • I guess it must be a philosophy difference then. I've never run a java program that didn't at some point report a null-pointer exception and die. perhaps I've just been unlucky :-) – Richard Hodges Nov 01 '15 at 12:54
  • @RichardHodges: at least it was a null-pointer exception. With C++ you instead get random numbers instead of invoice totals (when you're not lucky enough to get a segfault). – 6502 Nov 01 '15 at 13:00
  • that's why we never use raw pointers (or even smart pointers) without wrapping them in a little handle class with a controlled constructor though right? null pointers then become impossible at compile time. This is fundamental c++ code philosophy isn't it? – Richard Hodges Nov 01 '15 at 13:14
  • @6502: In C++, if we are talking about well-written code, there would likely not have been a pointer in the first place, but a reference. The omnipresence of `NullPointerException` in Java is caused by the language's lack of (real) references. – Christian Hackl Nov 01 '15 at 14:47
3

You seem to have two misconceptions about exceptions:

  1. C++ throws exceptions for all errors, like Java.
  2. C++ should be written by the programmer such that exceptions are thrown for all errors.

Both are indeed misconceptions. In C++, exceptions are rarely thrown by standard-library components or by the language itself, and they should be thrown only in situations where a problem is rare but out of your control, or where there is no other easy way to report an error.

An example for the former would be allocating more memory than the system is able to give you. This is a relatively rare problem with an external resource which you cannot prevent. Therefore, new will indeed throw a std::bad_alloc exception if you are running out of memory.

Examples for the latter are constructors and dynamic_casts with references.

Accessing an illegal index in a dynamically allocated array is not a good use case for exceptions, because you can easily prevent that error. If anything, it should be an assertion, because if you access an illegal index, then you have a programming error, i.e. a bug. Bugs cannot be fixed at run-time.

The usual C++ way of dealing with such low-level errors is undefined behaviour. In your example:

int* a = new int[100];
try {
    a[101] = 1;
}
catch (...) {
    // never called
}

The attempt to access a[101] invokes undefined behaviour, which means that the program is allowed to do anything. It may crash, it may appear to work fine, it may even throw an exception. It's all up to your compiler and your environment.

Living with undefined behaviour is not a very nice thing, of course, so it's understandable that you want to do something about it.

First of all, don't ever use new[]. Use std::vector instead. Illegal element access with std::vector is still undefined behaviour, but your compiler may introduce extra run-time checks such that the undefined behaviour results in an immediate crash, allowing you to fix the bug.

std::vector also has an at member function which throws an exception on a wrong index, but that's more of a design error than anything else, because at the std::vector level a wrong index indicates a problem at a higher abstraction level in your program logic.


The important thing is that you give up on the idea that you can "handle" all errors equally. Split your errors into three different categories:

  1. Bugs. A bug means that your code is wrong and that your program is not what you think it is. Use assert for those errors and/or rely on your compiler to introduce corresponding checks in C++ standard-library components. Wrong code should crash fast so that it can do as little harm as possible. Hint: MSVC has something called "debug versions". A "debug version" is a set of compiler and linker options which enables a lot of extra run-time checks to help you find the bugs in your code.

  2. Wrong input. All programs receive wrong input. You must never assume that input is correct. It does not matter if the input comes from a human being or from a machine. Wrong input must be part of normal program flow. Don't use assert or exceptions for wrong input.

  3. Exceptional status of external resource. This is what exceptions should be used for. Every program relies on external resources in one form or another, usually provided by the operating system. The major external resource is memory. Something like std::vector<int> x(100); normally does not fail, but in theory, it could, because it requires memory. Starting a new thread normally does not fail, but in theory, the operating system may not be able to start one. Those are exceptional situations, so exceptions are a good way to handle them.

These are rough guidelines, of course. It is especially hard to draw an exact line between wrong input and problems with external resources.

Still, here is an example that attempts to summarise the guidelines:

#include <iostream>
#include <vector>
#include <cstdlib>
#include <cassert>

void printElement(std::vector<int> const& v, int index)
{
    // ----------------
    // Error category 1
    // ----------------
    assert(index >= 0);
    assert(index < static_cast<int>(v.size()));
    std::cout << v[index] << "\n";
}

int main()
{
    std::cout << "Enter size (10-100): ";
    int size = 0;
    std::cin >> size;
    if (!std::cin || (size < 10) || (size > 100))
    {
        // ----------------
        // Error category 2
        // ----------------
        std::cerr << "Wrong input\n";
        return EXIT_FAILURE;
    }

    try
    {
        std::vector<int> v(size);
        printElement(v, 0);
    }
    catch (std::bad_alloc const&)
    {
        // ----------------
        // Error category 3
        // ----------------
        std::cerr << "Out of memory\n";
        return EXIT_FAILURE;
    }
}
Christian Hackl
  • Thank you very much!!! But what about performance? I'm working with huge image buffers, like for(int i=0;i<5000*5000*4;i++){buf[i]=r1*a+(1-a)*r2;} If I replace buf[i] with buf.at(i), how big is the performance loss? – user3226859 Nov 06 '15 at 09:19
  • 1
    @user3226859: Performance usually cannot be predicted in any serious way. C++ as a programming language does not dictate how fast things run. You must measure both versions with your real data. Generally, it's a myth that exceptions cause performance problems. The major problem with `at` is **not** performance but program logic. I strongly recommend `[]`. – Christian Hackl Nov 06 '15 at 11:56
  • @user3226859: P.S.: It's also important not to compare apples and oranges. If you need exceptions for your program logic (which is not the case for an illegal vector index), then without exceptions you'd need an entirely different approach, which brings its own share of possible performance problems (or not -- you must always measure). – Christian Hackl Nov 06 '15 at 12:00
1

You should use std::vector and its at() member function, which guarantees bounds checking:

#include <iostream> 
#include <vector>

int main()
{
    std::vector<int> a(100);
    try{
        int index = 101;
        a.at(index) = 1;
        std::cout << a.at(index) << std::endl;
    }
    catch (const std::out_of_range& ex) {
        std::cerr << "Out of Range error: " << ex.what() << std::endl;
    }
    return 0;
}
maxteneff
1

One answer to your question is that C++ will not stop you from writing harmful code. Your code is writing into memory that you don't know anything about. When you set the index very high, you probably write to a location that your application is not allowed to touch, and that is why it fails.

In MS Visual C++ you can set different warning levels; some of these will warn you about unsafe code, so you can fix it or make it safer before you ship.

You can also try other tools that will check your code. (I have used: http://cppcheck.sourceforge.net/)

Thor