Assume I want to add up either Arabic numerals (1+2) or Roman numerals (I+II), and I use an interpreter pattern that looks something like this:
(code derived from here: https://en.wikibooks.org/wiki/C%2B%2B_Programming/Code/Design_Patterns#Interpreter)
#include <string>

// Abstract expression; every node of the syntax tree implements interpret().
struct Expression {
    virtual ~Expression() {}
    virtual int interpret() = 0;
};

// Terminal expression for an Arabic numeral such as 1 or 2.
class ArabicNumber : public Expression {
private:
    int number;
public:
    ArabicNumber(int number) { this->number = number; }
    int interpret() override { return number; }
};

// Terminal expression for a Roman numeral such as I or II.
class RomanNumber : public Expression {
private:
    std::string number;
public:
    RomanNumber(std::string number) { this->number = number; }
    int interpret() override {
        // somehow convert the Roman numeral string to an int
        return 0; // placeholder
    }
};

// Non-terminal expression: adds the results of its two operands.
class Plus : public Expression {
    Expression* leftOperand;
    Expression* rightOperand;
public:
    Plus(Expression* left, Expression* right) {
        leftOperand = left;
        rightOperand = right;
    }
    ~Plus() {
        delete leftOperand;
        delete rightOperand;
    }
    int interpret() override {
        return leftOperand->interpret() + rightOperand->interpret();
    }
};
How do I ensure that the erroneous query (1+II) is handled properly? The only solution I could think of was to somehow use casting, but that doesn't sound like an elegant solution. Or should the pattern not be used that way?
Of course, one option would be to write two separate functions for this, but I'm curious whether it could be done in one, because I would like to use this pattern for a more complex context-free grammar.
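For concreteness, the casting workaround I had in mind looks roughly like this (only a sketch; CheckedPlus is just a name I'm using for illustration, and the exception is a stand-in for whatever error reporting would actually be appropriate):

#include <stdexcept>

// Sketch of the casting idea: refuse to mix number systems by checking
// the dynamic type of both operands before evaluating them.
class CheckedPlus : public Expression {
    Expression* leftOperand;
    Expression* rightOperand;
public:
    CheckedPlus(Expression* left, Expression* right) {
        leftOperand = left;
        rightOperand = right;
    }
    ~CheckedPlus() {
        delete leftOperand;
        delete rightOperand;
    }
    int interpret() override {
        bool leftIsRoman  = dynamic_cast<RomanNumber*>(leftOperand)  != nullptr;
        bool rightIsRoman = dynamic_cast<RomanNumber*>(rightOperand) != nullptr;
        if (leftIsRoman != rightIsRoman) {
            throw std::runtime_error("cannot mix Arabic and Roman numbers");
        }
        return leftOperand->interpret() + rightOperand->interpret();
    }
};

This catches the direct 1+II case, but it only inspects the immediate operands (a nested Plus on either side passes the check), and every operator class would need the same boilerplate, which is why it doesn't feel like the right design to me.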
Edit: My problem is also described here. I quote the relevant section:
However, introducing a language and its accompanying grammar also requires fairly extensive error checking for misspelled terms or misplaced grammatical elements.
So my main question: How to best design that extensive error checking?
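To make the failure case concrete: nothing in the classes above stops me from building the mixed tree, so the error would have to be caught either while parsing the input or inside interpret() itself. A minimal reproduction (assuming RomanNumber::interpret actually did the conversion):

#include <iostream>

int main() {
    // Intended use: 1 + 2 and I + II each work fine on their own.
    Expression* arabic = new Plus(new ArabicNumber(1), new ArabicNumber(2));
    Expression* roman  = new Plus(new RomanNumber("I"), new RomanNumber("II"));

    // The erroneous query: 1 + II builds and interprets without any
    // complaint; it just silently produces a meaningless result.
    Expression* mixed  = new Plus(new ArabicNumber(1), new RomanNumber("II"));

    std::cout << arabic->interpret() << "\n"; // 3
    std::cout << roman->interpret()  << "\n"; // 3, once the conversion exists
    std::cout << mixed->interpret()  << "\n"; // nonsense, but no error

    delete arabic;
    delete roman;
    delete mixed;
    return 0;
}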