The other answers have done a good job of proposing alternative solutions to the problem at hand, so I'm going to look at the bigger question you asked and direct you to the words of a smarter developer who wrote about `eval`'s problems at length.
> When I see `eval`, a dark cloud descends upon the surrounding code, and I eye the whole mess with suspicion and mistrust until I’m satisfied that its use is justified.
>
> [...]
>
> `eval` is a bad idea because nearly every time I have seen it used, it has caused unforeseen and unnecessary problems. The important bits are “unforeseen” and “unnecessary”. Unforeseen because `eval` has a huge pile of caveats associated with it, a list that I can’t even recall in its entirety without some thought. Unnecessary because the alternatives to `eval` tend not to require much more work to implement, whereas the problems caused by `eval` are subtle and nefarious.
>
> [...]
>
> `eval` is bad because it introduces a lot of subtle security and translation issues, it defeats bytecode caching, it hides syntax and other errors until runtime, it causes action at a distance that’s hard to follow, it defeats syntax highlighting. It just makes your code worse.
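To make the "hides syntax and other errors until runtime" point concrete, here's a minimal sketch (the names are mine, not from your code). The broken expression never trips the compiler, a linter, or syntax highlighting, because to Python it's just a string:

```python
# A typo written as real code would be a SyntaxError the moment the module is
# compiled. Tucked inside a string handed to eval(), it's invisible until the
# call actually executes.
BROKEN_EXPRESSION = "2 +* 3"  # would never survive as ordinary source code

def compute():
    # The error surfaces here, at runtime, and only on the code path that runs it.
    return eval(BROKEN_EXPRESSION)

if __name__ == "__main__":
    try:
        compute()
    except SyntaxError as exc:
        print(f"error deferred until runtime: {exc}")
```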
You say you made sure not to expose `eval()` to user input. Great! That's a good first step, but as the quote mentions, that's not at all the end of the list of things you have to think about with `eval()`. What this answer and the others you've gotten have in common is that `eval()` is a false economy: at the very least, it incurs technical debt. Like optimization, the two answers to "should I use `eval()`?" are "you shouldn't" and "you shouldn't yet."