I'm not sure how large the difference between using bitwise operations and using plain arithmetic operators really is on today's architectures and compilers (my guess is... close to none for most problems).
I'm basing my hunch on the fact that most code written nowadays doesn't really bottleneck on CPU operations, but rather on database access, I/O and network (yes, I'm being redundant here).
The JVM's JIT compiler, for instance, already optimizes many of these operations for the architecture your code actually runs on, abstracting that concern away from the developer (that's what it's there for).
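To make the comparison concrete, here's a minimal sketch (the class and method names are just illustrative) of the two styles people usually argue about. A modern JIT such as HotSpot will typically strength-reduce the division by a constant power of two into shift-based machine code anyway, so the hand-written shift buys little or nothing, while quietly changing behavior for negative inputs:

```java
public class HalveDemo {

    // "Readable" version: plain integer division.
    static int halveWithDivision(int n) {
        return n / 2;
    }

    // "Clever" version: arithmetic shift right by one bit.
    // Note: for negative odd numbers the two are NOT identical
    // (-5 / 2 == -2, but -5 >> 1 == -3), which is exactly the kind
    // of subtle difference that bites people who micro-optimize by hand.
    static int halveWithShift(int n) {
        return n >> 1;
    }

    public static void main(String[] args) {
        System.out.println(halveWithDivision(10)); // 5
        System.out.println(halveWithShift(10));    // 5
        System.out.println(halveWithDivision(-5)); // -2
        System.out.println(halveWithShift(-5));    // -3
    }
}
```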
One final point I would like to make: readability counts. No matter what some bitwise-operation fans will argue, bitwise operations in the middle of your code are usually far less readable to most developers than plain arithmetic.
The cost of understanding the code, and the increased probability that someone will introduce a bug when the code needs to change, make the risk far outweigh the benefit, IMHO. It's important to write efficient code, but it still has to be readable by humans.
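As an illustration of the readability point (the names and numbers here are purely made up), compare how the same intent reads in the two styles:

```java
public class ReadabilityDemo {

    // Intent is obvious: total price for a quantity of items.
    static int totalPriceClear(int unitPrice, int quantity) {
        return unitPrice * quantity;
    }

    // Bit-twiddled "optimization" for quantity == 8 only:
    // the reader has to stop and decode what << 3 means,
    // and the hard-coded constant quietly loses the original intent.
    static int totalPriceForEight(int unitPrice) {
        return unitPrice << 3; // unitPrice * 8
    }

    // Parity check, spelled two ways.
    static boolean isEvenClear(int n)   { return n % 2 == 0; }
    static boolean isEvenBitwise(int n) { return (n & 1) == 0; }

    public static void main(String[] args) {
        System.out.println(totalPriceClear(12, 8)); // 96
        System.out.println(totalPriceForEight(12)); // 96
        System.out.println(isEvenClear(7) + " " + isEvenBitwise(7)); // false false
    }
}
```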
Disclaimer: There might be some domains where the number of mathematical operations is so large that this becomes a factor, but that certainly is not the norm.