If you see a recurrence T(n) where one of the terms depends on the square root of the input size, it's often useful to define a new recurrence
S(k) = T(2^k)
since the resulting recurrence in k is often a lot easier to work with than the corresponding recurrence in n. In our case, we have
S(k) = T(2^k)
= 2T(√(2^k)) + 1
= 2T(2^(k/2)) + 1
= 2S(k/2) + 1
Notice that we now have a recurrence that matches the form expected by the master theorem (and that generally is much easier to work with than the original one). We can solve this recurrence to get S(k) = Θ(k) via a number of different techniques.
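As a quick sanity check on that Θ(k) bound, we can unroll S(k) = 2S(k/2) + 1 numerically. The base case S(1) = 1 below is an assumption (the original recurrence doesn't pin one down); with it, S(k) = 2k - 1 exactly when k is a power of two, which is Θ(k) as claimed.

```python
def S(k):
    """Evaluate S(k) = 2*S(k/2) + 1 with an assumed base case S(1) = 1."""
    if k <= 1:
        return 1
    return 2 * S(k // 2) + 1

# For powers of two, the closed form S(k) = 2k - 1 holds exactly.
for k in [1, 2, 4, 8, 16, 1024]:
    assert S(k) == 2 * k - 1
```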
Since we know that S(k) = Θ(k), what can we say about T(n)? Well, assuming that we're okay fudging things that aren't exact powers of two, we can say that T(n) ≈ S(lg n), since S(lg n) = T(2^(lg n)) = T(n). Therefore, we get that
T(n) = S(lg n) = Θ(lg n)
so T(n) = Θ(lg n).
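We can spot-check this bound by evaluating T directly. The sketch below assumes a base case of T(n) = 1 for n ≤ 2 (the original doesn't specify one) and uses integer square roots on inputs of the form n = 2^(2^j) so the arithmetic stays exact; for those inputs, T(n) comes out to exactly 2 lg n - 1.

```python
import math

def T(n):
    """Evaluate T(n) = 2*T(sqrt(n)) + 1, assuming T(n) = 1 for n <= 2."""
    if n <= 2:
        return 1
    return 2 * T(math.isqrt(n)) + 1  # isqrt is exact for n = 2^(2^j)

# For n = 2^(2^j), lg n = 2^j and T(n) = 2*lg(n) - 1, i.e. Θ(lg n).
for j in range(1, 5):
    n = 2 ** (2 ** j)  # n = 4, 16, 256, 65536
    assert T(n) == 2 * (2 ** j) - 1
```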
Here's another way to arrive at this result that's less mathematical and more intuitive. Imagine the shape of the recursion tree that's formed by expanding the original recurrence T(n). At the top level, we have one call of size n doing 1 work. At the next level we have two calls of size √n for a total of 2 work. At the level below that are four calls of size n^(1/4) for a total of 4 work. More generally, at level k, we have 2^k calls, each of size n^(1/2^k), together doing 2^k work. The recursion terminates when n gets sufficiently small (say, n = 2), which happens at some level k. At that point, the total work done will have been
1 + 2 + 4 + 8 + 16 + ... + 2^k = 2^(k+1) - 1.
If we can solve for k, we'll know how much total work is done. Notice that the recursion stops when the input drops to size 2, which means that the k we want is one where n^(1/2^k) = 2. Solving for k, we see that
n^(1/2^k) = 2
n = 2^(2^k)
lg n = 2^k
lg lg n = k
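That lg lg n depth can be checked numerically: the sketch below repeatedly takes square roots until the value drops to 2, using inputs n = 2^(2^j) (chosen so that lg lg n = j exactly, and so that floating-point square roots stay exact).

```python
import math

def sqrt_iterations(n):
    """Count how many square roots it takes to shrink n down to 2."""
    count = 0
    while n > 2:
        n = math.sqrt(n)
        count += 1
    return count

# For n = 2^(2^j), the iteration count is exactly j = lg lg n.
for j in range(1, 6):
    n = 2.0 ** (2 ** j)  # n = 4, 16, 256, 65536, 2^32
    assert sqrt_iterations(n) == j
```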
In general, if you see something shrinking by a square root factor, you should expect that the number of iterations before the term drops to a constant is O(log log n). Therefore, the total work done is 2^(lg lg n + 1) - 1 = 2 lg n - 1 = Θ(lg n), as before.