
What is the time complexity of multiplying two complex numbers? For example, (35 + 12i) * (45 + 23i).

Nik

3 Answers


The asymptotic complexity is the same as for multiplying the components.

(35 + 12i) * (45 + 23i) == 35*45 + 45*12i + 35*23i - 12*23
                        == (35*45 - 12*23) + (45*12 + 35*23)i

You just have 4 real multiplications and 2 real additions/subtractions.

So, if real multiplication is O(1), so is complex multiplication.

If real multiplication is not constant (as is the case for arbitrary precision values), then neither is complex multiplication.
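
As a concrete sketch (mine, not part of the original answer), here's a small Python function that multiplies two complex numbers using exactly four real multiplications, mirroring the expansion above:

def complex_mul(a, b, c, d):
    """Multiply (a + bi) * (c + di) with 4 real multiplications and 2 additions/subtractions."""
    real = a * c - b * d   # 2 multiplications, 1 subtraction
    imag = a * d + b * c   # 2 multiplications, 1 addition
    return real, imag

print(complex_mul(35, 12, 45, 23))        # (1299, 1345)
print(complex(35, 12) * complex(45, 23))  # (1299+1345j), sanity check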

chepner
  • is this O(n^2)? – Nik Oct 26 '17 at 23:14
  • No; no matter how large the values are, there are still just 4 multiplications. That won't increase with larger inputs. – chepner Oct 26 '17 at 23:15
  • If you want to add a source https://en.wikipedia.org/wiki/Multiplication_algorithm#Complex_multiplication_algorithm – hatchet - done with SOverflow Oct 26 '17 at 23:19
  • I don't believe that this is correct - real multiplication does not take time O(log n) on inputs of size n. Also, is there a reason you're talking about complex numbers? – templatetypedef Oct 26 '17 at 23:40
  • The OP edited the question; it used to be about complex numbers. I agree O(lg n) is not right, but the cold medicine is preventing me from giving a better answer. Let's just say it is ω(1) :) – chepner Oct 27 '17 at 00:04

If you multiply two complex numbers (a + bi) and (c + di), the calculation works out to (ac - bd) + (ad + bc)i, which requires a total of four multiplications, one addition, and one subtraction. Additions and subtractions take less time than multiplications, so the main cost is the four multiplications done here. Since four is a constant, this doesn't change the big-O runtime of doing the multiplications compared to the real-number case.

Let's imagine you have two numbers n1 and n2, each of which is d digits long. If you use the grade-school method for multiplying these numbers together, you'd do the following:

for each digit d1 of n2, in reverse:
    let carry = 0
    start a new partial product, shifted one more place left than the previous one
    for each digit d2 of n1, in reverse:
        let product = d1 * d2 + carry
        write down product mod 10
        set carry = product / 10, rounding down
    write down the final carry, if it is nonzero

add up all d of the (shifted) partial products you wrote in the first step

That first loop runs in time Θ(d²), since each digit of n2 is paired and multiplied with each digit of n1, doing O(1) work apiece. The result is d different partial products of roughly d digits each. Adding up those numbers also takes time Θ(d²), since you have to scan each digit of each number exactly once. Overall, this takes time Θ(d²).
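
As a rough Python sketch of the pseudocode above (my own illustration, not from the original answer), here's a digit-by-digit implementation whose work grows quadratically with the number of digits d:

def grade_school_multiply(n1, n2):
    """Multiply two nonnegative integers digit by digit: Θ(d²) work for d-digit inputs."""
    digits1 = [int(ch) for ch in str(n1)][::-1]  # least significant digit first
    digits2 = [int(ch) for ch in str(n2)][::-1]

    partials = []
    for shift, d1 in enumerate(digits2):
        carry = 0
        row = [0] * shift                 # shift this partial product left
        for d2 in digits1:
            product = d1 * d2 + carry
            row.append(product % 10)      # write down product mod 10
            carry = product // 10         # carry = product / 10, rounded down
        if carry:
            row.append(carry)             # don't forget the final carry
        partials.append(row)

    # Add up the d (shifted) partial products; done digit by digit this is also Θ(d²).
    # For brevity the sketch leans on Python's built-in integers for this last step.
    return sum(sum(digit * 10**i for i, digit in enumerate(row)) for row in partials)

assert grade_school_multiply(1234, 5678) == 1234 * 5678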

Notice that this runtime is a function of how many digits are in n1 and n2, rather than of n1 and n2 themselves. The number of digits in a number n is Θ(log n), so this runtime is actually O((log max{n1, n2})²) if you're multiplying two numbers n1 and n2.

This is not the fastest way to do multiplication, though for a while it was conjectured to be the best possible. Karatsuba's algorithm runs in time O((log max{n1, n2})^(log₂ 3)), where the exponent is about 1.585. There are more modern algorithms that run even faster than this, and it's an open problem whether multiplication can be done in time linear in the number of digits, i.e. O(log max{n1, n2}) with no exponent at all!
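
To make the Karatsuba idea concrete, here's a short recursive Python sketch (again my own illustration, assuming nonnegative integer inputs); it replaces four half-size multiplications with three, which is where the log₂ 3 exponent comes from:

def karatsuba(x, y):
    """Multiply nonnegative integers using 3 recursive half-size multiplications."""
    if x < 10 or y < 10:                  # base case: a single-digit operand
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10**m)      # split x into high and low halves
    high_y, low_y = divmod(y, 10**m)

    z0 = karatsuba(low_x, low_y)                               # low  * low
    z2 = karatsuba(high_x, high_y)                             # high * high
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2   # cross terms

    return z2 * 10**(2 * m) + z1 * 10**m + z0

assert karatsuba(123456789, 987654321) == 123456789 * 987654321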

templatetypedef

Multiplying two complex numbers only requires three real multiplications.

Let p = a * c, q = b * d, and r = (a + b) * (c + d).

Then (a + bi) * (c + di) = (p - q) + i(r - p - q).

See also Complex numbers product using only three multiplications.
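
A quick Python sketch of this trick (my own illustration; the names p, q, r follow the answer):

def complex_mul_3(a, b, c, d):
    """Multiply (a + bi) * (c + di) using only three real multiplications."""
    p = a * c
    q = b * d
    r = (a + b) * (c + d)
    return p - q, r - p - q   # (real part, imaginary part)

print(complex_mul_3(35, 12, 45, 23))      # (1299, 1345)
print(complex(35, 12) * complex(45, 23))  # (1299+1345j), same result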

Dave Radcliffe
  • In this case, one more real addition is required, isn't it? But it would take less time compared to the typical 4 real multiplications and 2 subtractions, as answered above? – actlee Feb 10 '21 at 11:39