
Suppose we have the following problem along with a Java method that can help.

A user enters passwords between 5 and 35 characters long, inclusive. We need to make sure they don't repeat the same char 3 times in a row.

// Returns true if the password contains the same char 3 times in a row.
boolean has3CharsInARow(char[] password){
  for(int left=0, middle=1, right=2; right < password.length; left++, middle++, right++){
    if(password[left]==password[middle] && password[middle]==password[right]){
      return true;
    }
  }
  return false;
}

What is the time complexity? (For simplicity, assume Big O notation and the worst-case scenario.)

My thoughts are that we don't know where the 3 repeated chars occur in the password, so we have to scan all of the relevant chars to be sure. But I am struggling to classify it as O(1) or O(n) for n chars.

What is the argument for O(1)? Given the context, we know there is a constraint on the password: it can be at most 35 characters long. So in the worst case, where we never find 3 repeating chars, we perform 34 loop-guard checks: 33 that succeed for right indices 2 through 34, and 1 more that fails when right reaches 35, at which point we exit the loop and return false. Since 34 is a constant, in general we say O(34) = O(1), which is constant time complexity.
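
To make that count concrete, here is a rough instrumented sketch (mine, not part of the original method; countGuardChecks is a made-up helper). It keeps the same loop shape but only counts how many times the loop guard right < password.length is evaluated:

// Hypothetical counting helper (not the original method): same loop shape,
// but it only counts evaluations of the guard "right < password.length".
static int countGuardChecks(char[] password) {
  int checks = 0;
  for (int left = 0, middle = 1, right = 2; ; left++, middle++, right++) {
    checks++;                        // one evaluation of the loop guard
    if (right >= password.length) {  // guard is false -> the real loop exits here
      return checks;
    }
    // here the real method would compare password[left], [middle], [right]
  }
}

For a worst-case 35-character password with no 3 repeats (e.g. "ababab…"), the guard holds for right = 2 through 34 (33 times) and fails once when right reaches 35, so this returns 34.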

What is the argument for O(n)? We care about how the time efficiency of the function behaves as its input length grows. If we let T(n) be the running time of has3CharsInARow, we can say T grows linearly: it increases by roughly a constant amount for every additional char in the password. So T is in the class of functions O(n).
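
Running the same hypothetical countGuardChecks sketch above over a range of lengths shows the linear growth directly:

// Assuming the countGuardChecks sketch above: worst-case guard checks for
// several password lengths, using an "abab..." pattern that never repeats 3 in a row.
public static void main(String[] args) {
  for (int n : new int[] {5, 10, 20, 35}) {
    char[] worstCase = new char[n];
    for (int i = 0; i < n; i++) {
      worstCase[i] = (i % 2 == 0) ? 'a' : 'b';
    }
    System.out.println(n + " chars -> " + countGuardChecks(worstCase) + " guard checks");
  }
  // Prints 4, 9, 19, 34: always n - 1, i.e. the work grows linearly with n.
}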

Where do we draw the line between O(1) and O(n)?

Edit: Since one answerer has written that it's O(n), does that mean this equivalent method is O(n) too?

boolean has3CharsInARow(char[] password){
  switch(password.length){
    case 0:
    case 1:
    case 2:
        return false;
    case 3: return password[0] == password[1] && password[1] == password[2];
    case 4: return password[0] == password[1] && password[1] == password[2] ||
            password[1] == password[2] && password[2] == password[3];
    ...
    case 35: return ...;

  }
}
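
A rough count of my own, added for illustration and not taken from the question or the answer below: case n of this switch, written out in full, would contain one pair of == comparisons per three-character window, i.e. 2 * (n - 2) comparisons.

// Illustration only: how many == comparisons each fully expanded case would contain.
for (int n = 3; n <= 35; n++) {
  System.out.println("case " + n + ": " + 2 * (n - 2) + " comparisons");
}
// case 3 has 2, case 4 has 4, ..., case 35 has 66: even though the switch
// dispatch is constant time, the amount of comparison work written into the
// selected case still grows with the length of the password.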
fgharo91
  • Possible duplicate of [Computational complexity for the case of many answers or multiple parameters](https://stackoverflow.com/questions/39420745/computational-complexity-for-the-case-of-many-answers-or-multiple-parameters) – ivan_pozdeev Jan 13 '19 at 02:24
  • If you change the maximum length of the password or the maximum number of consecutive characters to check, what will be the impact on the performance? Say another company limits their passwords to 15 characters and you want to validate 1000 passwords; what would be the impact on the duration of the check in the worst case (all passwords at max length)? **Obviously, if you do time analysis, you want to know the impact on the time from something that can vary.** – Phil1970 Jan 13 '19 at 02:28
  • An O(1) operation could be something like how much time it takes to modify the character at index i in a string, as it would take essentially the same time whether the string is 1 character long or 1000 characters long. If you want to count the number of `a` in a string, then the time would be proportional to the length of the string. Easy to understand. – Phil1970 Jan 13 '19 at 02:35

1 Answer


The time complexity of the algorithm is O(n). There is really no wiggle room here. That is what the mathematical algorithm analysis gives. For completeness: the best-case scenario is O(1); the average-case and worst-case scenarios are O(n).

I think the confusion comes from people not understanding what big O notation means and how to interpret it. You say (I'm paraphrasing): "but if I limit the size of the input to a constant, then the complexity is really a constant, no?" And the answer is: no. The time complexity is a "description" of how the execution time of an algorithm grows as the input grows. This "description" is still true whether the range of the input is [5, 35] or [5, Integer.MAX_VALUE] or [5, ∞). It is a relation between runtime and input size.

Time complexity doesn't tell you how long your algorithm will take to run. It tells you how big the impact of changing the input size is on the runtime.


Let's see how time complexity can help you in your case:

The time complexity is linear. For such a small input size you can draw a reasonable conclusion that the algorithm is ok and you don't have to worry too much about it.

But if, for example, the time complexity of your algorithm were something like O(2^n), that would tell you that the runtime explodes even at small input sizes, and you would actually have to check whether size 35 is still acceptable.
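
As a rough sense of scale (my own numbers, not from the original answer), compare the step counts at n = 35:

// Back-of-the-envelope comparison at n = 35 (illustrative step counts only).
int n = 35;
System.out.println("O(n)   ~ " + n + " steps");           // 35
System.out.println("O(2^n) ~ " + (1L << n) + " steps");   // 34,359,738,368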

bolov
  • I edited my question. Would you be willing to say the equivalent method also runs in O(n)? – fgharo91 Jan 13 '19 at 02:20
  • @fgharo91 yes, the number of operations grows linearly with the size of input. The complexity is still `O(n)`. – bolov Jan 13 '19 at 02:21
  • The switch takes constant time to go to the right case. Then there are a finite number of conditional operations in the last case if I wrote them all the way out. So how does it grow with the size of the input? – fgharo91 Jan 13 '19 at 02:22
  • @fgharo91 for a password of size 35 you do more ops than for a pass of size 34, which in turn has more ops than for a pass of size 33, and so on. – bolov Jan 13 '19 at 02:24
  • look at it this way: `has3CharsInARow("asd")` will do fewer operations than `has3CharsInARow("asdfghijkl")`, which will do fewer operations than `has3CharsInARow("asdfghijklmopqrstuvxyz")`. As the input size grows, your function does more operations. – bolov Jan 13 '19 at 02:28
  • @fgharo91 for a function to be `O(1)`, the number of operations it does must not depend on the input size. – bolov Jan 13 '19 at 02:30
  • @bolov Since the number of operations is bounded by a constant (namely the worst-case behavior for a 35-character string), it is technically O(1). You can be O(1) and still be dependent upon the input, as long as the worst case is bounded by a constant. But as you noted, in practice, we are interested in growth as a function of n, even if ultimately bounded. – Raymond Chen Jan 13 '19 at 03:43
  • @RaymondChen I don't agree. The number of operations is not bounded by a constant; it is bounded by the input in a linear fashion (i.e. `O(n)`). It's the input that is bounded. The function is defined on a domain; outside of that domain the function is not defined. For the input domain on which the function is defined, the complexity is `O(n)`. By your logic every programming function that is defined on a finite input domain would be `O(1)`, which is just ridiculous. By your logic every conceivable sort function would be `O(1)` because the input is bounded by the maximum length the programming language allows. – bolov Jan 13 '19 at 04:09
  • @bolov Actually, just one more question; I want to make sure I understand this point thoroughly. Suppose we go the other way with the bounds constraints: now we only ever take a string of length between 3 and 4. So there are only two possibilities, a string of length 3 and a string of length 4. We still say O(n), right? – fgharo91 Jan 13 '19 at 16:16
  • The number of operations is bounded by a constant because the problem size is bounded. This is usually a rather unsatisfying conclusion (since practical computers cannot operate on truly unbounded inputs), so people generally assume the problem size is unbounded. But the fact that the problem is bounded by a constant (and finding that constant) is useful in certain circumstances, such as in hard real-time systems where you have to prove you can get everything done within a specific time limit. – Raymond Chen Jan 13 '19 at 16:26
  • @RaymondChen It still doesn't change the fact that the complexity is O(n). – bolov Jan 13 '19 at 19:28
  • @fgharo91 with only two inputs you do get in murky waters. I would say it's O(n) – bolov Jan 13 '19 at 19:30