
As much as I've searched online, I cannot find examples of algorithms that aren't solvable in practice due to the amount of time they'd take to compute.

I was trying to think of an example, such as counting the number, size, and location of each star that passes closest to a rocket ship on its trip to Alpha Centauri. Would this be a good example? I mean, the star system is nearly 26 trillion miles away.

EDIT: I'm doing a short presentation on Big-O and Little-O notation and wanted to think of something out of the ordinary as to WHY solutions to problems can be solvable in principle, but not in practice, due to the extremely large amount of time to compute, thus why Big-O is used to create estimations. The reason I wanted to go with stars is that it seems more interesting than some other subjects.

Thanks!

7 Answers


Imagine you have an array denoting a deck of 52 cards [AS, 2S, 3S, ... , JH, QH, KH] and you want to return all the different ways these cards can be shuffled.

The first card has 52 possibilities, the second card has 51, the third has 50, and so on. There are a total of 52 factorial (52 * 51 * 50 * ... * 3 * 2 * 1) orderings, which is over 10^67 (close to the number of atoms in the universe, I think).

A lot of complex problems arise from having to reuse the same data in many different ways, not just having a lot of data.
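To make the size concrete, here is a minimal Python sketch of the arithmetic the answer describes (nothing fancy, just the standard library):

```python
import math

# Number of distinct orderings of a standard 52-card deck: 52!
orderings = math.factorial(52)

print(len(str(orderings)))  # 68 -- a 68-digit number
print(orderings > 10**67)   # True, as the answer claims
```

Even enumerating those orderings at a billion per second would take on the order of 10^51 years, so "returning all the different ways" is hopeless long before memory becomes an issue.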

Lorenzo
  • This is not very interesting since the algorithm is linear in terms of output size – pkacprzak Oct 16 '16 at 05:03
  • @pkacprzak The output size is actually O(n!) – Lorenzo Oct 16 '16 at 05:06
  • Suggestion: someone has randomly shuffled a deck and you're trying to guess the exact permutation, but they will only say "yes" or "no" so you have to brute force. – dwks Oct 16 '16 at 05:06
  • @Lorenzo, yes that's what I meant. Algorithm runs in O(n!) so it's linear in terms of output (not input) size. – pkacprzak Oct 16 '16 at 05:07
  • @Lorenzo Sticking to the "stars" concept, what if I collected all the stars I passed by in an array, then returned all the different ways they could be shuffled? In fact, what if this shuffling had to occur every time a new star was added to the array? Would this be acceptable? –  Oct 16 '16 at 05:16
  • @FoxDonut That's not the point. You don't need a large number of elements like a huge number of stars before the number of combinations becomes very very large. – dwks Oct 16 '16 at 05:17
  • @FoxDonut What dwks said, but to answer your question yes that would be an equally complex problem that would have even more data and need to be executed even more times. – Lorenzo Oct 16 '16 at 05:21
  • 1
    @Lorenzo I understand the point you're making and i just really want to drive the point across. I like your example a lot because it's simple in principle, but can take a long time to process. I just wanted to take it a step further. "Take something you can't even imagine, then imagine that 20 more times" hahah. But really, as far as time is concerned, the example isn't really solvable in principle, though it is expressed as O(n!) when estimating. Thanks! –  Oct 16 '16 at 05:26
  • This is somewhat off topic, but 10^67 is close to the number of atoms in the Milky Way. https://www.quora.com/How-many-atoms-are-there-in-the-milky-way-galaxy – khôi nguyễn Oct 16 '16 at 06:54

You should definitely look into NP-hard problems, like the Travelling Salesman Problem. These are great teaching examples because they can be "solved" with different strategies and we're still working on them.

The algorithms that guarantee a correct answer are often too resource-intensive to implement in real life, as you describe. Thanks to Big O, we know that O((n^2)*(2^n)) is just not feasible to run on machines for decently sized inputs.

We are forced to compromise with algorithms that perform better, but may not give the best or correct result. These algorithmic compromises can be seen in a lot of relatable, real life examples - one neat case is generating a Tour of 50 USA Landmarks.
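As a concrete sketch of why the exact approach blows up, here is a brute-force TSP in Python on a tiny, made-up distance matrix (the cities and distances are hypothetical, chosen only for illustration); it checks every one of the n! possible tours:

```python
import itertools

def tour_length(order, dist):
    """Total length of a round trip visiting cities in the given order."""
    return sum(dist[a][b] for a, b in zip(order, order[1:] + order[:1]))

def brute_force_tsp(dist):
    """Try every permutation of cities -- n! tours checked in all."""
    cities = tuple(range(len(dist)))
    return min(itertools.permutations(cities),
               key=lambda tour: tour_length(tour, dist))

# 4 hypothetical cities with symmetric distances
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
best = brute_force_tsp(dist)
print(tour_length(best, dist))  # 23, the optimal round trip
```

Four cities means only 24 permutations; 20 cities already means about 2.4 * 10^18, which is why the guaranteed-correct strategies give way to approximations in practice.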

J. Song

Your problem isn't a great example because it looks like it has linear complexity (O(n) time): if you doubled the length of your trip, the time it takes your program to execute would only double. You're probably looking for a problem with an exponential solution, like the travelling salesman problem, which quickly becomes practically unsolvable as you increase the number of cities. Look up time complexity for more information.

Another interesting thing you could try looking at is the halting problem.


With state-of-the-art algorithms as of 2009, it took two years to factor RSA-768, which has "only" 232 decimal digits -- and that was done by using a bunch of computers in parallel.

One can speculate how long it would take to factor RSA-2048, which has 617 decimal digits.
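A naive trial-division sketch in Python (not how RSA numbers are actually attacked, which uses the number field sieve, but enough to show the scale of the problem):

```python
def naive_factor(n):
    """Trial division: fine for small n, hopeless for RSA-sized numbers.

    The loop runs up to sqrt(n) -- for a 617-digit RSA-2048 modulus
    that is on the order of 10^308 iterations.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

# A tiny semiprime is instant; a 617-digit one would outlast the universe.
print(naive_factor(3233))  # (53, 61)
```

The gap between "the algorithm is trivially correct" and "the computation will never finish" is exactly what makes factoring a good classroom example.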

YSharp
  • 1,066
  • 9
  • 8
  • This is REALLY cool, and a great example. However, it's a little too complex for me to present to a class without explaining RSA-768/2048. I greatly appreciate this though, it was a good example. –  Oct 16 '16 at 05:29

What you should really look at is not the complexity of the algorithm, but the complexity of the underlying problem: one can use a very inefficient algorithm for sorting, for example iterating over all O(n!) possible orders, but that doesn't mean sorting takes much time in practice.

Let's consider one of the most fundamental string problems: the longest common subsequence (LCS) of two strings of length n.

It is known that the problem can be solved in O(n^2) time using dynamic programming. On the other hand, it has also been proven (paper) that for the general instance of the problem, any algorithm requires Ω(n^2) operations (faster algorithms are possible in some special cases).

Thus, if you want to solve a general instance of this problem for strings of a million characters (nothing unusual for modern computing), any algorithm will take time proportional to 10^12 operations. In computational biology, finding the longest common subsequence of DNA sequences is very important, and these sequences are extremely long, so it is a good real-world example of the kind of problem you asked for.
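A minimal Python version of the O(n^2) dynamic program (a standard textbook formulation, not code from the paper):

```python
def lcs_length(a, b):
    """Classic dynamic program for longest common subsequence.

    dp[i][j] = LCS length of the prefixes a[:i] and b[:j],
    filled in O(len(a) * len(b)) time and space.
    """
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4 ("BCBA" is one such subsequence)
```

For two million-character strings the table alone has 10^12 cells, which matches the operation count quoted above.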

pkacprzak

The main purpose of Big-Oh notation isn't to estimate a program's runtime on some particular input. It's to analyze how quickly the program slows down as you increase its input size. Does an input twice the size take twice as long to run, four times as long, or sixteen times?

The most illustrative example for Big-Oh is where the problem has a very small input but takes a long time to run. And when you increase the input size just a little bit, it takes far longer yet. Some of the fastest growing algorithms take exponential or factorial time, and they usually involve looking at all permutations of elements (all different orders, or ways the elements can be arranged).

Hence, a good example: someone has set a 10-digit passcode on a safe (digits 0-9) and you have to guess it. Well, there are 10^10 combinations, and you'll make an average of 5*10^9 guesses before getting it right. But if it's an 11-digit passcode, guessing will take you ten times longer.
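A literal brute-force sketch of the safe example in Python (the passcodes below are made up for illustration):

```python
import itertools

def crack(passcode):
    """Brute-force an n-digit passcode; worst case is 10^n attempts."""
    n = len(passcode)
    # itertools.product yields guesses in order: "00...0", "00...1", ...
    for attempt, guess in enumerate(itertools.product("0123456789", repeat=n),
                                    start=1):
        if "".join(guess) == passcode:
            return attempt

# Each extra digit multiplies the worst case by 10: O(10^n).
print(crack("42"))   # 43 attempts for a 2-digit code
print(crack("942"))  # 943 attempts for a 3-digit code
```

Going from 10 to 11 digits doesn't add a constant amount of work; it multiplies the whole search by 10, which is exactly the exponential behavior Big-O is meant to capture.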

dwks
  • Very good example! But, isn't exponential time essentially the same as factorial time? –  Oct 16 '16 at 05:33
  • My example grows ten times for each increase in n, thus it is O(10^n). That's exponential time. Factorial time is O(n!) which means for each increase in n, it has to grow by a factor of n. For instance, reordering all cards in a deck and trying to guess a particular order. – dwks Oct 16 '16 at 05:35
  • So the card-ordering example will grow much faster for sufficiently large n. But I think this may be a simpler example when you're thinking, it grows 2x, 10x, 16x faster... wait, what if it grew n times faster? etc. – dwks Oct 16 '16 at 05:41

A German hacker is trying to log on to the NSA's servers to find out the nuclear missile launch codes. He knows all NSA passwords must be at least 16 characters, the admin password is randomly generated every day at midnight, and it may include any ASCII character. He has 26 days before WWIII starts.

Should he try to save the world, or go on vacation? Only Big-O notation will tell.

I hope this example was "interesting".

Funny Geeks