
If I had an array of integers with 1000 elements, what would be the fastest way to find a specific element's index? Would it be best to first QuickSort it and then use BinarySearch or just to use plain old LinearSearch?

Also, would the fastest method be different if I had 100 000 elements or even just 100 elements?

Thanks!

Bobby
  • At what n value does O(n) surpass O(n log n)? That will help you determine which to use – OneCricketeer May 07 '16 at 14:16
  • It would require extra effort to recover the element's index if you sort the list, since that obviously destroys the simple relationship between array element and element index. – Andy Turner May 07 '16 at 14:18
  • Related: http://stackoverflow.com/questions/487258/plain-english-explanation-of-big-o – OneCricketeer May 07 '16 at 14:18
  • @cricket_007 O(n) and O(n log n) are not functions of n. When you say that the asymptotic time complexity of some algorithm is O(n log n), what you're saying is that the actual time taken by the algorithm (let's call it T(n)) approaches kn log n for some constant k as n increases without bound. If you want to decide at what value of n you should switch from one algorithm to the other, then you need to know the actual time functions T(n) and U(n) of the algorithms. You can't do it just from their big O complexity. – Solomon Slow May 07 '16 at 14:31

5 Answers


Linear search would be better. Its worst case of O(n) is less than the cost of quicksort alone (O(n log n) on average, O(n^2) in the worst case), and on top of the sort you would still need the binary search (O(log n)). Sorting and then using binary search is only better if you need to search many times: once you can amortize the sorting cost over many lookups, binary search is more efficient than linear search.
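The question doesn't name a language, so purely as a sketch (assuming Java; the array and target values are made up), the two approaches for a single lookup might look like this:

    import java.util.Arrays;

    public class SingleLookup {
        // Plain linear scan: O(n), returns the index in the original array.
        static int linearSearch(int[] a, int target) {
            for (int i = 0; i < a.length; i++) {
                if (a[i] == target) {
                    return i;
                }
            }
            return -1; // not found
        }

        public static void main(String[] args) {
            int[] a = {42, 7, 19, 3, 88};

            // One-off lookup: linear search, no preprocessing needed.
            System.out.println(linearSearch(a, 42)); // 0

            // Sort then binary search: O(n log n) sort plus O(log n) lookup,
            // but the result is an index into the sorted copy, not the original.
            int[] sorted = a.clone();
            Arrays.sort(sorted);
            System.out.println(Arrays.binarySearch(sorted, 42)); // 3
        }
    }

Note that the second result illustrates Andy Turner's comment above: after sorting, the index no longer refers to the original array.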

Elliott Frisch

As the other answers say, linear search is better if you only want to find one member of the list. But if you want to find many members of the same list, then you'd be better off sorting it one time, and then doing many binary searches.
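A minimal sketch of that amortized approach, assuming Java (data and queries are made up); as noted in the comments, the indices returned refer to the sorted array, not the original one:

    import java.util.Arrays;

    public class RepeatedLookups {
        public static void main(String[] args) {
            int[] data = {42, 7, 19, 3, 88, 64, 5};
            int[] queries = {19, 64, 100, 3};

            // Pay the O(n log n) sorting cost once...
            Arrays.sort(data);

            // ...then every lookup is only O(log n).
            for (int q : queries) {
                int idx = Arrays.binarySearch(data, q);
                if (idx >= 0) {
                    System.out.println(q + " found at sorted index " + idx);
                } else {
                    System.out.println(q + " not present");
                }
            }
        }
    }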

Solomon Slow

Quicksort's overhead is O(n log n) in the best case and O(n^2) in the worst case, and binary search is O(log n), so together they take O(n^2) in the worst case. Needless to say, linear search takes O(n).

So it depends on your number of searches and your input size. You can estimate the total cost of each approach as k * W(n), where k is the number of searches, n is the size of the array, and W(n) is the worst-case cost of one search with that approach (plus the one-time sorting cost for the sorted approach), and then decide what to do.
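For a rough illustration (constants and the details of the sort ignored): with n = 100,000 and k = 50 searches, repeated linear search costs on the order of k * n = 5,000,000 comparisons, while sorting once and binary-searching costs roughly n * log2(n) + k * log2(n) ≈ 1,700,000 + 850. With k = 1, the linear search's 100,000 comparisons easily beat the ~1,700,000 spent sorting.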

Alireza Mohamadi

Using linear search is quicker because it takes O(n), compared to the O(n log n) consumed by quicksort alone. However, to improve its running time for a large number of elements, try splitting the array into multiple chunks and searching them in parallel. Have a look at PPL (Microsoft's Parallel Patterns Library for C++) for more on this.
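PPL is a C++ library; as a hedged Java analogue of the same chunked, parallel idea (a sketch only, not a drop-in for PPL, with made-up data), a parallel stream can scan the index range concurrently:

    import java.util.OptionalInt;
    import java.util.stream.IntStream;

    public class ParallelSearch {
        // Scans the array in parallel chunks and returns the index of the
        // first match in encounter order, or empty if the value is absent.
        static OptionalInt parallelIndexOf(int[] a, int target) {
            return IntStream.range(0, a.length)
                            .parallel()
                            .filter(i -> a[i] == target)
                            .findFirst();
        }

        public static void main(String[] args) {
            int[] a = new int[100_000];
            for (int i = 0; i < a.length; i++) {
                a[i] = i * 2; // arbitrary data
            }
            System.out.println(parallelIndexOf(a, 123_456)); // OptionalInt[61728]
        }
    }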

Himanshu Gupta

A linear search compares each item with the value you're looking for until it finds the corresponding index.

A quicksort begins by comparing each item with the pivot in order to do the first partitioning step. Then there are more operations afterwards.

That first step of the quicksort alone already takes as much time as the whole linear search, so sorting and then searching will always take at least as long as the linear search. This holds regardless of the list size.
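To make that concrete, here is a small sketch (assuming Java; a Lomuto-style partition with made-up data) that counts comparisons: the very first partition already compares n - 1 elements with the pivot, which is as many comparisons as a full linear scan needs.

    public class PartitionVsLinear {
        // One Lomuto-style partition step: every element except the pivot
        // is compared with the pivot once, i.e. n - 1 comparisons.
        static int partitionComparisons(int[] a) {
            int pivot = a[a.length - 1];
            int store = 0;
            int comparisons = 0;
            for (int i = 0; i < a.length - 1; i++) {
                comparisons++;
                if (a[i] < pivot) {
                    int tmp = a[i]; a[i] = a[store]; a[store] = tmp;
                    store++;
                }
            }
            // Move the pivot into its final position.
            int tmp = a[store]; a[store] = a[a.length - 1]; a[a.length - 1] = tmp;
            return comparisons;
        }

        // A linear search makes at most n comparisons, and often fewer.
        static int linearSearchComparisons(int[] a, int target) {
            int comparisons = 0;
            for (int value : a) {
                comparisons++;
                if (value == target) {
                    break;
                }
            }
            return comparisons;
        }

        public static void main(String[] args) {
            int[] a = {42, 7, 19, 3, 88, 64, 5};
            System.out.println(partitionComparisons(a.clone())); // 6
            System.out.println(linearSearchComparisons(a, 88));  // 5
        }
    }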

fgb