
A 3D matrix sorted in all three dimensions is given, and we have to find a given number in it.

For the above problem, I have been thinking this: the 3D array arr[m][n][r] would be like a stack of rectangles, where each rectangle (consider arr[m][n][0]) has its largest element as the lowest, right-most element (arr[m-1][n-1][0]). We can search inside each rectangle in O(m+n):

// Staircase search: start at the top-right corner of the matrix.
int row = 0;
int col = N-1;

while (row < M && col >= 0)
{
  if (mat[row][col] == elem)
  {
    return true;   // found it
  }
  else if (mat[row][col] > elem)
  {
    col--;         // too large: everything below in this column is also too large
  }
  else
  {
    row++;         // too small: everything to the left in this row is also too small
  }
}
return false;      // not present

I was thinking it could be similarly extended to the 3rd dimension, hence making it a linear complexity solution (O(m+n+r)). Am I right?

Does anyone have any other ideas? What would be the complexity?

Mat
Rndm
    You could probably use simple binary search on each dimension successively (3x). That would yield O(log(m) + log(n) + log(r)). A B-tree with a depth of 3 will basically do the same thing at the same complexity. – JimmyB Jul 21 '12 at 14:36

3 Answers


You cannot extend a linear complexity 2D solution to the 3rd dimension and get an O(m+n+r) solution out of it. A 3D array, sorted independently in each direction, contains groups of O(N²) elements which are not ordered relative to each other. For example, the sub-array arr[i][j][k] where i+j+k = (m+n+r)/2 is completely unsorted, so you have to inspect each element of such a sub-array to find the given number. This proves that you cannot invent an algorithm with complexity better than O(N²) (at least when m, n, and r are not very different from each other). This is just an extension of the proof from this answer.

Here is an example:

k=0: |1 x|   k=1: |z 3|
     |y 3|        |3 4|

This array is sorted in all 3 dimensions, but that does not determine any sorting order for the elements x, y, z. You can assign them any values in the range (1, 3). And while searching for an element with value '2' you have to inspect all of these 'unsorted' values (x, y, and z). If you increase the size of the array, you can see that the number of 'unsorted' values may increase quadratically, which means the worst-case time complexity of the search algorithm must also increase quadratically.

You can take the smallest of the array dimensions (let it be 'r'), and for each slice arr[*][*][k] search for the given number in O(m+n) time, which gives O((m+n)*r) total time complexity.
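
A minimal Java sketch of this slice-by-slice approach (the method name and the assumption that r is the smallest dimension are mine, for illustration only):

// Staircase search applied to every 2D slice along the third dimension.
// Overall cost: O((m + n) * r), best when r is the smallest dimension.
static boolean containsSliceWise(int[][][] arr, int elem) {
    int m = arr.length, n = arr[0].length, r = arr[0][0].length;
    for (int k = 0; k < r; k++) {
        int row = 0, col = n - 1;            // top-right corner of slice k
        while (row < m && col >= 0) {
            int v = arr[row][col][k];
            if (v == elem) return true;      // found
            if (v > elem)  col--;            // too large: discard this column
            else           row++;            // too small: discard this row
        }
    }
    return false;                            // not present in any slice
}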

Alternatively, if one of the array sizes is much larger than the others (r >> m*n), you can use binary search in arr[i][j][*] (for each i, j), which gives O(m*n*log(r)) time complexity.
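
And a sketch of that binary-search variant, assuming a regular int[m][n][r] array so that each arr[i][j] is already a sorted int[r] (again, the method name is illustrative):

import java.util.Arrays;

// Binary search along the third dimension for every (i, j) pair.
// Overall cost: O(m * n * log(r)).
static boolean containsBinary(int[][][] arr, int elem) {
    for (int[][] plane : arr) {
        for (int[] line : plane) {
            if (Arrays.binarySearch(line, elem) >= 0) {
                return true;
            }
        }
    }
    return false;
}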

Evgeny Kluev
  • I looked at the link you gave in your answer and understand that in 2D we cannot achieve less than `O(n)`, but did not understand for 3D: the example of `arr[i][j][k]` -> that would be 1 element, right, not a sub-array in 3D? – Rndm Jul 22 '12 at 04:20
  • @shg: `arr[i][j][k]` is not just one element: here i, j, k are not constants, they are variables with the constraint that `i+j+k = (m+n+r)/2`. This constraint makes a 2D slice out of the 3D array. – Evgeny Kluev Jul 22 '12 at 08:11

If memory consumption isn't a big issue, you can copy your array into a single 1D array of (value, position) pairs, then sort this array by value and perform binary search in it with O(log(n*m*r)) complexity. But the initial sorting would take O(n*m*r*log(n*m*r)), which defines the total algorithm complexity.
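
A minimal sketch of this idea in Java, storing only the values rather than (value, position) pairs to keep it short (method names are illustrative):

import java.util.Arrays;

// Flatten the cube into one array and sort it once: O(m*n*r*log(m*n*r)).
static int[] flattenAndSort(int[][][] arr) {
    int m = arr.length, n = arr[0].length, r = arr[0][0].length;
    int[] flat = new int[m * n * r];
    int idx = 0;
    for (int[][] plane : arr)
        for (int[] line : plane)
            for (int v : line)
                flat[idx++] = v;
    Arrays.sort(flat);
    return flat;
}

// Each subsequent lookup is a plain binary search: O(log(m*n*r)).
static boolean containsFlat(int[] flat, int elem) {
    return Arrays.binarySearch(flat, elem) >= 0;
}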

I think that, thanks to the fact that the 3D array is sorted in each dimension, it may be possible to find an algorithm that transforms it into a sorted 1D array faster than O(n*m*r*log(n*m*r)), but I'm not aware of such a thing. Maybe Chiyou tried to explain this.

Sasha

I can think of a recursive solution which would be logarithmic, theoretically.

Say n is the number you are looking for in an N×N×N cube. The cube is sorted from east to west, north to south, and top to bottom in ascending order, such that the number at the extreme north-east-top corner is the smallest and the number at the extreme south-west-bottom corner is the largest.

Pick the number at the center of the cube. If this number is equal to n, then we have found the number. If this number is greater than n, we can ignore all the numbers lying on the south-west-bottom side, because those are all going to be larger than n. These numbers make up 1/8th of the cube. Now we can split the remaining part of the cube into 7 cubes and repeat the process.

Similarly, if this number is less than n, we can ignore all the numbers lying on the north-east-top side.

Point find(int[][][] matrix, int N, int n)
{
    if (N == 0)
    {
        return null;
    }

    int x = matrix[N/2][N/2][N/2];
    if (x == n)
        return new Point(N/2, N/2, N/2);

    Point p;
    if (x > n)
    {
        // x > n: the south-west-bottom octant is all larger than n and can be ignored;
        // recurse into the remaining 7 sub-cubes, each of side N/2.
        for (int[][][] m : SplitTheCubeInto8CubesAndIgnoreSouthWestBottomCube(matrix))
        {
            if ((p = find(m, N/2, n)) != null)
                return p;
        }
    }
    else
    {
        // x < n: the north-east-top octant can be ignored instead.
        for (int[][][] m : SplitTheCubeInto8CubesAndIgnoreNorthEastTopCube(matrix))
        {
            if ((p = find(m, N/2, n)) != null)
                return p;
        }
    }
    return null;
}

The complexity can be represented by the recurrence T(M) = 7*T(M/8) + 1, where M is the total number of points in the cube (M = N*N*N): on each comparison we are left with 7 sub-cubes of M/8 points each. By the Master theorem this recurrence solves to O(M^(log_8 7)) ≈ O(M^0.94), i.e. roughly O(N^2.8), rather than O(log M).

Sandeep Giri