
Perhaps this is elementary, but I cannot find a good example of using the Mahalanobis distance in sklearn.

I can't even get the metric like this:

from sklearn.neighbors import DistanceMetric
DistanceMetric.get_metric('mahalanobis')

This throws an error: TypeError: 0-dimensional array given. Array must be at least two-dimensional.

But, I can't even seem to get it to take an array:

DistanceMetric.get_metric('mahalanobis', [[0.5],[0.7]])

throws:

TypeError: get_metric() takes exactly 1 positional argument (2 given)

I checked out the docs here and here, but I don't see what types of arguments it is expecting.
Is there an example of using the Mahalanobis distance that I can see?

tttthomasssss
makansij

2 Answers


MahalanobisDistance expects a parameter V, which is the covariance matrix, and optionally another parameter VI, which is the inverse of the covariance matrix. Furthermore, both of these parameters are named (keyword) arguments, not positional ones.

Also check the docstring for the class MahalanobisDistance in the file scikit-learn/sklearn/neighbors/dist_metrics.pyx in the sklearn repo.

Example:

In [18]: import numpy as np
In [19]: from sklearn.datasets import make_classification
In [20]: from sklearn.neighbors import DistanceMetric
In [21]: X, y = make_classification()
In [22]: DistanceMetric.get_metric('mahalanobis', V=np.cov(X))
Out[22]: <sklearn.neighbors.dist_metrics.MahalanobisDistance at 0x107aefa58>
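One caveat, also raised in the comments below: np.cov treats each row as a variable, so for a samples-by-features X the feature covariance is np.cov(X.T); with np.cov(X) you can hit a "size of V does not match" error once the metric is actually used. A minimal sketch on made-up data that evaluates some distances (the try/except is only there because newer scikit-learn releases moved DistanceMetric to sklearn.metrics):

```python
import numpy as np
try:
    from sklearn.metrics import DistanceMetric      # newer scikit-learn
except ImportError:
    from sklearn.neighbors import DistanceMetric    # older releases, as above

rng = np.random.RandomState(0)
X = rng.rand(100, 5)                  # hypothetical data: 100 samples x 5 features

V = np.cov(X.T)                       # 5x5 feature covariance (note the .T)
dist = DistanceMetric.get_metric('mahalanobis', V=V)

D = dist.pairwise(X[:3])              # 3x3 matrix of pairwise Mahalanobis distances
print(D.shape)                        # (3, 3)
```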

Edit:

For some reason (possibly a bug), you can't pass the distance object to the NearestNeighbors constructor; you need to pass the name of the distance metric instead. Also, setting algorithm='auto' (which appears to default to 'ball_tree') doesn't seem to work; so given X from the code above you can do:

In [23]: from sklearn.neighbors import NearestNeighbors
In [24]: nn = NearestNeighbors(algorithm='brute', 
                               metric='mahalanobis', 
                               metric_params={'V': np.cov(X)})
# returns the 5 nearest neighbors of that sample
In [25]: nn.fit(X).kneighbors(X[0, :])     
Out[25]: (array([[ 0., 3.21120892, 3.81840748, 4.18195987, 4.21977517]]), 
          array([[ 0, 36, 46,  5, 17]])) 
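For clustering algorithms such as DBSCAN, which don't expose metric_params in older releases, one workaround (also suggested in the comments below) is to precompute the pairwise Mahalanobis distances and pass metric='precomputed'. A hedged sketch on made-up data; the eps value here is an arbitrary illustration, not a recommendation:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import pairwise_distances

rng = np.random.RandomState(0)
X = rng.rand(100, 5)                         # hypothetical data

# Precompute the full Mahalanobis distance matrix...
VI = np.linalg.inv(np.cov(X.T))              # inverse of the feature covariance
D = pairwise_distances(X, metric='mahalanobis', VI=VI)

# ...then hand it to DBSCAN as a precomputed matrix
labels = DBSCAN(eps=2.0, min_samples=5, metric='precomputed').fit_predict(D)
print(labels.shape)                          # (100,)
```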
rafaelvalle
tttthomasssss
  • how do you use the distance metric, in, say `nearest neighbors` or clustering? When I try to use it, I get `ValueError: Metric not valid for algorithm 'auto'`. – makansij Jan 07 '16 at 22:34
  • @Sother I've added a `NearestNeighbor` example to my answer. – tttthomasssss Jan 08 '16 at 08:11
  • Works for `NearestNeighbors`, but I don't see a `"metric_params"` for [`DBSCAN`](http://scikit-learn.org/stable/modules/generated/sklearn.cluster.DBSCAN.html) and clustering algos? – makansij Jan 08 '16 at 15:27
  • I tried using `dm = DistanceMetric.get_metric('mahalanobis',VI=icov)` distance function, and then `db = DBSCAN(eps=x, min_samples=1, metric='pyfunc', func='dm', algorithm='brute').fit(np.array(X_train_numeric))` but it doesn't recognize the `"func"` as a parameter. – makansij Jan 08 '16 at 15:36
  • @Sother I've never used `mahalanobis` distance with `DBSCAN`, but it looks as if it is not yet properly supported for `DBSCAN` - I'd recommend opening an issue on github or asking on the `sklearn` mailing list. – tttthomasssss Jan 08 '16 at 17:31
  • I get the error 'DeprecationWarning: Got unexpected kwarg V. This will raise an error in a future version.' using your code snippet in the edit. This is because scikit-learn has deprecated the 'V' argument for Mahalanobis distance. Now 'VI' must be used. This argument is documented [here](https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.pdist.html). This is set to inv(cov(X.T)).T, or `np.linalg.inv(np.cov(X_train.transpose())).transpose()` in python. – OscarVanL Mar 12 '20 at 18:40
  • @OscarVanL Whats your use-case? When you're using `DBSCAN`, you might need to precompute the pairwise distances yourself, then pass the distance matrix together with ` metric='precomputed'` to the algorithm. – tttthomasssss Mar 13 '20 at 10:49
  • I'm not using DBSCAN, I'm using K-NN with Mahalanobis distance as the distance metric. – OscarVanL Mar 16 '20 at 15:43
  • Do you know if it's possible to use kd tree or ball tree with mahalanobis distance in sklearn? brute force is not an option at the scale I'm working with – Ian Conway Aug 28 '20 at 18:38
  • `np.cov(X)` gave the following error when I used it to find LOF with Mahalanobis metric : "size of V does not match". Changed it to `np.cov(X.T)`. – Vandana Chandola Sep 14 '20 at 18:09
  • @tttthomasssss I have often seen that we get an error with Mahalanobis distance by specifying `V=` the covariance matrix to the code. Instead, when I wrote `VI = `, it ran successfully! I read in the documentation that VI is just the inverse matrix of V and it's actually optional (we can specify either one), so why am I getting this error? Surprisingly, it doesn't even always happen, just sometimes. Any idea about this strange behavior? – QUEEN Feb 02 '22 at 03:53

When creating the covariance matrix from a data matrix M of shape (X, Y) (X samples by Y features), you need to transpose M. The Mahalanobis formula is (x - x1)^T * inverse_cov_matrix * (x - x1), and since the first factor is transposed, the product with the covariance matrix is only defined if the covariance matrix has shape (Y, Y).

If you just use np.cov(M), it will be (X, X); using np.cov(M.T), it will be (Y, Y).
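A quick way to see the shape difference (M here is a made-up example matrix):

```python
import numpy as np

M = np.random.rand(100, 5)    # 100 samples (rows) x 5 features (columns)

# np.cov treats each ROW as a variable, so np.cov(M) is 100x100
print(np.cov(M).shape)        # (100, 100)

# Transposing (equivalently, rowvar=False) gives the 5x5 feature covariance
print(np.cov(M.T).shape)      # (5, 5)
assert np.allclose(np.cov(M.T), np.cov(M, rowvar=False))
```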

Jaewoolee
  • Thanks! I was getting an error about non-finite/NaN values until I transposed my training matrix like: `{'VI': np.linalg.inv(np.cov(X_train.T))}` – beyarkay Aug 16 '22 at 10:30