Would it be considered bad design to automatically set the `V` parameter of the Mahalanobis distance metric to `cov(X)` in all estimators that (possibly) use this metric?
I'm asking because it would facilitate the use of the Mahalanobis distance in cross-validation scenarios in general (one wouldn't have to set the `V` parameter over and over again for each fold), and more specifically when used with GridSearchCV (as things stand, it won't work with a KNeighborsClassifier using the Mahalanobis distance, for example).
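To make the pain point concrete, here is a minimal sketch of the current workaround: the covariance matrix has to be computed from the data and passed in explicitly via `metric_params`, which is exactly the step that would need repeating per fold in a cross-validation loop.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Current workaround: V must be computed and supplied manually.
# Inside GridSearchCV there is no hook to recompute it per fold,
# so the estimator can't be used there without this boilerplate.
clf = KNeighborsClassifier(
    metric="mahalanobis",
    metric_params={"V": np.cov(X, rowvar=False)},
)
clf.fit(X, y)
print(clf.score(X, y))
```

If `V` defaulted to `cov(X)` at fit time, the `metric_params` line above would simply disappear.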
On the other hand, adding code to many estimators' fit() methods just to fix this minor issue seems a bit messy to me. Could there be a more elegant way? I'd be glad to take suggestions and follow them up with a PR.