
I need to reduce the prediction time of my transform because of performance issues: CalculateFeatureContribution, as provided by ML.NET 1.7.0 (https://learn.microsoft.com/en-us/dotnet/api/microsoft.ml.explainabilitycatalog.calculatefeaturecontribution?view=ml-dotnet), increases the time taken per item roughly sixfold (numbers below).

        var testSet = mlContext.Data.LoadFromEnumerable<ModelInput>(evalset);

        IDataView predictions;
        using (var v = new startwatch("Evaluate"))
        {
            // Scores each test case and computes the array of feature contributions.
            predictions = model1.Transform(testSet);
        }

This generic function is used to evaluate the test cases.
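
For reference, model1 above is produced roughly like this (a minimal sketch rather than my exact production code; the SDCA trainer, the column names, featureColumnNames, and trainSet are stand-ins for my actual setup):

        // Assumed/simplified training setup: concatenate features, then a linear SDCA trainer.
        var pipeline = mlContext.Transforms.Concatenate("Features", featureColumnNames)
            .Append(mlContext.Regression.Trainers.Sdca(labelColumnName: "Label"));

        var trainedModel = pipeline.Fit(trainSet);            // TransformerChain
        var linearPredictor = trainedModel.LastTransformer;   // linear predictor required by the FCC transform
        var scoredTrain = trainedModel.Transform(trainSet);

        // This estimator adds the per-item "FeatureContributions" column,
        // which is where the extra per-item cost comes from.
        var fcc = mlContext.Transforms
            .CalculateFeatureContribution(linearPredictor)
            .Fit(scoredTrain);

        // model1 = feature preparation + scoring + contribution calculation.
        var model1 = trainedModel.Append(fcc);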

With CalculateFeatureContribution included in the training pipeline, the transform takes 4.486 ms per test item.

Without CalculateFeatureContribution in the training pipeline, it takes 0.756 ms per test item.

The excess time is spent computing each feature's contribution for every item. Is there a way to reduce the time taken when CalculateFeatureContribution is included?
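
One workaround I am considering (again only a sketch, reusing the assumed trainedModel and fcc names from above; explainSubset is a hypothetical IEnumerable<ModelInput> holding only the items that need explanations) is to keep the contribution calculator out of the bulk scoring path and apply it on demand:

        // Bulk scoring stays on the fast path (no FeatureContributions column).
        IDataView scoredTest = trainedModel.Transform(testSet);

        // Contributions are computed only for the few items that need an explanation.
        IDataView explainView = mlContext.Data.LoadFromEnumerable<ModelInput>(explainSubset);
        IDataView explained = fcc.Transform(trainedModel.Transform(explainView));

That would keep the ~4.5 ms cost off the common path, but I would still like to know whether the contribution calculation itself can be made cheaper when it has to run for every item.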
