
I have an application that we upgraded from .NET 3.5 to .NET 4.7.2. The only real problem is the performance of a part of our code that uses reflection. The entire situation is best explained by a simple sample I uploaded to Gist: https://gist.github.com/vkocjancic/3e8a6b3496c412a75b1c85a1d2ba1111

Basically, we have a POCO class whose property getters throw exceptions if no value has been set for a property of a non-nullable type.
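For illustration, here is a minimal sketch of that pattern, assuming names modelled on the `_fPropNumericA` / `IsPROP_NUMERIC_ANull` naming that appears in the comments below; the actual classes are in the linked Gist:

```csharp
// Hypothetical reconstruction of the POCO pattern described above; the
// real classes are in the linked Gist, so names here are illustrative.
using System;

public class MyPoco
{
    private decimal _propNumericA;
    private bool _fPropNumericA;    // set to true once a value is assigned

    public decimal PROP_NUMERIC_A
    {
        get
        {
            // The getter throws when the non-nullable value was never set.
            if (!_fPropNumericA)
                throw new MissingFieldException("PROP_NUMERIC_A was not set.");
            return _propNumericA;
        }
        set
        {
            _propNumericA = value;
            _fPropNumericA = true;
        }
    }

    // Legacy-style helper that lets callers test for "null" without a throw.
    public bool IsPROP_NUMERIC_ANull()
    {
        return !_fPropNumericA;
    }
}
```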

[EDIT]:

  1. Yes, I know this is neither correct nor a good pattern to use, but the application started in .NET 1.1.

  2. Yes, it should have been fixed long ago. It wasn't.

Reflection is then used to get the property names and values of instances of the POCO class and populate them into a DataTable.
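In simplified form (my sketch, assuming an `objects` collection of POCO instances; the complete version is in the Gist), that population loop looks something like this:

```csharp
// Simplified sketch of the reflection-based population described above.
using System;
using System.Data;

var table = new DataTable();
foreach (var prop in typeof(MyPoco).GetProperties())
    table.Columns.Add(prop.Name);

foreach (var myObject in objects)
{
    var row = table.NewRow();
    foreach (var info in myObject.GetType().GetProperties())
    {
        try
        {
            // For unset properties the getter throws, and reflection
            // surfaces it wrapped in a TargetInvocationException.
            row[info.Name] = info.GetValue(myObject, null);
        }
        catch
        {
            // Leave the cell as DBNull for properties that were never set.
        }
    }
    table.Rows.Add(row);
}
```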

The sample and the real-life project source code are exactly the same. The only difference is that in one case it is compiled against .NET 3.5 and in the other against .NET 4.7.2.

These are the average elapsed times in milliseconds over 10 invocations:

.NET 3.5      ->  231.1 ms
.NET 4.7.2    ->  713.5 ms
.NET Core 2.2 -> 1013.2 ms

Can anyone explain why reflection is about 3 times slower in .NET 4.7.2 than in .NET 3.5, and how to fix that? The lag only happens when properties are not set and throw exceptions. If you populate the property values, there is no performance difference.

If I run the sample in a debugger, the .NET 3.5 build never triggers a MissingFieldException. The build using .NET 4.7.2 triggers every exception.

[EDIT]

As @Bevan mentioned, the slowdown actually occurs when switching to .NET 4.0. The problem boils down to an exception being thrown and handled about 3 times faster in .NET 3.5 than in .NET 4.0.
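To isolate that, a micro-benchmark sketch along these lines (my own, not part of the Gist) should be enough to reproduce the gap when the identical code is compiled against each target framework:

```csharp
// Hypothetical micro-benchmark isolating exception throw/handle cost.
// Compile the same code against .NET 3.5 and .NET 4.x and compare.
using System;
using System.Diagnostics;

static class ThrowBench
{
    static void Main()
    {
        const int iterations = 100000;
        var sw = Stopwatch.StartNew();
        for (var i = 0; i < iterations; i++)
        {
            try
            {
                throw new MissingFieldException("value was not set");
            }
            catch (MissingFieldException)
            {
                // swallowed, just like the sample's population loop does
            }
        }
        sw.Stop();
        Console.WriteLine("{0} throw/catch cycles: {1} ms", iterations, sw.ElapsedMilliseconds);
    }
}
```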

Vladimir Kocjancic
  • Creating stack traces and throwing exceptions is very expensive; that sounds to me like the cause of your performance issues. Maybe you can switch to a design pattern that doesn't rely on exceptions. – Charles Jan 21 '20 at 21:57
  • I agree and I would really like to. However, that means changing 90% of our domain POCO classes, which I would rather avoid if possible. Also, it still doesn't explain the 3x difference between .NET 3.5 and 4.7.2 – Vladimir Kocjancic Jan 21 '20 at 21:58
  • random idea, but take a look at the first example here using FastMember and `ObjectReader.Create` / `table.Load`: https://github.com/mgravell/fast-member#ever-needed-an-idatareader - any use? It does a lot of work for you to not be terrible at reflection (it also has optional args to define which properties to read, and in which order) – Marc Gravell Jan 21 '20 at 22:05
  • well, you could get your boolean field with reflection (`_fPropNumericA`) and then check that before calling `GetValue()` for the real field; that would still be horrible, but at least it wouldn't trigger any exceptions. – Charles Jan 21 '20 at 22:06
  • Exceptions and performance aside, I think there is perhaps a problem of design that is being overlooked. Why should a program be written in such a way that _merely reading a property_ throws an exception? It's not the caller's fault that it is null. Throwing an exception signifies to the caller that the object is not in a valid state, which begs the question: _why was the object allowed to be created in the first place with null values?_ If a property shouldn't be null, then best practice tells us that a valid value be passed to the class's constructor at the time it is created –  Jan 21 '20 at 23:17
  • @MickyD... I get what you are saying. Not a nice practice. But let's get real. This application is from the times of .NET 1.1. It has been upgraded to .NET 2.0 and then stalled at .NET 3.5 for several years. It brings along the entire history of the C# language. Yes, it should have been fixed when upgraded to .NET 2.0 or .NET 3.5, but it wasn't. I wish it was. It wasn't. I agree that throwing exceptions if a value is null is dumb, wrong, etc. It doesn't change the reality that an upgrade should not cause a threefold slowdown. And it doesn't change the reality that the code is as is and that there is no budget to fix that. – Vladimir Kocjancic Jan 21 '20 at 23:21
  • @MarcGravell... Unfortunately, the application uses some custom abstraction of DataSet instead of DataTables. I just didn't want to complicate my sample further. – Vladimir Kocjancic Jan 22 '20 at 07:10
  • @VladimirKocjancic fair enough, but my point is: FastMember *exists* to provide some mechanism to avoid the slowest parts of reflection; that example is just one usage, specific to `DataTable` - but there are APIs for lots of other purposes too. Normally, if performance is a factor, you would need to get into some of the more complex meta-programming APIs - typically either "expression" or "ref-emit"; those are both **hard**. FastMember provides an accessible surface over that - not as good as full meta-programming, but better than raw reflection. (A minimal usage sketch follows these comments.) – Marc Gravell Jan 22 '20 at 07:51
  • @VladimirKocjancic _"This application is from times of .NET 1.1. It has been upgrade to .NET 2.0..and then...3.5 ...4.7...It doesn't change the reality that upgrade should not cause 3 time delay"_. Perhaps, but I would argue against upgrading to later versions of .NET if it wasn't necessary. Though for the most part code should compile without incident, there are **runtime differences**. Not to mention **upscaling issues**. –  Jan 22 '20 at 08:20
  • @VladimirKocjancic Whilst there are different CLR versions for, say, .NET 1.1, .NET 2 and .NET 4, _[".NET 4.5 is an 'in-place upgrade' of .NET 4. That means .NET 4.5 is still the v4 CLR and adds new libraries as well as improvements **to the core CLR itself**."](https://www.hanselman.com/blog/NETVersioningAndMultiTargetingNET45IsAnInplaceUpgradeToNET40.aspx)_ Like C/C++, you shouldn't upgrade your compiler (or in this case the .NET Framework) just because a new one is now available. You never know what _runtime behaviour differences_ you'll encounter –  Jan 22 '20 at 08:22
  • In saying all that, have you looked into reflection caching and/or FastMember? –  Jan 22 '20 at 08:39
  • Running your benchmarks a number of times, I've found two noteworthy details. Firstly, the slowdown occurs when you switch targeting between .NET 3.5 and 4.0 - later 4.x versions make negligible difference. Secondly, commenting out the **throw** statements results in .NET 4.0 running around twice as fast, so the performance difference you're seeing is due to differences in *exception throwing/handling*, not reflection. – Bevan Jan 22 '20 at 08:43
  • @Bevan... Actually it is reflection: the `PropertyInfo.GetValue` method didn't trigger an exception in .NET 3.5; now it does. Cheers for digging deep to find out that it is actually the 3.5-to-4.0 change that caused the problem. – Vladimir Kocjancic Jan 22 '20 at 09:12
  • @Bevan I have to apologise. It is indeed a problem of exception throwing and handling. And apparently it is 2 times faster in .NET 3.5 than in 4.0 – Vladimir Kocjancic Jan 22 '20 at 13:11
  • The problem is the idea that newer versions of a runtime should always improve times for every operation. They don't. Generally runtime improvements focus on the happy paths (things average applications need to do thousands of times per second); exception handling is not one of those things and is more likely to be optimized towards gathering more details and better debuggability, rather than speed. That's not to say it couldn't possibly be optimized, of course, just that you can't expect this to actually happen, especially not with efforts focused on .NET Core. – Jeroen Mostert Jan 22 '20 at 13:19
  • @JeroenMostert I don't expect things to always improve. I just expect that they don't get worse. – Vladimir Kocjancic Jan 22 '20 at 13:21
  • Well, that assumption is of course just as flawed. .NET 4.0 isn't a small incremental improvement over .NET 2.0; it's substantially different at the ground level. Making sure changes did not regress the speed of exception handling was almost certainly not a goal. For the vast majority of applications, using the 4.0 runtime over the 2.0 runtime does not regress and in fact improves running times. Your application is an unusual case where it doesn't. Your choices are to fix your app, stick with 3.5, try your luck with .NET Core, and/or [open an issue](https://github.com/dotnet/runtime). – Jeroen Mostert Jan 22 '20 at 13:24
  • @JeroenMostert Can you back your statements with some reference, please? – Vladimir Kocjancic Jan 22 '20 at 19:46
  • [The 4.0 runtime is not a simple patch of the 2.0 runtime](https://stackoverflow.com/q/1626368/4137916). [Most efforts are currently focused on .NET Core because it is the next release of .NET as a whole](https://devblogs.microsoft.com/dotnet/introducing-net-5/). [The 4.6 branches have significant performance improvements due to a completely new JIT](https://devblogs.microsoft.com/dotnet/the-ryujit-transition-is-complete/). That not regressing exception handling speed was probably not a goal is only an educated guess (hence the "probably"). – Jeroen Mostert Jan 22 '20 at 22:29
  • Anyone with both frameworks, a native debugger, the experience to operate it and some patience could trace through the application running under both frameworks and determine where the actual performance difference(s) are located. This is fairly time consuming and involved, though, and no guarantee of yielding a workable fix if it turns out the framework itself should be changed (improving performance for one scenario may well regress it in others). Still, it's better than speculation. – Jeroen Mostert Jan 22 '20 at 22:39
  • @JeroenMostert Thanks, will check those. The problem I have with this performance regression is that 99% of framework methods throw exceptions. If you want people to not use exceptions unless something exceptional happens, maybe the way to go is to stop the framework from throwing exceptions all over the place and instead return a status, much like C++ does. Then use exceptions for really extraordinary things. But this now exceeds the scope of this question. Thanks for the explanation. – Vladimir Kocjancic Jan 23 '20 at 08:42
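To make Marc Gravell's FastMember suggestion above concrete, here is a minimal usage sketch, assuming the `objects` collection from the sample (`ObjectReader.Create` and `DataTable.Load` are the APIs shown in the fast-member README linked above):

```csharp
// Minimal FastMember sketch, per Marc Gravell's comment above.
// 'objects' is the POCO collection from the sample.
using System.Data;
using FastMember;

var table = new DataTable();
using (var reader = ObjectReader.Create(objects))
{
    table.Load(reader);    // streams property values into the DataTable
}
```

Note that this still invokes the property getters, so the throwing getters would still need an Is...Null-style guard; FastMember removes the reflection overhead, not the exception cost.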

1 Answer


As I mentioned in the comments, if you call the `IsPROP_NUMERIC_ANull` method before you access a property, it won't throw any more exceptions and is super fast.

```csharp
foreach (var myObject in objects)
{
    var row = table.NewRow();

    // TODO implement some caching: don't call GetProperties() / GetMethod() for every object
    var type = myObject.GetType();
    var properties = type.GetProperties();

    foreach (var info in properties)
    {
        try
        {
            // look up the IsNull method for the property, e.g. "IsPROP_NUMERIC_ANull"
            var isNullMethodName = $"Is{info.Name}Null";
            var isNullMethod = type.GetMethod(isNullMethodName, BindingFlags.Instance | BindingFlags.Public);

            if (isNullMethod != null)
            {
                var fldIsNull = (bool)isNullMethod.Invoke(myObject, new object[] { });
                if (!fldIsNull)
                    row[info.Name] = info.GetValue(myObject, null);
            }
            else
            {
                // isNullMethod doesn't exist; read the property directly
                row[info.Name] = info.GetValue(myObject, null);
            }
        }
        catch
        {
            // swallow exceptions from legacy getters that lack an IsNull method
        }
    }
    table.Rows.Add(row);
}
```

Total runtime is 3 ms vs the 19500 ms you had before.

If you can redesign this, I would use cached delegates as Marc Gravell proposed.
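For reference, a minimal sketch of that idea using compiled expression trees (my illustration, not code from the sample): each getter is compiled once per property and cached, so subsequent reads cost roughly as much as a direct call.

```csharp
// Hypothetical cached-delegate sketch using expression trees.
using System;
using System.Collections.Generic;
using System.Linq.Expressions;
using System.Reflection;

static class GetterCache
{
    // "Type.FullName.Property" -> compiled getter, built once and reused
    private static readonly Dictionary<string, Func<object, object>> Getters =
        new Dictionary<string, Func<object, object>>();

    public static Func<object, object> For(Type type, PropertyInfo prop)
    {
        var key = type.FullName + "." + prop.Name;
        Func<object, object> getter;
        if (!Getters.TryGetValue(key, out getter))
        {
            // Builds: (object o) => (object)((T)o).Property
            var obj = Expression.Parameter(typeof(object), "o");
            var body = Expression.Convert(
                Expression.Property(Expression.Convert(obj, type), prop),
                typeof(object));
            getter = Expression.Lambda<Func<object, object>>(body, obj).Compile();
            Getters[key] = getter;
        }
        return getter;
    }
}
```

The throwing getters would still throw through the compiled delegate, so the IsNull guard above remains necessary; the caching only removes the per-call reflection cost.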

Charles
  • I thought of that and have already implemented it. The problem is that 20% of the properties that throw exceptions don't have an IsNull method, because, well, what can I say, legacy code. – Vladimir Kocjancic Jan 22 '20 at 07:07
  • @MickyD... I upvoted the answer. :) Mind you, there's still no answer as to why .NET 4.7.2 takes 3 times as long for the same task as .NET 3.5 :) – Vladimir Kocjancic Jan 22 '20 at 09:09
  • @VladimirKocjancic well done good sir. I would like to know why it is so too :) –  Jan 22 '20 at 10:27