
I have a .NET solution which contains multiple projects within it.

I have identified some potential hot spots in the projects that can be improved.

E.g. using a ternary operator wherever possible.

The question is this:

  1. Is this worth the effort? Does going through all the projects and reducing the lines of code improve anything apart from code beautification?
  2. Are there any standard ways / standard steps that can be followed to improve the code base (performance, loading time, etc.)?

Asking this with respect to Visual Studio 2015 & .NET.

  • Performance is improved by identifying why things are slow and fixing them. Reducing the number of lines has nothing to do with it; the usual improvements are better data structures, caching, better algorithms, ... – pm100 Jul 24 '17 at 15:52
  • Avoid the fallacy that the lines of code (LOC) you write equate to executed LOC. First, there is the expansion of language constructs into code to support their respective use (automatic code generation). Second, there are multiple levels of optimization that can take place: the language compiler (which produces IL) can optimize, and the JIT compiler (which compiles IL to native) can optimize code. For a good discussion of some of these issues, see [The 'premature optimization is evil' myth](http://joeduffyblog.com/2010/09/06/the-premature-optimization-is-evil-myth/). – TnTinMn Jul 24 '17 at 16:59

4 Answers


Replacing if statements with an equivalent ternary operator will usually have no impact whatsoever on the generated code. The compiler is quite good at optimizing these things to the exact same sequence of instructions regardless of how you write them at a high level.
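
To illustrate (my sketch, not part of the answer above), the two methods below are semantically identical, and the C# compiler and JIT will typically produce equivalent code for both:

    using System;

    class TernaryDemo
    {
        // if/else version
        static string Pluralize(int count)
        {
            string label;
            if (count == 1)
                label = "item";
            else
                label = "items";
            return label;
        }

        // Ternary version: fewer source lines, same behaviour,
        // essentially the same generated code.
        static string PluralizeTernary(int count)
        {
            return count == 1 ? "item" : "items";
        }

        static void Main()
        {
            Console.WriteLine(Pluralize(1));        // item
            Console.WriteLine(PluralizeTernary(2)); // items
        }
    }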

Generally, dumb reduction of the number of lines of code without any actual restructuring of the code will not have any effect on the size of the project.

Restructuring the code may have an effect, but unless you have really stupid things going on, like completely unused or completely useless code, you have to be smart in how you restructure things, and it requires a lot of work.

Usually, performance and loading time improvements require more code, not less. For example, if you are hitting a certain repository a lot during startup, and this slows you down, then what will make your program load faster is the introduction of a caching mechanism that reduces the number of hits to the repository. This caching mechanism will need to be added, so that will be more code.
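
A rough sketch of what that might look like (the ICustomerRepository, Customer, and CachingCustomerRepository names here are hypothetical, purely for illustration):

    using System.Collections.Generic;

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Hypothetical repository interface.
    public interface ICustomerRepository
    {
        Customer GetById(int id);
    }

    // Wraps the real repository and remembers results, so repeated
    // lookups during startup hit the in-memory dictionary instead of
    // going back to the underlying store every time.
    public class CachingCustomerRepository : ICustomerRepository
    {
        private readonly ICustomerRepository _inner;
        private readonly Dictionary<int, Customer> _cache = new Dictionary<int, Customer>();

        public CachingCustomerRepository(ICustomerRepository inner)
        {
            _inner = inner;
        }

        public Customer GetById(int id)
        {
            Customer customer;
            if (!_cache.TryGetValue(id, out customer))
            {
                customer = _inner.GetById(id);
                _cache[id] = customer;
            }
            return customer;
        }
    }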

Getting rid of unused libraries, configuring libraries to do less during startup, and replacing a library with just a couple of functions if that's all the functionality you use from it are usually the lowest-hanging fruit.

Mike Nakis

In terms of the ternary operator, it can be slower than an if/else (Ternary ? operator vs the conventional If-else operator in c#).

For performance, look at your hotspots and at what they are doing, and consider their algorithmic complexity (their O(n) behaviour). For example, are you constantly updating and then re-sorting a list? If so, maybe you could look at a different collection type that keeps itself sorted.
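
As an illustrative sketch (not from the answer itself), a SortedSet<T> keeps its elements in order as they are inserted, so you avoid re-sorting a List<T> after every addition:

    using System;
    using System.Collections.Generic;

    class SortedInsertDemo
    {
        static void Main()
        {
            // Re-sorting a List<T> after every insert repeats work on each addition.
            var list = new List<int>();
            foreach (var value in new[] { 5, 1, 4, 2, 3 })
            {
                list.Add(value);
                list.Sort();
            }

            // A SortedSet<T> keeps its elements ordered as they are inserted,
            // so no separate sort step is needed. (Note: a set discards
            // duplicates; SortedList<TKey, TValue> is an alternative.)
            var sorted = new SortedSet<int> { 5, 1, 4, 2, 3 };

            Console.WriteLine(string.Join(", ", sorted)); // 1, 2, 3, 4, 5
        }
    }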

Or maybe you can shift some logic outside of a loop, either before or after it; nested loops in particular get expensive quickly.
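
For example (again just a sketch), an invariant calculation can be hoisted out of the loop so it runs once rather than once per iteration:

    using System;
    using System.Linq;

    class HoistDemo
    {
        static void Main()
        {
            var prices = new[] { 9.99m, 4.50m, 12.00m };

            // Before: the total is recomputed on every iteration.
            for (int i = 0; i < prices.Length; i++)
            {
                decimal total = prices.Sum(); // invariant work inside the loop
                Console.WriteLine(prices[i] / total);
            }

            // After: compute it once, outside the loop.
            decimal totalOnce = prices.Sum();
            for (int i = 0; i < prices.Length; i++)
            {
                Console.WriteLine(prices[i] / totalOnce);
            }
        }
    }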

Can you use a "fire and forget" pattern and not wait for something? Something like Task.Run(A); won't return a result to the caller, but it queues the work on the thread pool, possibly moving your hotspot to a more acceptable place.
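
A minimal sketch of that pattern, assuming a hypothetical WarmUpCache method whose result the caller never needs to wait for:

    using System;
    using System.Threading.Tasks;

    class FireAndForgetDemo
    {
        // Hypothetical expensive work the caller does not need to wait for.
        static void WarmUpCache()
        {
            // ... load reference data, prime caches, etc.
        }

        static void Main()
        {
            // Queue the work on the thread pool and carry on immediately.
            // Note: with pure fire-and-forget, unobserved exceptions are easy
            // to lose, so consider logging inside the task.
            Task.Run(() => WarmUpCache());

            Console.WriteLine("Startup continues without waiting.");
            Console.ReadLine(); // keep the process alive for the demo
        }
    }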

And, understand that as you optimize these blocks, new hotspots will appear.

To determine where your hotspots are in VS2015, follow the MSDN article https://msdn.microsoft.com/en-us/library/ms182372.aspx

2174714
  • Good answer. But any ideas on where and how to start with these hotspots? – now he who must not be named. Jul 24 '17 at 15:45
  • I would avoid saying that the ternary operator is slow. That post and answer are from 2012 and I'm unable to duplicate the results. In fact, the ternary operator comes out ahead for me given the test used in that answer. However, I can change the performance of the if statement by inverting it. So really the performance difference has nothing to do with ternary vs if and instead is really coming down to branch misprediction in the processor. This makes more sense to me since a ternary and a simple if compile down to nearly identical IL. – Kyle Jul 24 '17 at 16:34
  • The answer that you linked to (and the one it is a duplicate of) do not support your theory that the ternary operator is slower than `if/else`. – Mike Nakis Jul 24 '17 at 18:16

The short answer is: it depends.

There are things you can do to make your code more efficient, such as storing calculations in variables (if you plan on reusing the result for other tasks) or using better algorithms and data structures to navigate your data instead of iterating through each item one by one.
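
For instance (an illustrative sketch, not from the answer itself), building a dictionary once turns repeated linear scans into constant-time lookups:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class LookupDemo
    {
        static void Main()
        {
            var orders = Enumerable.Range(1, 100000)
                                   .Select(i => new { Id = i, Total = i * 1.5m })
                                   .ToList();

            // Scanning the list one item at a time: O(n) per lookup.
            var slow = orders.FirstOrDefault(o => o.Id == 99999);

            // Build a dictionary once; each subsequent lookup is O(1).
            var byId = orders.ToDictionary(o => o.Id);
            var fast = byId[99999];

            Console.WriteLine("{0} {1}", slow.Total, fast.Total);
        }
    }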

But removing unnecessary lines? Not really. Having unused methods or references doesn't really hurt actual performance. Sure, it's kind of annoying to have them there, but all they do is add to compile time and increase your project size. If your program isn't using them, then they're not adding anything to its performance issues.

Something you should consider is that maintaining DRY (Don't Repeat Yourself) code has more benefits than just "beautifying" your code. It helps with maintainability and readability, and it makes group work much, much easier. It's part of a process called "refactoring", and you can find more info here.

Capn Jack

For fun, I thought I'd try a little test.

I thought:

We know that C# gets converted to MSIL, so it makes sense that the more code there is, the more MSIL there would be and therefore the slower the load.

But, it turns out this is not the case.

I tried dynamically loading two DLLs:

  • One was nearly empty,
  • The other had 100,000+ lines of code in it.

Sample code below:

    using System.Diagnostics;
    using System.Reflection;

    private void LoadTest()
    {
        // Startup test just in case the loader needs priming....
        var dll0 = Assembly.LoadFile(@"C:\Users\me\Documents\visual studio 2015\Projects\DeleteMeApp\DeleteMe0\bin\Debug\DeleteMe0.dll");

        Stopwatch st = new Stopwatch();
        st.Start();
        // This dll is nearly empty
        var dll1 = Assembly.LoadFile(@"C:\Users\me\Documents\visual studio 2015\Projects\DeleteMeLib1\bin\Debug\DeleteMeLib1.dll");
        st.Stop();
        var time1 = st.ElapsedMilliseconds;

        Stopwatch st2 = new Stopwatch();
        st2.Start();
        // This dll has over 100,000 lines of code in it
        var dll2 = Assembly.LoadFile(@"C:\Users\me\Documents\visual studio 2015\Projects\DeleteMeApp\DeleteMeLib2\bin\Debug\DeleteMeLib2.dll");
        st2.Stop();
        var time2 = st2.ElapsedMilliseconds;
    }

Results

  • Run 1: Both 16ms
  • Run 2: Both 25ms
  • Run 3: Both 18ms

I didn't quite understand the results. I thought maybe the compiler was being extra clever and my 100,000+ lines of code were somehow being compiled away to nothing.

So, I followed this post to view the MSIL and the large file had significantly more MSIL code than the small file.

Next Test

Still in disbelief, I added 100,000 different public void methods to the 100,000+ lines of code and re-ran the test.

  • Run 1: Both 11ms
  • Run 2: Both 9ms
  • Run 3: Both 10ms

ildasm.exe became unresponsive, so I'm guessing the IL was pretty big.

Conclusion

No; it seems the number of lines of code does not relate to the load time. (This is consistent with the runtime loading assemblies lazily: the IL for a method is only JIT-compiled when the method is first called, not when the assembly is loaded.)

JsAndDotNet