
We have an issue with some of our ASP.NET applications: some of them claim a large working set from the moment they start.

Our two web-farm servers (4 GB of RAM each) run multiple applications. We have a stable environment with about 1.2 GB of memory free.

Then we add an MVC5 + Web API v2 + Entity Framework app that instantly claims over 1 GB as working-set memory, while actually using only about 300 MB. This causes other applications to complain that there is not enough memory left.

We already tried setting the limits for virtual memory and for private memory, to no avail. If we set these to about 500 MB, the app still uses more or less the same amount of memory (way over 500 MB) and does not seem to respect the limits put in place.

For reference, I tested this with an empty MVC5 project (VS2013 template), and even that claims 300 MB of memory while using only about 10 MB.

Marking the app as 32-bit seems to reduce the size of the working set somewhat.

Is there any way to reduce the size of the working set, or to enforce a hard limit on the size of it?
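For reference, this is the kind of limit we tried to set, sketched with appcmd (the app pool name is a placeholder; values are in KB, so 524288 is 512 MB):

```
REM Recycle the worker process when private memory exceeds 512 MB
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /recycling.periodicRestart.privateMemory:524288

REM Same idea for the virtual memory limit
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /recycling.periodicRestart.memory:524288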

Edit: In the case of the huge memory use for the project using Web API v2 and Entity Framework, my API controllers look like this:

namespace Foo.Api
{
    public class BarController : ApiController
    {
        private FooContext db = new FooContext();

        public IQueryable<Bar> GetBar(string bla)
        {
            return db.Bar.Where(f => f.Category.Equals(bla)).OrderBy(f => f.Year);
        }
    }
}

as they do in most tutorials I could find (including the ones from Microsoft). Wrapping the context in a `using` block does not work here because of LINQ's deferred execution: the query would only run after the context had been disposed. It could work if I added a ToList() everywhere (not tested), but does that have any other impact?

Edit 2: It works if I do:

namespace Foo.Api
{
    public class BarController : ApiController
    {
        public List<Bar> GetBar(string bla)
        {
            using (FooContext db = new FooContext())
            {
                return db.Bar.Where(f => f.Category.Equals(bla)).OrderBy(f => f.Year).ToList();
            }
        }
    }
}

Does the ToList() have any implications for the performance of the API? (I know I can no longer compose further queries cheaply, as with an IQueryable.)
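A sketch of what I mean by querying cheaply (same names as in the example above; not tested):

```csharp
// With IQueryable<Bar>, later operators are still translated to SQL:
IQueryable<Bar> q = db.Bar.Where(f => f.Category.Equals(bla));
var page = q.OrderBy(f => f.Year).Take(10); // runs as SELECT TOP(10) ... in the database

// After ToList(), everything is materialized first:
List<Bar> list = db.Bar.Where(f => f.Category.Equals(bla)).OrderBy(f => f.Year).ToList();
var pageInMemory = list.Take(10); // the full result set was already loaded; paging happens in memory
```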

Edit 3: I notice that it's the private working set of the app that is quite high. Is there a way to limit this (without causing constant recycles)?

Edit 4: As far as I know I have a Dispose override on each and every ApiController. My front end is just some simple MVC controllers, but for the most part .cshtml and JavaScript (Angular) files.

We have another app, just plain MVC with two models and some simple views (no database or other external resources that could be leaked), and this also consumes 400-500 MB of memory. If I profile it, I can't see anything that indicates a memory leak. I do see that only 10 or 20 MB is actually used; the rest is unmanaged memory that is unassigned, but part of the private working set, so claimed by this app and unusable by any other.

Kevin
  • What else are you going to use that memory for? If it's your Web server, it should be used for your Web application. There could be multiple reasons for it -- to test if it's really your application, start up a new Hello World application on an Amazon EC2 instance, and if it happens there, then you know it's the framework and not you. If it is you, then you need to profile your application and see if you're either 1) leaking memory, or 2) making too many long lived allocations. – George Stocker Feb 03 '15 at 14:35
  • 2
    By claiming that memory, it starves other applications (on the same server) from using it. There is an other app with varying memory use, that complains because all the memory is claimed. We just want to put a limit onto how much virtual memory it can claim, so there is room for other applications to use more memory if needed. – Kevin Feb 03 '15 at 14:50
  • By other application, do you mean another web app? Or something else? If it's something else, it needs to not be on that server; that's your web server. If it's another web application through IIS, then my first set of advice is key: profile your code, make sure it's not your code (it likely is). – George Stocker Feb 03 '15 at 15:20
  • 2
    It's another web application through IIS. We are profiling and we can see that a large amount (as in 700mb+) is just reserved memory for .net that is not being used. I'll try and add a screenshot later. – Kevin Feb 03 '15 at 15:23
  • Most people misinterpret memory charts... They usually don't mean what you think they do. Apps reserve a lot of memory, but if it's not committed, that memory is not actually in use. Further, it's very difficult to truly grasp how much memory an app is using, because memory is often shared between processes, such as the .NET framework itself: all apps share a common instance of the DLLs. – Erik Funkenbusch Feb 03 '15 at 22:42
  • @ErikFunkenbusch After some research I was thinking in the same direction, but even if it's not actually in use, another web app running on the same server was not able to access it when needed, so it's not "free" in the sense that another website could use it. – Kevin Feb 04 '15 at 08:40
  • @KWyckmans - it sounds like your apps are sharing the same app pool. You should put them on different pools, and thus they will have their own worker processes and thus their own address space. – Erik Funkenbusch Feb 04 '15 at 16:14
  • Each website has its own app pool with its own worker process. – Kevin Feb 04 '15 at 19:23
  • @KWyckmans What makes you think it's using that memory? Do you understand the principles by which a GC works, and how they work much better when they have twice as much memory as they need? The fact is, with a managed memory environment, you should let the environment manage the memory unless it's not working. – Aron Feb 06 '15 at 09:36
  • Yeah, I do understand this. But we saw really high use for applications that were basically nothing. I was just wondering how and why one can steer and maybe limit this process. Just saying "it's ASP.NET claiming memory" won't fly with our server admins if it's a one-pager using upwards of 500 MB. – Kevin Feb 06 '15 at 10:12

2 Answers


I had a similar problem with some of my applications. I was able to solve it by properly disposing of the database resources, wrapping them in using blocks.

For Entity Framework, that means making sure you always dispose of your context at the end of each request; connections should not be held open between requests.

using (var db = new MyEFContext())
{
    // Compose the query here.
    var query = from u in db.User
                where u.UserId == 1234
                select u.Name;

    // Execute the query. The using block disposes the context
    // properly, even though we return from inside it.
    return query.ToList();
}

You may need to wrap the context into a service that request-caches your context in order to keep it alive throughout the request, and disposes of it when complete.
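A minimal sketch of such a request-scoped wrapper (the RequestContext helper is hypothetical; it assumes classic ASP.NET where System.Web's HttpContext.Current is available, and FooContext from the question):

```csharp
using System.Web;

// Sketch: one FooContext per request, cached in HttpContext.Items.
public static class RequestContext
{
    private const string Key = "RequestFooContext";

    // Returns the context for the current request, creating it on first use.
    public static FooContext Current
    {
        get
        {
            var ctx = (FooContext)HttpContext.Current.Items[Key];
            if (ctx == null)
            {
                ctx = new FooContext();
                HttpContext.Current.Items[Key] = ctx;
            }
            return ctx;
        }
    }

    // Call this from Application_EndRequest in Global.asax so the
    // context is disposed once per request.
    public static void DisposeCurrent()
    {
        var ctx = (FooContext)HttpContext.Current.Items[Key];
        if (ctx != null)
        {
            ctx.Dispose();
            HttpContext.Current.Items.Remove(Key);
        }
    }
}
```

Controllers would then read `RequestContext.Current` instead of newing up their own context, and disposal happens in one place.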

Or, if using the pattern of having a single context for the entire controller like in the MSDN examples, make sure you override the Dispose(bool) method, like the example here.

protected override void Dispose(bool disposing)
{
    if (disposing)
    {
        db.Dispose();
    }
    base.Dispose(disposing);
}

So your controller (from above) should look like this:

namespace Foo.Api
{
    public class BarController : ApiController
    {
        private FooContext db = new FooContext(); 

        public IQueryable<Bar> GetBar(string bla)
        {
             return db.Bar.Where(f => f.Category.Equals(bla)).OrderBy(f => f.Year);
        }

        // WebApi 2 will call this automatically after each 
        // request. You need this to ensure your context is disposed
        // and the memory it is using is freed when your app does garbage 
        // collection.
        protected override void Dispose(bool disposing)
        {
            if (disposing)
            {
                db.Dispose();
            }
            base.Dispose(disposing);
        }
    }
}

The behavior I saw was that the application would consume a lot of memory, but it could garbage collect enough memory to keep it from ever getting an OutOfMemoryException. This made it difficult to find the problem, but disposing the database resources solved it. One of the applications used to hover at around 600 MB of RAM usage, and now it hovers around 75 MB.

This advice doesn't just apply to database connections: any class that implements IDisposable should be suspect if you are running into memory leaks. But since you mentioned you are using Entity Framework, the context is the most likely culprit.

NightOwl888
  • 1
    Any idea on how to use this for Web Api v2? due to linqs deferred loading this won't work as is. – Kevin Feb 03 '15 at 16:23
  • @NightOwl888 could you elaborate on the custom `IControllerFactory`? Further, would the IoC pattern be a possible solution to keep control of disposing those objects? (E.g. letting the IoC container inject services/repositories or even the specific DbContext into the controllers, and by doing that, the IoC container takes control of the disposing)? Just an idea... – Yves Schelpe Feb 03 '15 at 22:32
  • @YvesSchelpe - using DI will only help to solve this if the container you use has a way to wire up a per-request object and has a mechanism to dispose of it properly. Not all of them do. IMHO, fixing memory leaks is not a valid reason to start using DI, but if you are using DI you must still ensure that unmanaged resources (such as database connections and file streams) are disposed properly, it is just the way you ensure this shifts to a different part of the application. – NightOwl888 Feb 04 '15 at 12:33
  • @NightOwl888 Of course fixing memory leaks with DI is not the intended way. That's not what I meant; it was more in the mindset of the question KWyckmans asked you above. For one, I know that the StructureMap and Autofac containers do handle disposing of objects, provided you set them up properly, of course. Anyway, as I said, it was to let it fit in the scenario KWyckmans asked you about. – Yves Schelpe Feb 04 '15 at 12:56
  • I think I have this implemented on every api controller I use, but I will have to recheck. Thanks for the additional info. – Kevin Feb 04 '15 at 19:25
  • I'm trying to find the culprit of high memory usage for a server currently and this was the first thing that occurred to me to, however reading more about it suggests that disposing of DataContexts is not actually all that important and you'll probably want to keep looking for the true source of the memory problems: https://blog.jongallant.com/2012/10/do-i-have-to-call-dispose-on-dbcontext/ – Jamie Twells Mar 20 '18 at 16:48

Removing all Telerik Kendo MVC references (DLLs and such) fixed our problems. If we run the application without them, all our memory problems are gone and we see normal memory use.

Basically: it was an external library causing high memory use.

Kevin