
I have a running service that gets 50-100 queries per minute, and these are not high-cost queries. The service has been running for around 3-4 months without any errors.

Then, a few days ago, it suddenly started occasionally giving the error "There is insufficient system memory in resource pool 'default' to run this query." When I investigated the problem, I saw that sqlservr.exe was using ~1.5 GB of RAM and 25% of the CPU (i.e., all of one of the four cores). When I restart sqlservr.exe, its RAM usage starts at ~50 MB and slowly increases until it reaches ~1.5 GB again, which then leads to crashes in the apps using it.

After a bit of research, I figured it was caused by the edition of SQL Server I was using: Express edition imposes those limits. So I upgraded from SQL Server 2008 R2 Express to 2012 Enterprise. When I started the service I thought my problems were finally over, since it was using only ~60 MB of memory, but within an hour the same problem started occurring. This time, though, the memory usage I see in Windows Task Manager stays at ~60 MB, not exceeding any limit.

I use Entity Framework as the ORM in a WCF service, and along with it I have a SqlQueryNotification (Service Broker, etc.) system for some caching operations.

Am I missing some crucial configuration? Or are 6 GB of memory and 4 CPUs really not enough for this? It can't be that, though, because the same load ran for three months without any errors, and there haven't been any code changes either.

(screenshot: RAM allocated)

(screenshot: RAM usage)

Tolga Evcimen
    You simply upgraded from Express to Enterprise? For free? Where can I get this amazing deal?! – DavidG Aug 12 '14 at 14:22
  • show your code, you must be exhausting resources on server by not cleaning up connections??? – T McKeown Aug 12 '14 at 14:23
  • 3
    If you hava company, microsoft offers it with bizspark program. I suggest you to check it. – Tolga Evcimen Aug 12 '14 at 14:24
  • What is MaxMemory for SQL Server set to? Is this 32-bit or 64? – David Crowell Aug 12 '14 at 14:25
  • I am using `Using` statements every time I connect to the db, so I am sure I am not letting any connection stay open. My system is 64-bit, and maximum server memory is set to 6144 MB, which I set from Server Properties. – Tolga Evcimen Aug 12 '14 at 14:28

4 Answers


SQL Server will start with as much memory as it needs, then slowly ramp up until it is using all of the memory allocated to it in Server Properties:

(screenshot: Server Properties > Memory page)

It will not release any of this memory until the service is restarted; this is by design.

It is generally recommended to leave around 2 GB for the OS, and you need to be mindful of any other processes running on the same server as SQL Server. It is usually recommended to have SQL Server on its own server, with nothing else running there.

That said, 1.5 GB of RAM for SQL Server isn't that much. If you don't have more available, it may be time to add some or upgrade the server.
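To see what SQL Server is actually using (Task Manager can under-report this, for example when SQL Server uses locked pages), you can query the DMVs; a minimal diagnostic sketch, assuming SQL Server 2008 or later:

```sql
-- Memory SQL Server actually has committed, regardless of what
-- Task Manager shows for sqlservr.exe.
SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;

-- The configured ceiling, for comparison.
SELECT value_in_use AS max_server_memory_mb
FROM sys.configurations
WHERE name = 'max server memory (MB)';
```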

See also: Technet, Brent Ozar

Dave.Gugg
  • No, that's not the case. On this machine I have 12 GB of RAM right now, and a lot of memory available; I can increase the memory easily at any time. The funny thing is that now sqlservr.exe never seems to use more than 70 MB. – Tolga Evcimen Aug 12 '14 at 14:32
  • But it still crashes with that memory exception in around an hour. – Tolga Evcimen Aug 12 '14 at 14:33
  • @TolgaEvcimen are there other programs running on this server? How much physical memory does resmon.exe say you are using? – Dave.Gugg Aug 12 '14 at 14:34
  • @TolgaEvcimen how much memory do sys.dm_os_sys_info and sys.dm_os_performance_counters indicate SQL is using, and how much is available? – Dave.Gugg Aug 12 '14 at 14:38
  • ummm, I guess I don't know how to check them. Can you show me the way? – Tolga Evcimen Aug 12 '14 at 14:38
  • To check resmon.exe, just type resmon into the search on the windows start menu. This will give more details on the ram being used. The other two are dynamic management views, and can be queried in SQL Server as normal: select * from sys.dm_os_sys_info. – Dave.Gugg Aug 12 '14 at 14:41
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/59200/discussion-between-dave-gugg-and-tolga-evcimen). – Dave.Gugg Aug 12 '14 at 14:42
  • I just installed SQL Server 2014 on my machine. This `maximum server memory (in MB)` setting is currently set to `2147483647` by default. This comes out to be 2097151 GB of RAM. Is this value even practical by any means for a database server box? – RBT Oct 04 '16 at 06:49
  • "It will not release any of this memory until the service is restarted; this is by design." " - that statement is incorrect. Try running on a VM without a host memory reservation set.... – Mitch Wheat Dec 21 '16 at 04:54

I am posting this answer because someone may find it useful.

You can also set max server memory with this query:

sp_configure 'show advanced options', 1;  
GO  
RECONFIGURE;  
GO  
sp_configure 'max server memory', 32768;   -- for 32 GB
GO  
RECONFIGURE;  
GO

You can also use this query to release memory without restarting the service: simply set the value down to 2 GB (2048), then change it back to 32 GB (32768), or to whatever value you want.

Please do not set it below 1 GB, because that will cause errors when executing queries, and you will then need to restart the service before you can execute any query at all, including the query to increase the memory again.
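As a sketch, the release trick described above looks like this (using the example values from this answer; pick a ceiling that fits your server, and stay above 1 GB):

```sql
-- Temporarily lower the ceiling so SQL Server trims its caches...
EXEC sp_configure 'max server memory', 2048;   -- 2 GB; never go below 1 GB
RECONFIGURE;
GO
-- ...then restore the intended ceiling.
EXEC sp_configure 'max server memory', 32768;  -- back to 32 GB
RECONFIGURE;
GO
```

This assumes 'show advanced options' is already enabled, as in the block above.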

Besnik Kastrati
  • Be careful with this command: you might still need to **restart the service** despite what the author claims. It just happened to me on a production system. – Damian Vogel Feb 18 '21 at 10:59
  • Any ideas how to test beforehand whether a given instance will not faint after decreasing `max server memory`? – Anton Shepelev Jul 04 '22 at 07:38

In our case it was because of memory-optimized table types holding huge amounts of data. There were multiple simultaneous calls to different stored procedures, each using the same table type and loading a huge number of records into it (>100,000). For our application there was a way to reduce the number of records inserted into the memory-optimized table type: instead of storing all selected items in it, we conditionally stored only the non-selected records.

CREATE TYPE [TestType] AS TABLE (
    [TestOrder]      VARCHAR (1500)   NOT NULL,
    [TestDepartment] UNIQUEIDENTIFIER NULL,
    [TestCourse]     UNIQUEIDENTIFIER NULL,
    [TestStudent]    UNIQUEIDENTIFIER NULL,
    INDEX IX_Test NONCLUSTERED (TestOrder)
)
WITH (MEMORY_OPTIMIZED = ON);
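A sketch of the mitigation described above, using the `TestType` definition; the source table and the `IsSelected` flag are hypothetical names for illustration:

```sql
DECLARE @NotSelected AS [TestType];

-- Instead of loading every row into the memory-optimized table type,
-- insert only the (smaller) non-selected set to keep memory usage down.
INSERT INTO @NotSelected (TestOrder, TestDepartment, TestCourse, TestStudent)
SELECT t.TestOrder, t.TestDepartment, t.TestCourse, t.TestStudent
FROM dbo.Tests AS t          -- hypothetical source table
WHERE t.IsSelected = 0;      -- hypothetical selection flag
```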
Baga
  • I inserted a lot of data (around 1 million rows) into the table type from an SP, and that caused this problem. Could you tell me how you skipped inserting data? – Jason Li Sep 17 '20 at 15:32

In my case this was failing only for specific queries, so I changed "max memory per query" to 2048 and it started working.

Ashish
  • There is no option called "maximum memory per query"; I believe you are referring to "minimum memory per query". – Ajay Meda Mar 28 '23 at 21:37