
It starts w/the proverbial:

[Notes - F1 [107]] Error: An error occurred with the following error message: "System.OutOfMemoryException: Insufficient memory to continue the execution of the program. (SSIS Integration Toolkit for Microsoft Dynamics 365, v10.2.0.6982 - DtsDebugHost, v13.0.1601.5)".

But even in its own diagnostics, it shows that plenty of memory is available (yes, that's the 32GB I have on my system):

Error: The system reports 47 percent memory load. There are 34270687232 bytes of physical memory with 18094620672 bytes free. There are 4294836224 bytes of virtual memory with 981348352 bytes free. The paging file has 34270687232 bytes with 12832284672 bytes free.

The info messages report memory pressure:

Information: The buffer manager failed a memory allocation call for 506870912 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

I currently have the max rows set at 500 w/the buffer size at 506,870,912 in this example. I've tried the maximum buffer size, but that fails instantly, and the minimum buffer size still throws errors. I've fiddled w/various sizes, but it never gets anywhere close to processing the whole data set. The error I get when I set the DefaultBufferSize lower is:

[Notes - F1 [107]] Error: An error occurred with the following error message: "KingswaySoft.IntegrationToolkit.DynamicsCrm.CrmServiceException: CRM service call returned an error: Failed to allocate a managed memory buffer of 536870912 bytes. The amount of available memory may be low. (SSIS Integration Toolkit for Microsoft Dynamics 365, v10.2.0.6982 - DtsDebugHost,

I've looked for resources on how to tune this, but cannot find anything relevant to a 64-bit Windows 10 machine (as opposed to a server) that has 32GB of RAM to play with.

For a bit more context, I'm migrating notes from one CRM D365 environment to another using Kingsway. The notes w/attachments are the ones causing the issue.
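As a rough sanity check on the buffer settings (this is not the exact SSIS sizing algorithm, and the attachment sizes below are assumptions rather than measurements from this package), the arithmetic of rows-per-buffer times attachment size shows why 500 notes with file bodies can approach a 512MB allocation:

```python
# Back-of-the-envelope only - not the exact SSIS buffer-sizing algorithm.
# The average/outlier attachment sizes are assumptions, not measured values.

default_buffer_size = 536_870_912        # 512 MB, the size from the second error
default_buffer_max_rows = 500            # the package's current max rows setting
avg_attachment = 1 * 1024 * 1024         # assume ~1 MB per note attachment

# If each row carries its attachment bytes, one full buffer of rows needs:
typical = default_buffer_max_rows * avg_attachment
print(f"500 rows x 1 MB  ~= {typical / 1024**2:.0f} MB")

# A few 40-50 MB files landing in the same buffer makes it much worse:
worst = typical + 5 * 50 * 1024 * 1024
print(f"plus five 50 MB files ~= {worst / 1024**2:.0f} MB "
      f"(vs. a {default_buffer_size / 1024**2:.0f} MB buffer)")

# Dropping the row cap shrinks the worst case directly:
print(f"50 rows x 1 MB   ~= {50 * avg_attachment / 1024**2:.0f} MB")
```

If that arithmetic is roughly right, cutting DefaultBufferMaxRows puts a much tighter cap on the worst case than cutting DefaultBufferSize alone.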

Screenshots: Properties, Execution, Source, Destination.

Phrozt
  • Five hundred and six megabytes is a very large memory allocation, irrespective of the program that's making the request. I would be asking myself why such a large request is occurring. There are more constraints on memory than just total free space. For one thing, memory allocations must be contiguous, and there's no guarantee that at any given moment the operating system will have a block of memory available on the heap that is that large. `int` indexes only go up to about 2 gigabytes. Consider filing a bug with Microsoft. – Robert Harvey Nov 01 '18 at 21:38
  • Also, re-reading your question, SSIS is crashing precisely because you requested such a large buffer size. Maybe you don't need a buffer that large? Try 1 megabyte and see what happens. If it's still asking for a half gig, restart SSIS; failing that, restart the OS. – Robert Harvey Nov 01 '18 at 21:41
  • See also https://stackoverflow.com/q/7393027, which explains that it might actually be the SQL Server that is at fault. – Robert Harvey Nov 01 '18 at 21:42
  • Running in Visual Studio or launched from a command line/SQL server? What does that data flow itself look like? – billinkc Nov 01 '18 at 21:50
  • Robert, I mentioned that I tried the smallest DefaultBufferSize and I get the last error. It seems as though something is not releasing the memory, because even when I have everything set low - buffer size, max rows, batch size, threads used - it still eventually gives me the memory error at some point. Also, I have seen the link you have in your third comment there. The source is CRM online and the destination is a different CRM online instance. I don't have control over either one. – Phrozt Nov 01 '18 at 23:34
  • Working on getting screenshots of the actual execution, but for now: ![DTS Properties](https://imgur.com/J7sNXBw). ![Source](https://imgur.com/U5bhalD). ![Dest](https://imgur.com/WB3o3BZ). – Phrozt Nov 01 '18 at 23:51
  • Here's a shot of the execution. For some context, there should be around 32K notes in the source. Most will be duplicates, because usually this package gets to where it was last time, gets another few thousand in, and then fails out w/a memory error. Each Note *has* a file in it (I parsed out all notes w/out a file in a different DTF), but the vast majority are under 1MB in size. Only 880 are larger than 1MB ![Execution](https://i.imgur.com/XyVFYa0.png) – Phrozt Nov 02 '18 at 00:57
  • Is DevEnv running in 32 bit or 64 bit mode? Have you set `Run64BitRuntime` to True? – Nick.Mc Nov 02 '18 at 04:02
  • I've been running the debugger in 64 bit, but since I'm running from VS, I think it's defaulting to 32bit anyway. I'm currently looking for some way to cap the memory usage so it doesn't go over that threshold. There are only a few files that are 40-50MB, so there's no reason to have to use all of the available memory. I'm also currently trying to run it as slow/low as possible.... I don't care about the run time because this is a 1 time migration, I just need it to complete. – Phrozt Nov 02 '18 at 06:19
  • When you say you try to set the buffer to a lower size, look at the error: _Failed to allocate a managed memory buffer of 536870912 bytes_ - it's still trying to allocate a big chunk of memory (more in fact). You need to experiment and find out how to tell it to use a smaller buffer because at the moment it isn't. – Nick.Mc Feb 18 '19 at 22:56
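Picking up the 32-bit/64-bit point from the comments above: a 32-bit DtsDebugHost has only about 2GB of user address space (roughly 4GB if large-address-aware), and a 512MB allocation must be contiguous within it, so it can fail even when Windows reports gigabytes of free physical RAM. A rough illustration, with the committed-memory figure assumed rather than measured:

```python
# Illustration of why a 512 MB contiguous request is risky in a 32-bit process.
# Everything except the 512 MB request size is an assumption, not a measurement.

request = 536_870_912                 # the failing allocation from the error
user_space_32bit = 2 * 1024**3        # ~2 GB user address space for a 32-bit process
already_used = 800 * 1024**2          # assumed: DLLs, CLR heaps, existing buffers

free_on_paper = user_space_32bit - already_used
print(f"32-bit: ~{free_on_paper / 1024**2:.0f} MB left on paper, "
      f"but a single {request / 1024**2:.0f} MB hole may not exist (fragmentation)")

# In a 64-bit process the same request is a rounding error of the address space,
# which is why setting Run64BitRuntime = True (a 64-bit DtsDebugHost) can help.
user_space_64bit = 128 * 1024**4      # on the order of 128 TB on modern Windows
print(f"64-bit: request is {request / user_space_64bit * 100:.6f}% of user address space")
```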

2 Answers


I have had this problem before and it was not the physical memory (i.e., RAM), but the physical disk space where the database is stored. Check to see what the available hard drive space is on the drive that stores both the database and transaction log files - chances are that it is full and therefore unable to allocate any additional disk space.

In this context, the error message citing 'memory' is a bit misleading.
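If you want to rule that out quickly, a free-space check on the volume that holds the data and log files is enough; the drive letter below is an assumption, so point it at whichever drive actually hosts the MDF/LDF files:

```python
# Quick free-space check for the drive holding the database and log files.
# The drive letter is an assumption - substitute the volume that hosts MDF/LDF.
import shutil

usage = shutil.disk_usage("D:\\")
print(f"total: {usage.total / 1024**3:.1f} GB")
print(f"used:  {usage.used  / 1024**3:.1f} GB")
print(f"free:  {usage.free  / 1024**3:.1f} GB")
```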

UPDATE

I think this is actually caused by having too much data in the pipeline buffer. You will either need to look at expanding the buffer's memory allocation (i.e., DefaultBufferSize) or take a look at what data is flowing through the pipeline. A typical cause is a lot of columns with large NVARCHAR() character counts. Copying the rows with Multicast will only compound the problem. As for the third-party component you are using, your guess is as good as mine because I have not used it.
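To make the wide-column point concrete (the column counts and widths below are assumptions for illustration, not values taken from this package), the estimated row width scales with NVARCHAR length times two bytes per character, and anything that duplicates rows downstream multiplies the total:

```python
# Rough row-width arithmetic for the point above.
# Column counts and widths are assumptions, not values from the actual package.

nvarchar_length = 4000        # e.g. NVARCHAR(4000)
bytes_per_char = 2            # NVARCHAR is UTF-16, so 2 bytes per character
wide_columns = 10             # assumed number of wide text columns per row
rows_per_buffer = 500         # DefaultBufferMaxRows from the question

row_bytes = wide_columns * nvarchar_length * bytes_per_char
buffer_bytes = rows_per_buffer * row_bytes
print(f"~{row_bytes / 1024:.0f} KB per row, ~{buffer_bytes / 1024**2:.0f} MB per buffer")

# Per the answer above, copying rows (e.g. a Multicast feeding two paths)
# compounds this; as a crude upper bound, treat it as roughly doubling:
print(f"~{2 * buffer_bytes / 1024**2:.0f} MB with the data flowing down two paths")
```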

J Weezy

For anyone coming along later:

The error says "CRM service call returned an error: Failed to allocate a managed memory buffer of 536870912 bytes". I understood it to be the CRM Server that had the memory issue.

Regardless, we saw this error when migrating email attachments via the ActivityMimeAttachment entity. The problem appeared to be related to running the insert into the target CRM with too large a batch size and/or with multi-threading enabled.

We set the batch size to 1 and turned off the multi-threading and the issue went away. (We also set the batch size to 1 on the request from the source - we saw "service unavailable" errors from an on-premise CRM when the batch size was too high and the attachments were too large.)
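For anyone who wants to see what that configuration amounts to in code terms, it is simply a serial loop moving one attachment per service call; the migrate_attachment() helper below is a hypothetical placeholder, not a KingswaySoft setting or a Dynamics SDK API:

```python
# Sketch of "batch size 1, no multi-threading" expressed as a plain loop.
# migrate_attachment() is a hypothetical placeholder, not a real API call.

def migrate_attachment(attachment_id: str) -> None:
    """Placeholder: read one ActivityMimeAttachment from the source CRM
    and create it in the target CRM."""
    raise NotImplementedError  # implementation depends on your tooling

def migrate_all(attachment_ids: list[str]) -> None:
    # Sequential, one record per call: slower, but only one attachment's bytes
    # are in flight at a time, which avoids the large allocations seen above.
    for attachment_id in attachment_ids:
        migrate_attachment(attachment_id)
```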

mwardm