
I have just restored a .bacpac file into a local SQL Server instance (64-bit, v12.0.4213); the backup is from an Azure SQL instance.

It failed a few times with an out-of-memory exception. I shut down everything else on my machine, and by the end of the restore the SQL Server service was consuming 13 GB of memory from a 700 MB file!

Luckily the restore finished, but the memory doesn't seem to be freed up/garbage collected; it's still sitting at 12 GB as I write this.

Is this a known issue? And is there any way to restore a .bacpac while selecting a table to ignore? You can do this with a normal data restore. The most offensive table was dbo.[Logs], obviously.

tigerswithguitars
  • Have you tried sqlcmd instead of SSMS? It's quite strange for such a small database, but SSMS is known to have problems with large queries, database restores, etc. "but it seems the memory is not being freed up/garbage collected" - the memory used by SQL Server or SSMS? – Alex Yu Jan 07 '16 at 11:04
  • SQL Server rather than SSMS; it's associated in the Windows performance tool with the SQL Server service instance. – tigerswithguitars Jan 07 '16 at 11:08
  • 1
    SQLServer is memory hungry - when he gets memory, he almost never give it back (and for good reasons). About your problem: not your situation exactly, but have you tried workarounds from https://support.microsoft.com/en-us/kb/2874903 ? – Alex Yu Jan 07 '16 at 11:14

4 Answers


I had the same issue; amending the memory available to the server had no impact.

For me the resolution was to use the command line (PowerShell) to perform the import.

[string]$myBacpac = 'C:\temp\myBacpac123.bacpac'
[string]$connectionString = 'Data Source=.;Initial Catalog=MyNewCatalog;Integrated Security=true;'
[string]$action = 'Import'

[string[]]$commandParameters = @(
    "/Action:`"$action`""
    "/SourceFile:`"$myBacpac`""
    "/TargetConnectionString:`"$connectionString`""
)

# Find the most recent SqlPackage.exe installed under any SQL Server directory
[string]$latestSqlPackage = Get-Item 'C:\*\Microsoft SQL Server\*\DAC\bin\SqlPackage.exe' |
    ForEach-Object { Get-Command $_ } |
    Sort-Object Version -Descending |
    Select-Object -ExpandProperty Source -First 1

if ($latestSqlPackage) {
    Write-Verbose "Found: $latestSqlPackage"
    & $latestSqlPackage $commandParameters
} else {
    Write-Error 'Could not find SqlPackage.exe'
}

On my first attempt I received an error regarding an unsupported model version:

Importing to database 'MyNewCatalog' on server '.'.
Creating deployment plan
Initializing deployment
SqlPackage.exe : * Error importing database: Could not read schema model header information from package.
At line:1 char:1
+ & $sqlPackage /Action:Import /SourceFile:"c:\temp\myBacpac123.bacpac" /T ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (* Error impor...n from package.:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError
The model version '3.5' is not supported.

For that error I followed the guidance here: https://stackoverflow.com/a/40541210/361842, i.e. installed the Microsoft SQL Server Data-Tier Application Framework (16.4). On rerunning, all was successful.

JohnLBevan
  • 1
    This should be the accepted answer, also I downloaded the latest version of DacFx from https://github.com/microsoft/DacFx – rfcdejong Aug 12 '22 at 15:07

To configure SQL Server's memory usage, open SQL Server Management Studio, connect to the server, right-click the server in the Object Explorer window, click Properties, and then click the Memory page of the Server Properties window.
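If you'd rather script it, the same setting can be changed with sp_configure. A minimal sketch (the 4096 MB cap is an example value, not a recommendation, and sqlcmd is assumed to be on the PATH):

```shell
# Build the T-SQL equivalent of the Server Properties > Memory page.
# 'max server memory (MB)' caps the buffer pool; 4096 MB is just an example.
TSQL="EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 4096; RECONFIGURE;"
echo "$TSQL"
# Against a live local instance you would run:
#   sqlcmd -S . -E -Q "$TSQL"
```

Note that capping max server memory limits the buffer pool; it doesn't force SQL Server to release memory it has already acquired, though it will stop it growing past the cap.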

As for the bacpac: you can't select which tables to restore during an import operation, but you can select which tables are exported. Use SqlPackage.exe's Export action with the /p:TableData parameter to specify which tables' data should be included in the bacpac. Unfortunately, there's no way to specify only the tables that should be excluded. =^/
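For example, an export that keeps only the data of the tables you name (server, database, path, and table names below are placeholders; tables not listed still have their schema exported, but their data is skipped):

```shell
# Hypothetical export excluding dbo.Logs data: /p:TableData is repeated once per
# table whose data SHOULD be included, using schema-qualified names.
SqlPackage.exe /Action:Export /SourceServerName:. /SourceDatabaseName:MyDb /TargetFile:"C:\temp\MyDb-no-logs.bacpac" /p:TableData=dbo.Users /p:TableData=dbo.Orders
```

Specifying /p:TableData even once switches the export from "all tables" to "only the listed tables", so you have to enumerate everything you want to keep.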

SqlPackage.exe documentation is available here: https://msdn.microsoft.com/en-us/hh550080(v=vs.103).aspx

Steven Green
  • Great input. The team is moving the logs out of the database, hallelujah, so I think I will update the export to not include them. If the worst happens, they are available on logentries going forward and in historical backups. Thanks. – tigerswithguitars Jan 08 '16 at 17:01

Neither of the other answers worked for me; what did work was closing and restarting SSMS. This sounds like a silly suggestion, but I'd previously been running some large queries, which must have caused memory issues.

WheretheresaWill

Updating SSMS to the latest version fixed this issue for me.

midhun