I have a very simple PowerShell script that uploads a generated test file to an AWS S3 bucket from a Windows 2008 R2 Datacenter server (a clean AWS instance). If I run the script remotely on the server through Terraform's remote-exec provisioner, it fails on the S3 upload with a StackOverflowException. When I run the script directly on the server, it runs fine and uploads the file.
I've experimented with different file sizes, and roughly 14.5MB seems to be the largest that uploads successfully before the StackOverflowException occurs. When I RDP into the server and run the script directly, just about any size works; I've tested up to 200MB without a problem.
Any idea why this is happening or what I can do to fix it? The actual file I need to upload is 50MB.
Here are the essential parts to recreate the problem. The terraform.tf file:
resource "aws_instance" "windows" {
count = "1"
ami = "ami-e935fc94" #base win 2008 R2 datacenter
instance_type = "t2.micro"
connection {
type = "winrm"
user = "<username>"
password = "<password>"
timeout = "30m"
}
provisioner "file" {
source = "windows/upload.ps1"
destination = "C:\\scripts\\upload.ps1"
}
provisioner "remote-exec" {
inline = [
"powershell.exe -File C:\\scripts\\upload.ps1"
]
}
}
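I haven't yet ruled out whether this is specific to Terraform's WinRM channel or happens over any remote WinRM session. The equivalent direct invocation I plan to try next (hostname and credentials are placeholders) would be something like:

# Untested sketch: run the same script over a plain WinRM session, outside
# of Terraform, to see whether the StackOverflowException still occurs.
# "<instance-dns>" and the credential prompt are placeholders.
$cred = Get-Credential
Invoke-Command -ComputerName "<instance-dns>" -Credential $cred -ScriptBlock {
    powershell.exe -File C:\scripts\upload.ps1
}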
The PowerShell script, upload.ps1, is very simple:
# Generate a test file of the desired size, then upload it
$f = New-Object System.IO.FileStream "C:\Temp\test.dat", Create, ReadWrite
$f.SetLength(40MB) # change this to 14.5MB and it works!
$f.Close()

Write-S3Object -BucketName "mybucket" -Folder "C:\Temp" -KeyPrefix "20180322" -SearchPattern "*.dat"
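For what it's worth, the real use case only needs a single 50MB file, so a single-file upload would suit me just as well (the key name below is illustrative); I haven't confirmed whether it behaves any differently from the folder upload:

# Alternative single-file upload I could use instead of the folder upload;
# I don't yet know whether it hits the same StackOverflowException remotely.
Write-S3Object -BucketName "mybucket" -File "C:\Temp\test.dat" -Key "20180322/test.dat"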
The error that I receive when launching the script from Terraform's remote-exec provisioner:
aws_instance.windows (remote-exec): Process is terminated due to StackOverflowException.
Running upload.ps1 from an RDP session on the server itself works fine, including larger files (tested up to 200MB).
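One workaround I'm considering, on the guess that the WinRM-hosted PowerShell thread has a smaller stack than an interactive session, is to launch the upload in a fresh powershell.exe process from inside the provisioned script. I don't know yet whether a child process actually gets a larger stack, so this is just a guess:

# Untested guess at a workaround: run the upload in a separate process,
# in case the remote session's thread stack size is the limiting factor.
Start-Process -FilePath "powershell.exe" `
    -ArgumentList "-File", "C:\scripts\upload.ps1" `
    -Wait -NoNewWindow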
Here is the version information:
Microsoft Windows Server 2008 R2 Datacenter
PowerShell Version: 3.0
AWS Tools for Windows PowerShell, Version 3.3.245.0
Amazon Web Services SDK for .NET, Core Runtime Version 3.3.21.15