
I would like to create an Azure task that makes an HTTP GET request to an API every 5 minutes (let's say). The file returned by this request will be stored in a storage container. However, I do not know the best way to do this.

What I have tried:

  • created a storage container in the Azure Portal (for receiving the returned file)
  • tried creating a pipeline in Azure DevOps. Not sure where to add this task there.
  • read up on Azure resources to see how to create a task or batch API.

The problem is that I'm new to Azure and do not yet understand the architecture of how this will all work, so I am not even sure what keywords to google.

What's the best way to do this in Azure? Is this done in Azure DevOps or the Azure Portal? Which is the better option? And lastly, and most importantly, could you provide a simple example HTTP GET task that would work in the Azure environment?

learnerX
  • Hi, is there any update on this issue? Does `Krzysztof Madej`'s answer help to resolve your issue? If yes, please consider accepting it as the answer. And please feel free to let me know if you're still blocked. – LoLance Sep 08 '20 at 09:24
  • almost there, except the part where I need to redirect the file returned (as response to function call) to the storage container. I looked at the output bindings but only been able to write string values to the storage container so far. I need to write the file to the storage container somehow using powershell – learnerX Sep 09 '20 at 02:07
  • Maybe [AzCopy](https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10) or [az storage file copy](https://learn.microsoft.com/en-us/cli/azure/storage/file/copy?view=azure-cli-latest#az-storage-file-copy-start) via command-line? – LoLance Sep 09 '20 at 02:14
  • I think `azcopy` and `az storage file copy` will require me to save the file first? And then copy this saved file to the storage container? In that case, does azure allow a function app to do an `OutFile` to save to temp storage? – learnerX Sep 09 '20 at 02:17
  • Yes, these two tools need us to save the files first. Hmm, but I'm not sure if this is allowed in Azure since I'm more familiar with Azure DevOps... – LoLance Sep 09 '20 at 02:19

2 Answers


You should be able to handle this using scheduled triggers:

# YAML file in the master branch
schedules:
- cron: "*/5 * * * *"
  displayName: Run every 5 mins
  branches:
    include:
    - master

And you can use this task or just call the REST API via PowerShell:

Invoke-RestMethod -Uri 'https://cat-fact.herokuapp.com/facts' -Headers @{ 'Authorization' = 'Bearer xxxxxxxxxxxxxxxx' }

However, I would recommend you use Azure Functions for this purpose. You can use a timer trigger there, choose from a variety of languages to make the REST API call, and you also get integration with Azure Blob storage, which makes saving the responses very easy.
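For the timer-trigger route, the schedule is a six-field NCRONTAB expression in the function's `function.json`. A minimal sketch (the binding name is illustrative) that fires every 5 minutes:

```json
{
    "bindings": [
        {
            "name": "Timer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 */5 * * * *"
        }
    ]
}
```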

Assuming you have defined bindings as follows:

{
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get",
                "post"
            ]
        },
        {
            "type": "blob",
            "name": "inBlob",
            "direction": "in",
            "dataType": "binary",
            "path": "samples-input/text.zip"
        },
        {
            "type": "blob",
            "name": "outBlob",
            "direction": "out",
            "path": "samples-output/string.txt"
        }
    ]
}

and then

Push-OutputBinding -Name outBlob -Value $someValue

Your function will fail if it isn't able to save to the storage account, so you don't need any specific code to check whether the content was saved.
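Putting it together, here is a sketch of a `run.ps1` that downloads the response to a temporary file and pushes its bytes to the `outBlob` binding; the URL and file name are placeholders, and `Push-OutputBinding` is provided by the Azure Functions PowerShell worker:

```powershell
# run.ps1 — assumes the binding configuration above (outBlob) and the
# Azure Functions PowerShell worker, which provides Push-OutputBinding.
param($Timer)

# Download the response to a temporary file (URL is a placeholder).
$tempFile = Join-Path $env:TEMP 'response.xml'
Invoke-RestMethod -Uri 'https://example.com/api/export' -OutFile $tempFile

# Read the file back as a byte array so the blob binding stores the
# raw content rather than a string representation.
$bytes = [System.IO.File]::ReadAllBytes($tempFile)

# Write the bytes to the blob output binding; the function fails if
# the write to the storage account does not succeed.
Push-OutputBinding -Name outBlob -Value $bytes
```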

Krzysztof Madej
  • thanks. I made an Azure function in Powershell that uses the `Invoke-RestMethod` and `OutFile` to save the response to a file that I name `test.xml`. The function runs with no error (also tested on my local machine). However, how can I verify this file was successfully saved in the function's related storage container? – learnerX Sep 04 '20 at 03:36
  • Please check output bindings here https://learn.microsoft.com/pl-pl/azure/azure-functions/functions-reference-powershell?tabs=portal – Krzysztof Madej Sep 04 '20 at 05:48
  • I successfully wrote a string value using `Push-OutputBinding` and `-Value`. However, how do I push the actual returned file here? Looks like `-Value` only accepts strings? – learnerX Sep 04 '20 at 14:51
  • I just edited my answer. It looks that you may have an issue with putting stream into blob https://github.com/Azure/azure-functions-host/issues/4264 – Krzysztof Madej Sep 04 '20 at 14:55
  • https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell?tabs=portal please read your file as a byte array and put that as value – Krzysztof Madej Sep 04 '20 at 14:59

The whole process:

1. As Krzysztof Madej suggested, you can use a YAML pipeline with `- cron: "*/5 * * * *"` to run the pipeline every five minutes. But according to your description, you should also enable `always: true` so that your pipeline runs even if there are no changes in the source repo.

2. Also, to use the schedule in that format, you need to disable any CI trigger or PR trigger.

If you want to run your pipeline by only using scheduled triggers, you must disable PR and continuous integration triggers by specifying pr: none and trigger: none in your YAML file.
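In YAML, the combination from steps 1 and 2 looks like this (the branch name `master` is assumed, as above):

```yaml
# Disable CI and PR triggers so only the schedule runs the pipeline.
trigger: none
pr: none

schedules:
- cron: "*/5 * * * *"
  displayName: Run every 5 mins
  branches:
    include:
    - master
  always: true
```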

3. Apart from the schedule (every 5 minutes), you also need a PowerShell task to call the REST API, get the response, and then write the response to a newly created JSON file (or text file).

4. After that, you can use the Azure File Copy task to upload the file that contains the response to the Azure Storage container.

Here's my minimal working example:

pool:
  vmImage: 'windows-latest'

schedules:
- cron: "*/5 * * * *"
  displayName: Run every 5 mins
  branches:
    include:
    - master
  always: true

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      #Call the rest api.
      $url = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)_apis/projects/$env:SYSTEM_TEAMPROJECTID/teams?api-version=5.1"
      $response = Invoke-RestMethod -Uri $url -Method Get -Headers @{
          Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
      }
      #Get the response and then pass it into one specific file.
      write-host $($response | ConvertTo-Json -Depth 100)
      $response | ConvertTo-Json -depth 100 | Out-File "$(System.DefaultWorkingDirectory)\Backup-$(Build.BuildId).json"
  env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)

- task: AzureFileCopy@4
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)\Backup-$(Build.BuildId).json'
    azureSubscription: 'xxx'
    Destination: 'AzureBlob'
    storage: 'xxx'
    blobPrefix: 'xxx'
    ContainerName: 'xxx'

In addition:

Here's a similar topic about how to back up using AzureFileCopy. And you can refer to this link if you meet permission issues with the AzureFileCopy task.

LoLance