
I have an Azure Static Web App. I want to upload a daily generated *.CSV file from my local machine, convert it to *.json format with an Azure Function (daily as well), store it, and then pass it to the static web app on an HTTP trigger request.

How can I upload the *.CSV file directly from local so that it is accessible by the function (to the function directory!?)? And where should I store the *.json?

Chris H.
  • You could always use the FTP approach. If you're keen on that, I can post an answer but won't bother if you're not. – Skin Mar 28 '22 at 10:44

2 Answers


pass it to the static web app on http trigger request

If you want to post it to your web app, you can send your JSON content to it directly. One workaround is to use a Blob trigger and post the data to your web app with HttpWebRequest. Below is the code that worked for me:

using System;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading.Tasks;
using CsvHelper;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace FunctionApp
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task Run([BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {name} \n Size: {myBlob.Length} bytes");

            // Convert the uploaded CSV blob to JSON and post it to the web app
            var json = Convert(myBlob);
            await PostJSON(json);
        }

        // Reads the CSV stream with CsvHelper and serializes the records to a JSON array
        public static string Convert(Stream blob)
        {
            var csv = new CsvReader(new StreamReader(blob), CultureInfo.InvariantCulture);
            csv.Read();
            csv.ReadHeader();
            var csvRecords = csv.GetRecords<object>().ToList();

            return JsonConvert.SerializeObject(csvRecords);
        }

        // Posts the JSON payload to the web app endpoint using HttpWebRequest
        public static async Task PostJSON(string json)
        {
            var request = (HttpWebRequest)WebRequest.Create("<Your Web App URL>");

            var data = Encoding.UTF8.GetBytes(json);

            request.Method = "POST";
            request.ContentType = "application/json";
            request.ContentLength = data.Length;

            using (var stream = await request.GetRequestStreamAsync())
            {
                await stream.WriteAsync(data, 0, data.Length);
            }

            var response = (HttpWebResponse)await request.GetResponseAsync();

            // responseString can be logged or inspected if the web app returns a body
            var responseString = await new StreamReader(response.GetResponseStream()).ReadToEndAsync();
        }
    }
}

RESULTS:

Considering this to be my CSV file: [screenshot]

In my selected API: [screenshot]

How can I upload the *.CSV file directly from local so that it is accessible by the function (to the function directory!?)? And where should I store the *.json?

You can use a Blob trigger function in this case: every time you upload a CSV file to your storage account, the function picks up the data, converts it into a JSON file, and then stores it in another container. Please refer to the code below:

using System;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Text;
using Azure.Storage.Blobs;
using CsvHelper;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace FunctionApp
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {name} \n Size: {myBlob.Length} bytes");

            // Convert the uploaded CSV to JSON and store it as <name>.json in another container
            var json = Convert(myBlob);
            var fileName = name.Replace(".csv", "");
            CreateJSONBlob(json, fileName);
        }

        // Reads the CSV stream with CsvHelper and serializes the records to a JSON array
        public static string Convert(Stream blob)
        {
            var csv = new CsvReader(new StreamReader(blob), CultureInfo.InvariantCulture);
            csv.Read();
            csv.ReadHeader();
            var csvRecords = csv.GetRecords<object>().ToList();

            return JsonConvert.SerializeObject(csvRecords);
        }

        // Writes the JSON into the "jsoncontainer" container of the same storage account
        public static void CreateJSONBlob(string json, string fileName)
        {
            var container = new BlobContainerClient(Environment.GetEnvironmentVariable("AzureWebJobsStorage"), "jsoncontainer");
            container.CreateIfNotExists();

            byte[] writeArr = Encoding.UTF8.GetBytes(json);

            using (MemoryStream stream = new MemoryStream(writeArr))
            {
                // overwrite: true lets a regenerated daily file replace an existing blob with the same name
                container.GetBlobClient($"{fileName}.json").Upload(stream, overwrite: true);
            }
        }
    }
}

RESULTS: [screenshots]
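Since you also want to "pass it to the static web app on http trigger request", here is a rough sketch (not part of the tested code above) of an HTTP-triggered function that reads the stored blob back and returns it. The jsoncontainer name matches the code above; the GetJson function name and route are illustrative.

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

namespace FunctionApp
{
    public static class GetJsonFunction
    {
        // Returns a previously stored <fileName>.json blob from the "jsoncontainer" container
        [FunctionName("GetJson")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", Route = "json/{fileName}")] HttpRequest req,
            string fileName)
        {
            var container = new BlobContainerClient(
                Environment.GetEnvironmentVariable("AzureWebJobsStorage"), "jsoncontainer");
            var blob = container.GetBlobClient($"{fileName}.json");

            var exists = await blob.ExistsAsync();
            if (!exists.Value)
                return new NotFoundResult();

            // DownloadContentAsync returns the blob body as BinaryData
            var download = await blob.DownloadContentAsync();
            return new ContentResult
            {
                Content = download.Value.Content.ToString(),
                ContentType = "application/json",
                StatusCode = 200
            };
        }
    }
}

Your static web app can then fetch the JSON from that endpoint whenever it needs the latest data.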

REFERENCES: Azure Function that converts CSV Files to JSON

SwethaKandikonda
  • Thank you, is there a JavaScript/TypeScript solution for this? I might need to make changes or combine multiple CSV files, but I don't know C#. – Chris H. Apr 04 '22 at 19:20

Example C# code; I imagine there would be something similar in your preferred language (you haven't mentioned which language you're using):

[FunctionName("CsvToJsonFunction")]
public IActionResult CsvToJsonFunction(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "csv/upload")] HttpRequest request)
{
    // csvFile is an IFormFile picked up from the multipart/form-data request body
    var csvFile = request.Form.Files[0];

    // process your CSV file (CsvToJsonConverter is your own conversion helper)
    var json = CsvToJsonConverter(csvFile);

    // another method that uploads the JSON to blob storage can be called here (see below)
    return new OkObjectResult(json);
}

To convert an IFormFile to a byte[]/stream, see https://stackoverflow.com/a/36433458/9276081
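For reference, a minimal sketch of such a helper (the method name is illustrative, not taken from that answer; it assumes Microsoft.AspNetCore.Http for IFormFile and System.IO for MemoryStream):

// Illustrative helper: copy an uploaded IFormFile into a byte[]
public static byte[] ReadAllBytes(IFormFile file)
{
    using (var ms = new MemoryStream())
    {
        file.CopyTo(ms);     // IFormFile exposes CopyTo(Stream)
        return ms.ToArray(); // or use file.OpenReadStream() if you only need a Stream
    }
}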

How to send files to your endpoint is a different matter. For simplicity's sake, you can use Postman to upload files; see this tutorial (starts at 20s): https://youtu.be/S7bwkys6D0E?t=20

Or you can create a small WPF app to do the same.
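If you prefer code over Postman, a small console client along these lines could post the daily CSV to the function; the endpoint URL, file path, and form field name below are placeholders you would replace with your own:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class CsvUploader
{
    static async Task Main()
    {
        // placeholder values - replace with your function URL and local CSV path
        var endpoint = "https://<your-function-app>.azurewebsites.net/api/csv/upload";
        var csvPath = @"C:\data\daily.csv";

        using (var client = new HttpClient())
        using (var form = new MultipartFormDataContent())
        using (var fileStream = File.OpenRead(csvPath))
        {
            // "file" is just the form field name; request.Form.Files[0] picks up the first file either way
            form.Add(new StreamContent(fileStream), "file", Path.GetFileName(csvPath));

            var response = await client.PostAsync(endpoint, form);
            Console.WriteLine($"Upload returned {(int)response.StatusCode} {response.StatusCode}");
        }
    }
}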

Now, if you want to store the JSON somewhere, you can use blobs. Uploading to blobs via an HTTP trigger would be overkill unless you're building a fully fledged application, in which case you can use the Blob-related NuGet packages and follow some tutorials (the following will also work for an Azure Function) -

https://learn.microsoft.com/en-us/answers/questions/130134/upload-imagesfiles-to-blob-azure-via-web-api-aspne.html

https://tutexchange.com/uploading-download-and-delete-files-in-azure-blob-storage-using-asp-net-core-3-1/

If you want the simplest solution without any coding -

(1) Use an online CSV to JSON converter

(2) Use Azure Storage Explorer to upload the converted file (https://azure.microsoft.com/en-in/features/storage-explorer/#overview)

It's extremely easy to use.

You can go one step further and design your web app to read the CSV file directly (possible in nearly all languages: JS, PHP, etc.); then you don't need to convert at all, and you can just use Storage Explorer to upload the CSV to the correct blob container.

Bandook