SQL Server offers a BULK INSERT statement: it reads rows from a file, e.g. a CSV file, and inserts them into a table.
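For reference, a minimal example of the kind of statement I mean (the table name and file path are just placeholders):

```sql
-- Load a CSV file straight into a table; names/paths are placeholders.
BULK INSERT dbo.Events
FROM 'C:\data\events.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2  -- skip the CSV header row
);
```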
As I understand it, this has two clear drawbacks when working with Kafka:
- you would have to take the Kafka message and transform it into CSV format;
- after the transformation in the previous step, you would have to write the result to disk so that BULK INSERT can access the file (see the sketch after this list).
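To make the pipeline concrete, here is a rough sketch of what I believe it would have to look like. The topic name, message format (JSON), connection string, and table are all assumptions on my part, and I am using `confluent_kafka` and `pyodbc` only as stand-ins for whatever client libraries one would actually pick:

```python
# Sketch of the consume -> CSV -> disk -> BULK INSERT pipeline.
# Assumes this script runs on the SQL Server machine itself, since
# BULK INSERT resolves the file path on the server side.
import csv
import json
import tempfile

import pyodbc
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "bulk-insert-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])  # hypothetical topic

# Step 1: consume a batch of messages and transform them to CSV rows.
rows = []
for _ in range(1000):
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    record = json.loads(msg.value())  # assuming JSON-encoded messages
    rows.append([record["id"], record["payload"]])
consumer.close()

# Step 2: write the transformed rows to a temporary CSV file on disk,
# because BULK INSERT can only read from a file.
with tempfile.NamedTemporaryFile(
        mode="w", newline="", suffix=".csv", delete=False) as f:
    csv.writer(f).writerows(rows)
    csv_path = f.name

# Step 3: point BULK INSERT at the temporary file.
conn = pyodbc.connect("DSN=mssql;UID=user;PWD=secret", autocommit=True)
conn.execute(
    f"BULK INSERT dbo.Events FROM '{csv_path}' "
    "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')"
)
```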
My question is how to overcome these drawbacks; something about this whole process feels wrong. The second drawback, writing to disk, worries me the most. Could I write the file to memory instead and then run BULK INSERT against it?