
In Haskell, I'm processing some data via conduits. During that processing, I want to conditionally store that data in S3. Are there any S3 libraries that will allow me to do this? Effectively, what I want to do is "tee" the pipeline created by the conduit and put the data it contains on S3 while continuing to process it.

I've found the aws library (https://hackage.haskell.org/package/aws), but its functions, such as multipartUpload, take a Source as an argument. Given that I'm already inside the conduit, this doesn't seem like something I can use.
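
To make the shape concrete, here is a rough sketch of the kind of tee I have in mind, using conduit's passthroughSink with a made-up side sink standing in for the S3 upload (the file name and the sink body are placeholders):

```haskell
import           Conduit
import qualified Data.ByteString as BS
import           Data.Void (Void)

-- Placeholder side sink standing in for the eventual S3 upload.
uploadToS3 :: MonadIO m => ConduitT BS.ByteString Void m ()
uploadToS3 = mapM_C $ \chunk ->
  liftIO (putStrLn ("would upload " ++ show (BS.length chunk) ++ " bytes"))

main :: IO ()
main = runConduitRes $
     sourceFile "input.dat"
  .| passthroughSink uploadToS3 (\_ -> return ())  -- tee: chunks also continue downstream
  .| mapM_C (liftIO . BS.putStr)                   -- the "real" processing happens here
```

ZipSink from Data.Conduit would be another way to express the same fan-out.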

jyurek

2 Answers


There is now a package—amazonka-s3-streaming—that exposes a multi-part upload to S3 as a conduit Sink.
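
A rough sketch of wiring it into a pipeline follows; the bucket, key, and file name are placeholders, and streamUpload's exact signature has shifted between versions of amazonka-s3-streaming and amazonka, so treat this as an outline rather than copy-paste code:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import           Conduit
import           Control.Monad.Trans.Resource (runResourceT)
import           Network.AWS (Credentials (Discover), newEnv, runAWS)
import           Network.AWS.S3 (createMultipartUpload)
import           Network.AWS.S3.StreamingUpload (streamUpload)

main :: IO ()
main = do
  -- Pick up credentials from the usual places (env vars, instance profile, ...).
  env <- newEnv Discover
  _response <-
    runResourceT . runAWS env . runConduit $
         sourceFile "input.dat"
      -- streamUpload is the Sink: it buffers chunks and performs the
      -- multipart upload as data flows through.  Newer releases also take
      -- an optional chunk size as a first argument.
      .| streamUpload (createMultipartUpload "my-bucket" "my-object-key")
  return ()
```

Combined with something like passthroughSink (or ZipSink), that Sink can hang off the side of an existing pipeline, so the data is uploaded while the rest of the processing continues.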

Jason Whittle

This is not really an answer, just a hint: amazonka seems to expose http-client's RequestBody for its requests, so in theory it should be possible to pipe data into it from conduits. However, it seems you have to know the digest of the data beforehand.

The related question Can I stream a file upload to S3 without a content-length header? suggests the same.
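
To illustrate the RequestBody side, http-client's RequestBodyStreamChunked lets the body be produced incrementally; the sketch below fakes the chunks with a list where a conduit would supply them in practice. As said above, the remaining problem is signing, which wants the payload digest up front:

```haskell
import           Data.ByteString (ByteString)
import qualified Data.ByteString as BS
import           Data.IORef (newIORef, readIORef, writeIORef)
import           Network.HTTP.Client (RequestBody (..))

-- Stream a fixed list of chunks as a chunked request body; in practice a
-- conduit would supply the chunks.  Returning an empty ByteString from
-- the popper tells http-client the body is finished.
chunkedBody :: [ByteString] -> RequestBody
chunkedBody chunks = RequestBodyStreamChunked $ \needsPopper -> do
  ref <- newIORef chunks
  let popper = do
        remaining <- readIORef ref
        case remaining of
          []       -> return BS.empty
          (c : cs) -> writeIORef ref cs >> return c
  needsPopper popper
```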

phadej
  • Thanks for the link to the other question. As for amazonka, I believe I saw it a while ago, but it relies on lens, which (IIRC) is a really large dependency (as in, it added many minutes to compilation), so I disregarded it. – jyurek Aug 19 '15 at 12:25