I currently have a pipeline set up as follows:
BufferBlock ==> BatchBlock ==> TransformBlock ==> TransformManyBlock ==> BufferBlock
The `BoundedCapacity` is set to 2,000. However, if I provide a single item to the first BufferBlock and then call the `Complete` method on it, the pipeline never completes. I tried awaiting the `Completion` task of the incoming BufferBlock, as well as the `Completion` task of the outgoing (last) BufferBlock, but neither task ever completes.
My understanding was that calling the `Complete` method on the first block in the pipeline should indicate that no more data is going to be offered, and that any remaining items should be processed through the pipeline. When I add a breakpoint on the `await incomingBuffer.Completion` line and check the properties of `incomingBuffer` in Visual Studio, I see the following:

[screenshot of the incomingBuffer properties in the Visual Studio debugger]
As seen, it is indeed linked to a target and is no longer accepting new items, but it is not completed and there is still one item in its `Queue`. How can I force the BufferBlock to send the remaining items to the BatchBlock when no more items are being offered? I should point out that when I link the various blocks (with the `LinkTo` method) I always pass in a new `DataflowLinkOptions` instance with `PropagateCompletion = true`.
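To make the shape of the setup concrete, here is a simplified sketch of how I construct and link the blocks (the `int` element types and the pass-through delegates are placeholders, not my real transforms):

```csharp
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow; // System.Threading.Tasks.Dataflow NuGet package

class Program
{
    static async Task Main()
    {
        var bufferOptions    = new DataflowBlockOptions          { BoundedCapacity = 2000 };
        var batchOptions     = new GroupingDataflowBlockOptions  { BoundedCapacity = 2000 };
        var transformOptions = new ExecutionDataflowBlockOptions { BoundedCapacity = 2000 };
        var linkOptions      = new DataflowLinkOptions           { PropagateCompletion = true };

        var incomingBuffer     = new BufferBlock<int>(bufferOptions);
        var batchBlock         = new BatchBlock<int>(1000, batchOptions);
        var transformBlock     = new TransformBlock<int[], int[]>(batch => batch, transformOptions);
        var transformManyBlock = new TransformManyBlock<int[], int>(batch => batch, transformOptions);
        var outgoingBuffer     = new BufferBlock<int>(bufferOptions);

        incomingBuffer.LinkTo(batchBlock, linkOptions);
        batchBlock.LinkTo(transformBlock, linkOptions);
        transformBlock.LinkTo(transformManyBlock, linkOptions);
        transformManyBlock.LinkTo(outgoingBuffer, linkOptions);

        incomingBuffer.Post(1);    // provide a single item
        incomingBuffer.Complete(); // signal that no more data will be offered

        await incomingBuffer.Completion; // in my actual pipeline, this never returns
        await outgoingBuffer.Completion; // neither does this
    }
}
```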
Also, if I set `BoundedCapacity` to `1` (for every block), then everything works fine. However, in production this pipeline will be processing over 1.5M records, in batches of 1,000, so I don't think a `BoundedCapacity` of `1` would work (unless I have completely misunderstood its functionality). I also tried not setting the `BoundedCapacity` property on the `DataflowBlockOptions` class at all (so it would just use its default value, `DataflowBlockOptions.Unbounded`), and this did not work either. I have to explicitly set `BoundedCapacity` to `1` for the pipeline to complete when there is an item in the first BufferBlock; otherwise it just sits there and nothing ever completes.
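In code, the two variants I tried look like this (again simplified):

```csharp
// Variant A: works, but limits every block to one in-flight item.
var boundedToOne = new DataflowBlockOptions { BoundedCapacity = 1 };

// Variant B: leave BoundedCapacity alone, i.e. its default of
// DataflowBlockOptions.Unbounded; the pipeline still never completes.
var defaultOptions = new DataflowBlockOptions();
```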
Any help or insight would be greatly appreciated.