3

I am looking for suggestions on how to tackle this and whether I am using the right tool for the job. I work primarily on BizTalk, and we are currently using BizTalk 2013 R2 with SQL 2014.

Problem:

We would be receiving positional flat files every day (around 50) from various partners, and the theoretical total number of records received would be over a million. Each record has some identifying information that needs to be sent to a web service, which responds with essentially a YES or NO, based on which the incoming file is split into two files.

Originally, the scope for daily expected records was 10k which later ballooned to 100k and now is at a million records.

Attempt 1: Scatter-Gather pattern

I am debatching the records in a custom pipeline using the file disassembler and adding a couple of port-configurable properties for the scatter part (following Richard Seroter's suggestion of implementing a round-robin assignment), which let me control the number of scatter/worker orchestrations I spin up to call the web service and mark the records as going to 'Agency A' or 'Agency B'. Finally, I push a control message that spins up the gather/aggregator orchestration, which collects all the processed messages from the workers via correlation on the MessageBox and creates the two files to be routed to Agency A and Agency B.
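For illustration, the round-robin scatter assignment can be sketched like this (a simplified Python stand-in; the real implementation is a BizTalk pipeline component writing a port-configurable context property, and all names here are hypothetical):

```python
from itertools import cycle

def scatter_round_robin(records, worker_count):
    # Assign each debatched record a worker ID in round-robin order;
    # a stand-in for the context property the custom pipeline would set.
    workers = cycle(range(worker_count))
    return [(next(workers), record) for record in records]

# Five records spread across two workers: worker 0 gets rec1/rec3/rec5,
# worker 1 gets rec2/rec4.
assignments = scatter_round_robin(["rec1", "rec2", "rec3", "rec4", "rec5"], 2)
```

Each worker orchestration then subscribes only to messages carrying its own ID, which is what bounds the per-worker queue depth.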

So, every file that gets dropped has its own set of workers and an aggregator that process the file.

This works well for files with fewer records, but if a file has over 100k records, I see throttling happen, and the file takes a long time to process and generate the two output files.

I have put the receive location/worker and aggregator/send port on separate hosts. The gatherer appears to stay dehydrated and not actually aggregate the records processed by the workers until all of them are done, and I think the throttling happens because the ratio of messages published to messages processed is very large.

Attempt 2: SQL-based aggregation

Assuming the aggregator orchestration is the bottleneck, instead of accumulating messages in an orchestration, I pushed the processed records to a SQL database and 'split' the records into two XML files (essentially concatenating the messages going to Agency A/B and wrapping them in an XML declaration, using the correct message type based on writing some of the context properties to the SQL table along with each record). These aggregated XML records are polled and routed to the right agencies.
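Conceptually, the SQL-side 'split' amounts to grouping the processed records by agency and wrapping each group in an XML envelope. A minimal sketch, with hypothetical names and a deliberately simplified envelope (the real message type and namespace would come from the stored context properties):

```python
def split_by_agency(rows):
    # rows: (agency, record_xml) pairs, as polled back from the SQL table.
    batches = {}
    for agency, record_xml in rows:
        batches.setdefault(agency, []).append(record_xml)
    # Concatenate each agency's records and wrap them in a minimal envelope.
    return {
        agency: '<?xml version="1.0"?><Records>' + "".join(parts) + "</Records>"
        for agency, parts in batches.items()
    }

files = split_by_agency([("A", "<r>1</r>"), ("B", "<r>2</r>"), ("A", "<r>3</r>")])
```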

This works with 100k records and completes in an acceptable amount of time. Now that the goalposts/requirements have changed again with regard to expected volume, I am trying to determine whether BizTalk is even a feasible choice anymore.

I have indicated that BizTalk is not the right tool for such a task, but the client is suggesting we add more servers to make it work. I am also looking at SSIS.

Meanwhile, some observations from testing:

  1. Increasing the number of workers improved processing (unsurprisingly): if each worker processes fewer records in its queue/subscription, it finishes its queue quickly. When testing the 100k record file, 100 workers completed it in under 3 hours, with minimal activity on the server from other applications. I am trying to get the web-service hosting team to give me a theoretical maximum number of concurrent connections they can handle. I am leaning towards asking whether they can handle 1,000 calls; based on my observations, the existing solution might then scale.
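The bounded-worker observation can be illustrated outside BizTalk as well. A rough Python sketch, where `lookup` is a stand-in for the partner web service and `max_workers` is the concurrency cap the hosting team would agree to:

```python
from concurrent.futures import ThreadPoolExecutor

def lookup(record):
    # Stand-in for the partner web-service YES/NO call.
    return "YES" if record % 2 == 0 else "NO"

def process(records, max_workers):
    # Cap concurrent in-flight calls at whatever limit the service
    # hosting team confirms (e.g. 100 or 1,000 workers).
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lookup, records))

results = process(range(6), max_workers=3)
```

Throughput scales with the worker count only until the service's connection limit (or BizTalk throttling) becomes the binding constraint.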

I have adjusted a few host settings related to message count and physical memory threshold so it won't balk at the volume, but I am still unsure. I haven't had to touch these settings before and could use advice on which performance counters to monitor.

The post is a bit long, but I hope it gives an idea of what I have done so far. Any help/insight in tackling this problem is appreciated. If you are suggesting alternatives, I am restricted to .NET or Microsoft-based tools/frameworks, but I would love to hear about other options as well.

I will try to answer or give more detail if you want me to clarify anything I didn't make clear.

Seige
  • The Seroter throttling pattern mentioned is also one that we've used. See https://www.connected-pawns.com/2016/05/21/blast-from-the-past-biztalk-orchestration-throttling-pattern/ It has worked, but yes, one of the limitations is the number of simultaneous connections. The other thing to look at is whether the web service you are calling can be changed to accept more than one record at a time, as that is quite a limiting factor. Also, have you looked at the max connections setting, which by default is 2? https://complicatedwords.wordpress.com/2009/02/16/biztalk-maxconnections/ – Dijkgraaf Jun 28 '17 at 07:21
  • Thanks, I've heard about this, but I don't think I've hit this restriction. My dev machine where I am testing this is on Windows 2012, and when I spin up 10 workers as an example, I see 10 connections being made to the web service in Fiddler, not restricted to 2. I haven't seen timeouts happen. I'm also asking the web service team to bump up their concurrent connection limits, so I will be testing that out to see if performance increases. To be clear, have you faced or are you handling such a scenario at your workplace? – Seige Jun 28 '17 at 13:53
  • Yes, we have faced having to debatch a large file and then process it through a web service with a limited number of connections. It does take a while to process, but we don't get any throttling and timeouts anymore, which we did previously due to too many orchestrations spinning up and trying to connect to the web service at the same time. Yours looks a bit more complicated in that you have to take all the responses back into one response, which we didn't have to do. – Dijkgraaf Jun 28 '17 at 18:45
  • Did you identify which throttling counter got triggered in BizTalk? That is the bottleneck you need to address. Did you see a lot of messages queued up for the send port? Then possibly your max connections setting is too low and you need to change it. – Dijkgraaf Jun 28 '17 at 21:37
  • Does your pipeline publish the debatched messages directly to the MessageBox? If so, when you process a 10k record file, you will see 10k messages published to the MessageBox at almost the same time. This might be the cause of your throttling. – Zee Jun 29 '17 at 02:28

3 Answers

1

First, 1 million records/messages is not the issue, but you can make it a problem by handling it poorly.

Here's the pattern I would lay out first.

  1. Load the records into SQL Server with SSIS. This will be very fast.
  2. Process/drain the records into your BizTalk app for... well, whatever needs to be done: calling the service, etc.
  3. Update the SQL record with the result.
  4. When that process is complete, query out the Yes and No batches as one (large) message each, transform, and send.
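The steps above can be sketched end-to-end against an in-memory SQLite database standing in for the SQL Server staging table (the service call and all names are stand-ins, not the actual implementation):

```python
import sqlite3

# In-memory stand-in for the staging table SSIS would bulk-load (step 1).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT, result TEXT)")
conn.executemany("INSERT INTO records (payload) VALUES (?)",
                 [(f"rec{i}",) for i in range(6)])

def fake_service(payload):
    # Stand-in for the per-record YES/NO web-service call.
    return "YES" if int(payload[3:]) % 2 == 0 else "NO"

# Steps 2-3: drain unprocessed rows, call the service, write results back.
rows = conn.execute("SELECT id, payload FROM records WHERE result IS NULL").fetchall()
for rec_id, payload in rows:
    conn.execute("UPDATE records SET result = ? WHERE id = ?",
                 (fake_service(payload), rec_id))

# Step 4: query each batch out as one set, ready to transform and send.
yes_batch = [p for (p,) in conn.execute(
    "SELECT payload FROM records WHERE result = 'YES' ORDER BY id")]
no_batch = [p for (p,) in conn.execute(
    "SELECT payload FROM records WHERE result = 'NO' ORDER BY id")]
```

The point of the pattern is that the million-row load and the final split are set-based SQL operations; only the middle step touches the service.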

My guess is the Web Service will be the bottleneck unless it's specifically designed for such a load. You will probably have to tune BizTalk to throttle only when necessary but don't worry about that just yet. A good app pattern is more important.

Johns-305
  • My solution was a modification of your design, but we've removed BizTalk from the picture, since SSIS appeared to be a much better fit for the task and the client agreed based on the performance numbers. – Seige Apr 25 '18 at 02:41
0

In such scenarios, you should consider the following approach:

  • De-batch the file and store the individual records in MSMQ. You can achieve this without any extra coding effort; all you need is a send port using the MSMQ adapter or WCF-Custom with the netMsmqBinding. If required, you can also create separate queues depending on different criteria you may have in your messages.
  • Receive the messages from MSMQ using a receive location on a separate host.
  • Send them to the web service on a different BizTalk host.
  • Try to keep this a messaging-only scenario; you can handle the service response in a pipeline component if required, and you can apply a map on the send port itself. In the worst case, if you need an orchestration, it should only handle one message at a time without any complex pattern.
  • You can push the messages back to two MSMQ queues for the two different agencies based on the web service response.
  • You can then receive those messages again and write them to file; simply use a send port with the FileAppend option, or use a custom pipeline component to write the received messages to file without aggregating them in an orchestration. Gathering them in an orchestration is fine only if you have no more than a few thousand messages per file.
  • With this approach you won't have any bottleneck within BizTalk, and you don't need complex orchestration patterns, which usually end up having many persistence points.
  • If the web service becomes a bottleneck, you can control the rate of messages received from MSMQ by 1) enabling Ordered Delivery on the MSMQ receive location and, if required, 2) using BizTalk host throttling: change Message Count in DB to a very low number (e.g. 1,000 from the 50k default) and increase the Spool and Tracking Data Multiplier accordingly (e.g. 500 from the default 10), making sure the product of the two numbers is high enough not to cause throttling due to the messages within BizTalk. You can also reduce the number of worker threads on the BizTalk host to slow it down a little.
  • Please note MSMQ is part of the Windows OS and does not require any additional setup; it is usually installed by default, and if not, you can add it via Add/Remove Features. You could also use IBM MQ if your organization has the infrastructure, but for one million messages, MSMQ will be just fine.
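The queue-based decoupling described above can be illustrated with a bounded in-process queue standing in for MSMQ: once the backlog reaches the cap, producers block, which is the rate-control effect being described (all names hypothetical):

```python
import queue
import threading

# Bounded queue as a rough stand-in for MSMQ: put() blocks once the
# backlog hits maxsize, naturally throttling the producer side.
msmq = queue.Queue(maxsize=100)
agency_a, agency_b = [], []

def worker():
    while True:
        record = msmq.get()
        if record is None:  # sentinel: shut this worker down
            break
        # Stand-in for the web-service YES/NO lookup and the
        # routing of the response to one of the two agency queues.
        (agency_a if record % 2 == 0 else agency_b).append(record)
        msmq.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for i in range(10):           # receive side: de-batched records
    msmq.put(i)
for _ in threads:             # one sentinel per worker
    msmq.put(None)
for t in threads:
    t.join()
```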
Vikas Bhardwaj
0

Apologies for the late update. We've decided to use SSIS to bulk-import the file into a table. Since the lookup web service is part of the same organization and network (although built on a different stack), the team agreed to let us query the lookup table their web service is based on, and we use a 'merge' between those tables to mark each record 'Y' or 'N' and export the split files via SSIS as well.
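The set-based classification can be sketched with SQLite standing in for the SQL Server tables (table and column names are hypothetical; the actual solution uses an SSIS merge against the service's real lookup table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staged (id INTEGER PRIMARY KEY, flag TEXT);
    CREATE TABLE lookup (id INTEGER PRIMARY KEY);
    INSERT INTO staged (id) VALUES (1), (2), (3), (4);
    INSERT INTO lookup (id) VALUES (2), (4);
""")
# One set-based pass classifies every staged record against the
# lookup table -- no per-record web-service call needed.
conn.execute("""
    UPDATE staged SET flag = CASE
        WHEN EXISTS (SELECT 1 FROM lookup WHERE lookup.id = staged.id)
        THEN 'Y' ELSE 'N' END
""")
yes_ids = [i for (i,) in conn.execute(
    "SELECT id FROM staged WHERE flag = 'Y' ORDER BY id")]
no_ids = [i for (i,) in conn.execute(
    "SELECT id FROM staged WHERE flag = 'N' ORDER BY id")]
```

Replacing a million individual service calls with one join/merge is what brings the runtime down to minutes.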

In short, we've skipped BizTalk entirely. A 1.5 million record file is now processed and the split files sent within a couple of minutes.

Appreciate all the advice provided here.

Seige