
There seems to be a parameter limit on a string when calling a WCF service.

There are multiple methods in the WCF service, and most large payloads are passed into methods with byte[] parameters. But I do have one method that can receive a very large string. For some reason it seems to fail at around 26-28 MB. I get the following error, but I know it's misleading:

There was no endpoint listening at http://example.com/service/DataLayer.svc that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.

The inner exception only returns the following:

The remote server returned an error: (404) Not Found.

The settings I have for both the service and the client are the same and are maxed out:

Client

var binding = new BasicHttpBinding();
binding.CloseTimeout = new TimeSpan(0, 1, 30);
binding.OpenTimeout = new TimeSpan(0, 1, 30);
binding.ReceiveTimeout = new TimeSpan(0, 1, 30);
binding.SendTimeout = new TimeSpan(0, 1, 30);
binding.MaxReceivedMessageSize = 100000000;
binding.ReaderQuotas.MaxDepth = 2147483647;
binding.ReaderQuotas.MaxStringContentLength = 2147483647;
binding.ReaderQuotas.MaxArrayLength = 2147483647;
binding.ReaderQuotas.MaxBytesPerRead = 2147483647;
binding.ReaderQuotas.MaxNameTableCharCount = 2147483647;

Server

<basicHttpBinding>
  <binding maxBufferPoolSize="2147483647" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
    <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647" maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
  </binding>
</basicHttpBinding>
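
One limit the binding settings above do not cover is the ASP.NET/IIS request size limit, which sits in front of WCF when the service is hosted in IIS. The defaults there (`maxRequestLength` of 4096 KB for ASP.NET, and `maxAllowedContentLength` of 30,000,000 bytes, roughly 28.6 MB, for IIS 7+) are suspiciously close to the 26-28 MB cutoff described above, and IIS rejects oversized requests before they reach the service, which can surface as a 404. A hedged sketch of the relevant `web.config` entries (the values shown are illustrative maximums, not a recommendation):

```xml
<!-- Hypothetical web.config fragment: raises the ASP.NET and IIS request
     size limits that are checked before the WCF binding quotas.
     maxRequestLength is in KB; maxAllowedContentLength is in bytes. -->
<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483647" />
    </requestFiltering>
  </security>
</system.webServer>
```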

I can call a method with a byte[] parameter whose size (saved to disk) is over 300 MB and it works. Also, fun fact: there is a method to which I pass an integer and it returns the very string that I am trying to send to the WCF service.

So I went on the server, manually created a 40 MB file, then called that method with the proper int so it returned the proper file string; the file is 40 MB and it works like a charm.

This service has been running on the same server since 2014 without any issue. We recently started to send bigger strings to that specific method and it started to throw errors.

I have copied the code that is in the WCF service into the exe application and it works. Since the code on both ends works, my bet is on the transfer from the exe to the WCF service. Also, the server still has 680 GB of free space, so I am green there too.

Here, as asked, is the code I have from the web service:

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IDataLayer
{
    [OperationContract]
    void SaveText(string text, int id);
}

public class DataLayer : IDataLayer
{
    public void SaveText(string text, int id)
    {
        File.WriteAllText(@"C:\Data\" + id.ToString() + ".txt", text);
    }
}

Edit: I tried bringing everything local. I ran the WCF service locally in Visual Studio and attached my client to it.

I put a breakpoint in the SaveText method on the service side and called the method from the client with a 40 MB string. The error still pops up in the client and execution never reaches the breakpoint in the service. So the problem is somewhere between the method call and the service. I tried with a 5 MB string and the breakpoint hits.

I tried with a 159 MB byte[] and called the second method in the service, SaveByte(byte[] data, int id), which only saves to a database, and the breakpoint hits. So the problem is really that a string parameter cannot exceed a certain size.

Is there a way to step line by line through Microsoft's underlying code and follow each method when calling a WCF service method, up until the request is actually sent to the server? That way I could find the real error.
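
Short of stepping through the framework source, WCF's built-in diagnostics can capture the underlying fault before it is translated into the generic endpoint/404 error. A hedged sketch of the client-side tracing configuration (the log path is illustrative); the resulting `.svclog` file can be opened with SvcTraceViewer.exe:

```xml
<!-- Illustrative tracing config for the client's app.config;
     adjust initializeData to a writable path. -->
<system.diagnostics>
  <sources>
    <source name="System.ServiceModel"
            switchValue="Warning, ActivityTracing"
            propagateActivity="true">
      <listeners>
        <add name="traceListener"
             type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="C:\logs\wcf-trace.svclog" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>
```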

Edit: I have multiple cases where I need huge strings to be passed as parameters. The marked duplicate is wrong, as it covers files only; I am looking for a solution for strings, not files. Files already work with byte[] parameters.

Franck
  • It would be awesome if you could provide a [mcve]. – mjwills Feb 13 '18 at 20:18
  • @mjwills Yes, very easy, it's 1 line of code; I'm adding it. – Franck Feb 13 '18 at 21:19
  • @Franck You've simply been lucky until now. Using `byte[]` is a *bad* idea that worked only because you haven't sent too big a buffer (not file) until now. SOAP is just an HTTP POST. By sending *buffers*, not files, you were essentially making a huge HTTP POST request to the server that had to be swallowed whole. The proper way is to use *streaming* and write to the server over that stream. This means you should use `Stream` instead of a buffer. – Panagiotis Kanavos Feb 14 '18 at 12:33
  • @PanagiotisKanavos The largest I have sent as byte[] was around 1.2 GB and I had no issues. So you are saying it's an HTTP POST limit when it uses `string` parameters? – Franck Feb 14 '18 at 12:37
  • @Franck Did you check the memory and CPU utilisation when you sent that 1.2GB buffer? How much time did it take before the server was able to process it? What did CPU utilization look like when the GC tried to clean up that buffer? When you upload a file using HTTP, the data is sent as a *stream* and the server can write that stream of bytes directly to the disk *immediately* using only a few KBs of RAM. You can do the same with WCF by using stream mode. – Panagiotis Kanavos Feb 14 '18 at 12:46
  • @Franck an even better idea would be to *avoid* using SOAP services to upload files. You don't gain anything by the SOAP envelope in this case. – Panagiotis Kanavos Feb 14 '18 at 12:48
  • @PanagiotisKanavos 1.2 GB takes 50 to 55 seconds to process. RAM usage is a bit less than 4 GB, and one CPU reaches ~40%. – Franck Feb 14 '18 at 12:48
  • @PanagiotisKanavos Also, it's not only files that fail. I have an internal web service that takes a single SQL query string, and the service manages which server it needs to query. A single record can be as big as 50 MB. – Franck Feb 14 '18 at 12:52
  • @Franck it could be 1 second, with usage in the KBs and CPU around 0% - writing a stream to a file is IO bound after all. You could use fewer servers to handle the same amount of traffic. Or you could simply set up – Panagiotis Kanavos Feb 14 '18 at 12:52
  • @Franck there's absolutely no reason for a *service* to accept SQL queries. A *service* is not a *data layer*. It may *use* a data layer. A service performs a distinct business function. If you wanted to bulk load data, you should use SqlBulkCopy – Panagiotis Kanavos Feb 14 '18 at 12:54
  • @PanagiotisKanavos For the files I can probably just go with `byte[]`, as it always works up to 2 GB. But if you have an idea on how to reach, from anywhere in the world, an SQL server that is not accessible from the internet, other than calling an internet-accessible web service on a server that has a local network connection to the SQL server, then let me know. – Franck Feb 14 '18 at 12:59

0 Answers