
I have a Delphi 10.1 Berlin DataSnap server that can't return data packets (through a TStream) larger than around 260,000 bytes.

I have programmed it following the \Object Pascal\DataSnap\FireDAC sample that ships with Delphi, which also shows this problem.

The problem can be reproduced just by opening that sample, clearing the IndexFieldName property of the qOrders component in ServerMethodsUnit.pas, and changing its SQL property to:

select * from Orders
union 
select * from Orders

Now the amount of data to be sent is beyond 260,000 bytes, which seems to be the point beyond which the client can no longer retrieve it; it fails with EFDException [FireDAC][Stan]-710. Invalid binary storage format.

The data is sent as a Stream that you get from an FDSchemaAdapter on the server and load into another FDSchemaAdapter on the client. The connection between Client and Server is also FireDAC.

This is how the Server returns that Stream:

function TServerMethods.StreamGet: TStream;
begin
  Result := TMemoryStream.Create;
  try
    qCustomers.Close;
    qCustomers.Open;
    qOrders.Close;
    qOrders.Open;
    FDSchemaAdapter.SaveToStream(Result, TFDStorageFormat.sfBinary);
    Result.Position := 0;
  except
    Result.Free;
    raise;
  end;
end;

And this is how the Client retrieves it:

procedure TClientForm.GetTables;
var
  LStringStream: TStringStream;
begin
  FDStoredProcGet.ExecProc;
  LStringStream := TStringStream.Create(FDStoredProcGet.Params[0].AsBlob);
  try
    LStringStream.Position := 0;
    DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStringStream, TFDStorageFormat.sfBinary);
  finally
    LStringStream.Free;
  end;
end;

The Client doesn't get all the data in the blob parameter. I save the content of the Stream on the Server and the content that arrives in the blob parameter on the Client; they have the same size, but the blob parameter's content is truncated, and the last few kilobytes are zeroes.

This is how I save, on the Server, the content that goes into the Stream:

FDSchemaAdapter.SaveToFile('C:\Temp\JSON_Server.json', TFDStorageFormat.sfJSON);

This is how I check what I get on the Client blob parameter:

TFile.WriteAllText('C:\Temp\JSON_Client.json', FDStoredProcGet.Params[0].asBlob);

I can see that the Client gets the data truncated.

Do you know how to fix it, or a workaround to retrieve all the Stream content from the DataSnap Server on my Client?

Update: I have updated to Delphi 10.1 Berlin Update 2, but the problem remains.

Thank you.

Marc Guillot

6 Answers


I get a similar problem with Seattle (I don't have Berlin installed) with a DataSnap server that doesn't involve FireDAC.

On my DataSnap server I have:

type
  TServerMethods1 = class(TDSServerModule)
  public
    function GetStream(Size: Integer): TStream;
    function GetString(Size: Integer): String;
  end;

[...]

uses System.StrUtils;

function BuildString(Size : Integer) : String;
var
  S : String;
  Count,
  LeftToWrite : Integer;
const
  scBlock = '%8d bytes'#13#10;
begin
  Result := '';
  LeftToWrite := Size;
  Count := 1;
  while Count <= Size do begin
    S := Format(scBlock, [Count]);
    if LeftToWrite < Length(S) then
      S := Copy(S, 1, LeftToWrite);
    Result := Result + S;
    Inc(Count, Length(S));
    Dec(LeftToWrite, Length(S));
  end;
  if Length(Result) > 0 then
    Result[Length(Result)] := '.'
end;

function TServerMethods1.GetStream(Size : Integer): TStream;
var
  SS : TStringStream;
begin
  SS := TStringStream.Create;
  SS.WriteString(BuildString(Size));
  SS.Position := 0;
  Result := SS;
end;

function TServerMethods1.GetString(Size : Integer): String;
begin
  Result := BuildString(Size);
end;

As you can see, both these functions build a string of the specified size using the same BuildString function and return it as a stream and a string respectively.

On two Win10 systems here, GetStream works fine for sizes up to 30716 bytes but above that, it returns an empty stream and a "size" of -1.

On the other hand, GetString works fine for all sizes I have tested, up to and including a size of 32000000. I have not yet managed to trace why GetStream fails. However, based on the observation that GetString does work, I tested the following work-around, which sends a stream as a string, and that works fine up to 32M as well:

function TServerMethods1.GetStreamAsString(Size: Integer): String;
var
  S : TStream;
  SS : TStringStream;
begin
  S := GetStream(Size);
  try
    S.Position := 0;
    SS := TStringStream.Create;
    try
      SS.CopyFrom(S, S.Size);
      Result := SS.DataString;
    finally
      SS.Free;
    end;
  finally
    S.Free;
  end;
end;

I appreciate you might prefer your own work-around of sending the result in chunks.

By the way, I have tried calling my GetStream on the server by creating an instance of TServerMethods in a method of the server's main form and calling GetStream directly from that, so that the server's TDSTCPServerTransport isn't involved. This correctly returns the stream, so the problem seems to be in the transport layer or the input and/or output interfaces to it.

MartynA
  • Thanks Martyn, I could very easily change my code to pass the data as a string instead of a Stream. I didn't try it because I had read that DataSnap has a limit of 32Kb for strings, but that must apply to older versions of Delphi. I will give it a try. Thanks again. – Marc Guillot Jan 26 '17 at 17:58
  • I recall tripping over a hard-coded buffer size in DB.pas around the D7 era, but IIRC that was to do with fields rather than DBX parameters, which is what server method return types are based on, AFAIK. Meanwhile, I've made a bit of progress with a server method that returns an (Ole)Variant, but am having trouble with a silent memory overwrite somewhere. – MartynA Jan 26 '17 at 18:28
  • I haven't had much luck returning a string instead of a Stream. It looks like the FireDAC components that connect to the Server truncate it, no matter the type of the parameter. I will try splitting the data into 255Kb packets and sending them separately. – Marc Guillot Feb 21 '17 at 08:31
  • @MarcGuillot: Like I said, I don't think it is FireDAC causing the problem. I'm embarrassed to say that despite spending several hours on it, so far I've failed to find out whether the truncation happens on the sending or receiving side. I may post a question about it later. – MartynA Feb 21 '17 at 09:08
  • I have posted a workaround, splitting the Stream to be sent into 255Kb packets and sending them separately. I would appreciate your opinion. – Marc Guillot Feb 21 '17 at 13:56
  • @MarcGuillot: I can see the attraction of your workaround, seeing as it does not have an obvious upper limit on the size of stream it can handle. FWIW, I've updated my answer to include an alternative workaround which also works fine. – MartynA Feb 21 '17 at 15:38
  • Thanks for sharing your own workaround. I tried a GetString function (encoding to Base64) but it didn't work for me with Berlin; same problem as with the Stream, it arrived truncated. Anyway, it's very good to know alternatives to try. Thank you. – Marc Guillot Feb 21 '17 at 20:55

@Marc: I think Henrikki meant a single function, not a single function call...
I've modified your code so that only one function is enough, and so that projects with different SchemaAdapters/StoredProcedures can be used.
The maximum stream size is declared as a constant (MaxDataSnapStreamSize) and is set to $F000, which is the MaxBufSize that the TStream.CopyFrom function handles (see System.Classes).
FComprStream is a private field of type TMemoryStream, taken care of in the constructor and destructor of the server module.

On the server side:

const
  MaxDataSnapStreamSize = $F000;

function TServerMethods1.StreamGet(const aFDSchemaAdapter: TFDSchemaAdapter; var aSize: Int64): TStream;
var
  lZIPStream: TZCompressionStream;
  lDataStream: TMemoryStream;
  I: Integer;
  lMinSize: Int64;
begin
  Result := nil;
  if aSize = -1 then
    Exit;
  lDataStream := TMemoryStream.Create;
  try
    if aSize = 0 then
    begin
      FComprStream.Clear;
      for I := 0 to aFDSchemaAdapter.Count - 1 do
      begin
        aFDSchemaAdapter.DataSets[I].Close;
        aFDSchemaAdapter.DataSets[I].Open;
      end;
      lZIPStream := TZCompressionStream.Create(TCompressionLevel.clFastest, FComprStream);
      try
        aFDSchemaAdapter.SaveToStream(lDataStream, TFDStorageFormat.sfBinary);
        lDataStream.Position := 0;
        lZIPStream.CopyFrom(lDataStream, lDataStream.Size);
      finally
        lDataStream.Clear;
        lZIPStream.Free;
      end;
      lMinSize := Min(FComprStream.Size, MaxDataSnapStreamSize);
      FComprStream.Position := 0;
    end
    else
      lMinSize := Min(aSize, MaxDataSnapStreamSize);

    lDataStream.CopyFrom(FComprStream, lMinSize);
    lDataStream.Position := 0;
    aSize := FComprStream.Size - FComprStream.Position;
    Result := lDataStream;
    if aSize = 0 then
      FComprStream.Clear;
  except
    aSize := -1;
    lDataStream.Free;
    raise;
  end;
end;

On the client side:

procedure TdmClientModuleDS.GetTables(const aStPrGet: TFDStoredProc; const aFDSchemaAdapter: TFDSchemaAdapter);
var
  lSize: Int64;
  lZIPStream: TStringStream;
  lDataStream: TMemoryStream;
  lUNZIPStream: TZDecompressionStream;
  I: Integer;
begin
  lZIPStream := nil;
  lDataStream := nil;
  lUNZIPStream := nil;
  try
    for I := 0 to aFDSchemaAdapter.Count - 1 do
      aFDSchemaAdapter.DataSets[I].Close;
    aStPrGet.ParamByName('aSize').AsInteger := 0;
    aStPrGet.ExecProc;
    lZIPStream := TStringStream.Create(aStPrGet.ParamByName('ReturnValue').AsBlob);
    lSize := aStPrGet.ParamByName('aSize').AsInteger;
    while lSize > 0 do
    begin
      aStPrGet.ParamByName('aSize').AsInteger := lSize;
      aStPrGet.ExecProc;
      lZIPStream.Position := lZIPStream.Size;
      lZIPStream.WriteBuffer(TBytes(aStPrGet.ParamByName('ReturnValue').AsBlob), Length(aStPrGet.ParamByName('ReturnValue').AsBlob));
      lSize := aStPrGet.ParamByName('aSize').AsInteger;
    end;
    lZIPStream.Position := 0;
    lDataStream := TMemoryStream.Create;
    lUNZIPStream := TZDecompressionStream.Create(lZIPStream);
    lDataStream.CopyFrom(lUNZIPStream, 0);
    lDataStream.Position := 0;
    aFDSchemaAdapter.LoadFromStream(lDataStream, TFDStorageFormat.sfBinary);
  finally
    lZIPStream.Free;
    lDataStream.Free;
    lUNZIPStream.Free;
  end;
end;
Filip Smet

Compress the stream on the server and decompress it on the client. Delphi 10.1 provides the necessary classes (System.ZLib.TZCompressionStream and System.ZLib.TZDecompressionStream). The online documentation contains an example that shows how to use these classes to compress and decompress data from and to a stream. Save the output to a ZIP file to check whether it is smaller than 260 KB.
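A minimal sketch of that round trip (the helper names are illustrative; it assumes the payload is already in a TStream, e.g. the one produced by FDSchemaAdapter.SaveToStream):

```delphi
uses System.Classes, System.ZLib;

// Server side: compress the contents of Source into Dest
// before returning the stream to the client.
procedure CompressStream(Source, Dest: TStream);
var
  Compressor: TZCompressionStream;
begin
  Compressor := TZCompressionStream.Create(Dest);
  try
    Source.Position := 0;
    Compressor.CopyFrom(Source, Source.Size);
  finally
    Compressor.Free; // flushes the remaining compressed data into Dest
  end;
end;

// Client side: decompress Source into Dest before calling
// FDSchemaAdapter.LoadFromStream on Dest.
procedure DecompressStream(Source, Dest: TStream);
var
  Decompressor: TZDecompressionStream;
begin
  Source.Position := 0;
  Decompressor := TZDecompressionStream.Create(Source);
  try
    Dest.CopyFrom(Decompressor, 0); // 0 = copy the whole stream
  finally
    Decompressor.Free;
  end;
end;
```

Remember to reset Dest.Position to 0 before handing the decompressed stream to LoadFromStream.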

Olaf Hess
  • Thanks Olaf, compressing the data is a good suggestion, and may speed up the responsiveness (the server is not on the local network but hosted on Amazon's cloud). But it will only delay the problem; eventually I will hit that 260Kb limit again. I just don't know what to do then. Should I try to send the data in several smaller calls of 260Kb each, rebuilding the Stream on the Client to feed my datasets to the FDSchemaAdapter? – Marc Guillot Jan 25 '17 at 21:45
  • Maybe these two questions address your problem: [Upload file to DataSnap REST server via TStream](http://stackoverflow.com/questions/18110654/upload-file-to-datasnap-rest-server-via-tstream?rq=1) and [Delphi XE2 DataSnap - Download File via TStream With Progress Bar](http://stackoverflow.com/questions/8892334/delphi-xe2-datasnap-download-file-via-tstream-with-progress-bar?rq=1) – Olaf Hess Jan 26 '17 at 09:20

A workaround: run an HTTP server which serves requests for the big files. The code generates and stores the file as shown in your question, then returns its URL to the client:

https://example.com/ds/... -> for the DataSnap service

https://example.com/files/... -> for big files

If you already use Apache as reverse proxy, you can configure Apache to route HTTP GET requests to resources at /files/.

For more control (e.g. authentication), you can run an HTTP server (Indy-based) on a different port which serves the requests for these files. Apache can be configured to map HTTP requests to the correct destination, so the client will only see one HTTP port.
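As a rough sketch of such a file-serving endpoint with Indy's TIdHTTPServer (the class name, directory, and content type are illustrative; it assumes a TIdHTTPServer component whose OnCommandGet event is wired to this handler):

```delphi
uses System.SysUtils, System.Classes, System.IOUtils,
  IdContext, IdCustomHTTPServer, IdHTTPServer;

procedure TFileServer.HTTPServerCommandGet(AContext: TIdContext;
  ARequestInfo: TIdHTTPRequestInfo; AResponseInfo: TIdHTTPResponseInfo);
var
  FileName: string;
begin
  // Map /files/xyz to a file in the export directory; taking only the
  // file name part guards against path traversal.
  FileName := TPath.Combine('C:\Temp\Export',
    TPath.GetFileName(ARequestInfo.Document));
  if TFile.Exists(FileName) then
  begin
    AResponseInfo.ContentType := 'application/octet-stream';
    // TIdHTTPServer frees ContentStream after sending it (the default
    // FreeContentStream = True).
    AResponseInfo.ContentStream :=
      TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite);
  end
  else
    AResponseInfo.ResponseNo := 404;
end;
```

The client then simply issues an HTTP GET against the URL returned by the DataSnap call.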

mjn42
  • It would work, thank you. But it looks like overkill for such a simple task. I should be able to pass packets bigger than 260Kb without needing to set up a secondary server. I will consider it a last resort, thank you. – Marc Guillot Jan 26 '17 at 18:08

I have coded a workaround. Seeing that I can't pass data bigger than roughly 255Kb, I split it into separate 255Kb packets and send them individually (I have also added compression to minimize bandwidth and round trips).

On the server I have split StreamGet into two calls: StreamGet and StreamGetNextPacket.

function TServerMethods.StreamGet(var Complete: boolean): TStream;
var Data: TMemoryStream;
    Compression: TZCompressionStream;
begin
  // Opening Data
  qCustomers.Close;
  qCustomers.Open;
  qOrders.Close;
  qOrders.Open;

  // Compressing Data
  FreeAndNil(CommStream);
  CommStream := TMemoryStream.Create;
  Data := TMemoryStream.Create;
  try
    Compression := TZCompressionStream.Create(CommStream);
    try
      FDSchemaAdapter.SaveToStream(Data, TFDStorageFormat.sfBinary);
      Data.Position := 0;
      Compression.CopyFrom(Data, Data.Size);
    finally
      Compression.Free;
    end;
  finally
    Data.Free;
  end;

  // Returning First 260000 bytes Packet
  CommStream.Position := 0;
  Result := TMemoryStream.Create;
  Result.CopyFrom(CommStream, Min(CommStream.Size, 260000));
  Result.Position := 0;

  // Freeing Memory if all sent
  Complete := (CommStream.Position = CommStream.Size);
  if Complete then FreeAndNil(CommStream);
end;

function TServerMethods.StreamGetNextPacket(var Complete: boolean): TStream;
begin
  // Returning the rest of 260000 bytes Packets
  Result := TMemoryStream.Create;
  Result.CopyFrom(CommStream, Min(CommStream.Size - CommStream.Position, 260000));
  Result.Position := 0;

  // Freeing Memory if all sent
  Complete := (CommStream.Position = CommStream.Size);
  if Complete then FreeAndNil(CommStream);
end;

CommStream: TStream is declared as a private field of TServerMethods.

And the Client retrieves it this way:

procedure TClientForm.GetTables;
var Complete: boolean;
    Input: TStringStream;
    Data: TMemoryStream;
    Decompression:  TZDecompressionStream;
begin
  Input := nil;
  Data := nil;
  Decompression := nil;

  try
    // Get the First 260000 bytes Packet
    spStreamGet.ExecProc;
    Input := TStringStream.Create(spStreamGet.ParamByName('ReturnValue').AsBlob);
    Complete := spStreamGet.ParamByName('Complete').AsBoolean;

    // Get the rest of 260000 bytes Packets
    while not Complete do begin
      spStreamGetNextPacket.ExecProc;
      Input.Position := Input.Size;
      Input.WriteBuffer(TBytes(spStreamGetNextPacket.ParamByName('ReturnValue').AsBlob), Length(spStreamGetNextPacket.ParamByName('ReturnValue').AsBlob));
      Complete := spStreamGetNextPacket.ParamByName('Complete').AsBoolean;
    end;

    // Decompress Data
    Input.Position := 0;
    Data := TMemoryStream.Create;
    Decompression := TZDecompressionStream.Create(Input);
    Data.CopyFrom(Decompression, 0);
    Data.Position := 0;

    // Load Datasets
    DataModuleFDClient.FDSchemaAdapter.LoadFromStream(Data, TFDStorageFormat.sfBinary);
  finally
    if Assigned(Input) then FreeAndNil(Input);
    if Assigned(Data) then FreeAndNil(Data);
    if Assigned(Decompression) then FreeAndNil(Decompression);
  end;
end;

It works fine now.

Marc Guillot
  • Thank you for this answer! I implemented my own from this. I made it so I only need to call one function from the client to get the packets delivered in 260Kb chunks. – Henrikki Feb 01 '18 at 11:58

The problem seems to be neither the TStream class nor the underlying DataSnap communication infrastructure, but that the TFDStoredProc component creates a return parameter of type ftBlob. First, change the output parameter from ftBlob to ftStream. Then change the GetTables procedure to:

procedure TClientForm.GetTables;
var
  LStream: TStream;
begin
  spStreamGet.ExecProc;
  LStream := spStreamGet.Params[0].AsStream;
  LStream.Position := 0;
  DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStream, TFDStorageFormat.sfBinary);
end;
VZajac
  • When I investigated this problem, the cause seemed to be in the JSON layer that DataSnap uses as its transport to send the stream from server to client. You may be correct about this particular point, but there seems to be a serious problem with DataSnap streams. – MartynA Feb 24 '18 at 08:06