Can't retrieve TStreams bigger than around 260.000 bytes from a Datasnap Server
Problem Description
I have a Delphi 10.1 Berlin Datasnap server that can't return data packets (through a TStream) bigger than around 260.000 bytes.
I have programmed it following the \Object Pascal\DataSnap\FireDAC sample from Delphi, which also shows this problem.
The problem can be reproduced just by opening that sample, clearing the IndexFieldName of the qOrders component on ServerMethodsUnit.pas, and changing its SQL property to:
select * from Orders
union
select * from Orders
Now the amount of data to be sent exceeds 260.000 bytes, which seems to be the point beyond which the client can no longer retrieve it: you get an EFDException, [FireDAC][Stan]-710. Invalid binary storage format.
The data is sent as a Stream that you get from an FDSchemaAdapter on the server and load on another FDSchemaAdapter on the client. The connection between Client and Server is also FireDAC.
This is how the Server returns that Stream:
function TServerMethods.StreamGet: TStream;
begin
Result := TMemoryStream.Create;
try
qCustomers.Close;
qCustomers.Open;
qOrders.Close;
qOrders.Open;
FDSchemaAdapter.SaveToStream(Result, TFDStorageFormat.sfBinary);
Result.Position := 0;
except
raise;
end;
end;
And this is how the Client retrieves it:
procedure TClientForm.GetTables;
var
LStringStream: TStringStream;
begin
FDStoredProcGet.ExecProc;
LStringStream := TStringStream.Create(FDStoredProcGet.Params[0].asBlob);
try
if LStringStream <> nil then
begin
LStringStream.Position := 0;
DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStringStream, TFDStorageFormat.sfBinary);
end;
finally
LStringStream.Free;
end;
end;
The Client doesn't get all the data in the Blob parameter. I saved the content of the Stream on the Server and the content that arrives in the Blob parameter on the Client: they have the same size, but the Blob parameter's content is truncated, and the last few KBytes are zeroes.
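To confirm exactly where the payload diverges, the two dumps can be compared byte by byte. Here is a small sketch in Python (for illustration only; the function names are mine) that finds the first differing offset and counts the zero-byte padding at the end:

```python
def first_divergence(a: bytes, b: bytes) -> int:
    """Return the index of the first differing byte, or -1 if identical."""
    n = min(len(a), len(b))
    for i in range(n):
        if a[i] != b[i]:
            return i
    return -1 if len(a) == len(b) else n

def trailing_zeros(data: bytes) -> int:
    """Count how many zero bytes pad the end of the buffer."""
    count = 0
    for byte in reversed(data):
        if byte != 0:
            break
        count += 1
    return count
```

Run against the server-side and client-side dumps, the first divergence should land right around the 260.000-byte mark, with everything after it zeroed out on the client.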
This is how I save on the Server the content that will go to the Stream:
FDSchemaAdapter.SaveToFile('C:\Temp\JSON_Server.json', TFDStorageFormat.sfJSON);
This is how I check what I get on the Client blob parameter:
TFile.WriteAllText('C:\Temp\JSON_Client.json', FDStoredProcGet.Params[0].asBlob);
I can see that the Client gets the data truncated.
Do you know how to fix it, or a workaround to retrieve all the Stream content from the Datasnap Server to my Client?
Update: I have updated to Delphi 10.1 Berlin Update 2, but the problem remains.
Thank you.
I have coded a workaround. Since I can't pass data bigger than around 255 KB, I split it into separate 255 KB packets and send them separately (I have also added compression to minimize bandwidth and roundtrips).
On the server I have changed StreamGet into two different calls: StreamGet and StreamGetNextPacket.
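The split-and-reassemble idea is language-agnostic, so before the Delphi code, here is a minimal sketch of the same protocol in Python (function names and the packet size are mine, mirroring the workaround): compress the payload once, hand it out in fixed-size packets, then concatenate and decompress on the other side.

```python
import zlib

PACKET_SIZE = 260000  # hypothetical safe packet size, mirroring the workaround

def make_packets(payload: bytes, packet_size: int = PACKET_SIZE):
    """Compress the payload, then yield it in fixed-size packets."""
    compressed = zlib.compress(payload)
    for offset in range(0, len(compressed), packet_size):
        yield compressed[offset:offset + packet_size]

def reassemble(packets) -> bytes:
    """Concatenate the packets in order and decompress the result."""
    return zlib.decompress(b"".join(packets))
```

The only state the server has to keep between calls is the current read position in the compressed buffer, which is what the CommStream field does in the Delphi code below.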
function TServerMethods.StreamGet(var Complete: boolean): TStream;
var Data: TMemoryStream;
Compression: TZCompressionStream;
begin
try
// Opening Data
qCustomers.Close;
qCustomers.Open;
qOrders.Close;
qOrders.Open;
// Compressing Data
try
if Assigned(CommStream) then FreeAndNil(CommStream);
CommStream := TMemoryStream.Create;
Data := TMemoryStream.Create;
Compression := TZCompressionStream.Create(CommStream);
FDSchemaAdapter.SaveToStream(Data, TFDStorageFormat.sfBinary);
Data.Position := 0;
Compression.CopyFrom(Data, Data.Size);
finally
Data.Free;
Compression.Free;
end;
// Returning First 260000 bytes Packet
CommStream.Position := 0;
Result := TMemoryStream.Create;
Result.CopyFrom(CommStream, Min(CommStream.Size, 260000));
Result.Position := 0;
// Freeing Memory if all sent
Complete := (CommStream.Position = CommStream.Size);
if Complete then FreeAndNil(CommStream);
except
raise;
end;
end;
function TServerMethods.StreamGetNextPacket(var Complete: boolean): TStream;
begin
// Returning the rest of 260000 bytes Packets
Result := TMemoryStream.Create;
Result.CopyFrom(CommStream, Min(CommStream.Size - CommStream.Position, 260000));
Result.Position := 0;
// Freeing Memory if all sent
Complete := (CommStream.Position = CommStream.Size);
if Complete then FreeAndNil(CommStream);
end;
CommStream: TStream is declared as private in TServerMethods.
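One subtlety worth noting in the server code above: a zlib compression stream such as TZCompressionStream buffers data internally and, as far as I know, only emits its final compressed block when it is flushed or destroyed, which is why Compression.Free runs in the finally block before CommStream is read. The same two-phase behavior, sketched in Python for illustration:

```python
import zlib

def compress_stream(data: bytes) -> bytes:
    """Mimic a zlib compression stream: output is incomplete until flushed."""
    compressor = zlib.compressobj()
    partial = compressor.compress(data)  # may hold bytes back in internal buffers
    tail = compressor.flush()            # corresponds to freeing the stream
    return partial + tail

# Reading the destination buffer before the flush would miss `tail`,
# leaving a stream the reader cannot decode.
```

Freeing (or flushing) the compressor before touching CommStream guarantees the packets carry a complete, decodable stream.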
And the Client retrieves it this way:
procedure TClientForm.GetTables;
var Complete: boolean;
Input: TStringStream;
Data: TMemoryStream;
Decompression: TZDecompressionStream;
begin
Input := nil;
Data := nil;
Decompression := nil;
try
// Get the First 260000 bytes Packet
spStreamGet.ExecProc;
Input := TStringStream.Create(spStreamGet.ParamByName('ReturnValue').AsBlob);
Complete := spStreamGet.ParamByName('Complete').AsBoolean;
// Get the rest of 260000 bytes Packets
while not Complete do begin
spStreamGetNextPacket.ExecProc;
Input.Position := Input.Size;
Input.WriteBuffer(TBytes(spStreamGetNextPacket.ParamByName('ReturnValue').AsBlob), Length(spStreamGetNextPacket.ParamByName('ReturnValue').AsBlob));
Complete := spStreamGetNextPacket.ParamByName('Complete').AsBoolean;
end;
// Decompress Data
Input.Position := 0;
Data := TMemoryStream.Create;
Decompression := TZDecompressionStream.Create(Input);
Data.CopyFrom(Decompression, 0);
Data.Position := 0;
// Load Datasets
DataModuleFDClient.FDSchemaAdapter.LoadFromStream(Data, TFDStorageFormat.sfBinary);
finally
if Assigned(Input) then FreeAndNil(Input);
if Assigned(Data) then FreeAndNil(Data);
if Assigned(Decompression) then FreeAndNil(Decompression);
end;
end;
It works fine now.