The solution:
Note: This code does not contain the whole setup of the web channel, only the handling of the data between Qt and JS.
Qt C++
The important thing is that js pako.inflate() does not understand the header generated by qCompress, hence the Unknown compression method error. qCompress prepends four extra bytes (the expected uncompressed size that qUncompress() relies on) in front of the plain zlib stream; trimming them fixes the error.
QString compressJson(const QJsonObject &jsonObj)
{
    QJsonDocument doc(jsonObj);
    QString strJson(doc.toJson(QJsonDocument::Compact));
    QByteArray data = qCompress(strJson.toUtf8());
    // Remove the first four bytes (qCompress's uncompressed-size header),
    // leaving a plain zlib stream that the external js decompressor understands.
    // Keeping them in place causes "unknown compression method" on the js side.
    data.remove(0, 4);
    return QString(data.toBase64());
}
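To see what those four bytes actually are, here is a quick hypothetical check (not part of the original code) that reads them back with QtEndian: qCompress() stores the expected uncompressed size there for qUncompress(), and the plain zlib stream that pako understands only starts at byte 4.

#include <QByteArray>
#include <QDebug>
#include <QtEndian>

// Hypothetical helper, only to illustrate the header layout described above.
void inspectQCompressHeader()
{
    const QByteArray raw(100000, 'x');        // any payload
    const QByteArray packed = qCompress(raw);

    // First 4 bytes: big-endian uncompressed size, consumed by qUncompress().
    const quint32 sizeHeader = qFromBigEndian<quint32>(packed.constData());
    // Byte 4 onwards: the zlib stream itself.
    const int firstZlibByte = quint8(packed.at(4));

    qDebug() << "size header:" << sizeHeader          // 100000
             << "first zlib byte:" << firstZlibByte;  // typically 120 (0x78, the usual zlib CMF byte)
}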
#include <QByteArray>
#include <QDebug>
#include <QJsonDocument>
#include <QJsonObject>
#include <QString>
#include <chrono>

using namespace std::chrono;

QJsonObject getPlotDataJson()
{
    // generate plotly digestible json data (build and return your QJsonObject here)
    return QJsonObject();
}
SomeClass::SomeClass(QObject *pParent)
    : QObject(pParent)
{
    auto start = high_resolution_clock::now();
    m_data = compressJson(getPlotDataJson());
    auto stop = high_resolution_clock::now();
    auto duration = duration_cast<microseconds>(stop - start);
    qDebug() << "SomeClass::compress: " << duration.count() << " usecs";
}
Don't forget to expose the data you want to share with QWebChannel as a Q_PROPERTY:
#include <QObject>
#include <QString>

class SomeClass : public QObject
{
    Q_OBJECT
    Q_PROPERTY(QString plotData MEMBER m_data)

public:
    explicit SomeClass(QObject *pParent = nullptr);

    QString m_data;
};
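The web channel setup itself is intentionally left out (see the note at the top), but for reference the object has to be registered under the same id that the JS side looks up (channel.objects.SomeClass below). A minimal sketch, assuming a QWebEngineView named view and an existing SomeClass instance:

#include <QWebChannel>
#include <QWebEngineView>

// Hypothetical wiring, not part of the original answer: register the data
// provider under the id "SomeClass" so channel.objects.SomeClass resolves in JS.
void setupWebChannel(QWebEngineView *view, SomeClass *provider)
{
    auto *channel = new QWebChannel(view);
    channel->registerObject(QStringLiteral("SomeClass"), provider);
    view->page()->setWebChannel(channel);
}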
JS Vanilla
/* Using FileReader to read very large data efficiently */
function largeuint8ArrToString(uint8arr, callback) {
    var bb = new Blob([uint8arr]);
    var f = new FileReader();
    f.onload = function(e) {
        callback(e.target.result);
    };
    f.readAsText(bb);
}
//! Decompresses the zlib-compressed plot data
function inflateData(data)
{
    // 1. Decode from base64
    var strData = atob(data);
    // 2. Convert to a js byte array so that pako's zlib inflate can understand the data
    const charData = strData.split('').map(function(x) { return x.charCodeAt(0); });
    const bytes = new Uint8Array(charData);
    // 3. Decompress
    return pako.inflate(bytes);
}
window.addEventListener('DOMContentLoaded', (event) => {
    var startTime0 = performance.now();
    new QWebChannel(qt.webChannelTransport, function (channel) {
        var plotDataInflatedBytes = inflateData(channel.objects.SomeClass.plotData);
        largeuint8ArrToString(plotDataInflatedBytes, function(text) {
            var mapped = JSON.parse(text);
            // ... your code to process the data
        });
        var endTime0 = performance.now();
        console.log(`new QWebChannel() exec time ${endTime0 - startTime0} milliseconds`);
    });
});
Edit:
The simple loop formerly used to convert the byte array to a string:
let binArrayToString = function(binArray) {
    let str = "";
    for (let i = 0; i < binArray.length; i++) {
        str += String.fromCharCode(binArray[i]);
    }
    return str;
}
has been replaced with the more efficient FileReader-based largeuint8ArrToString() shown above.
Conclusion:
Besides fixing the errors, this code let me plot the data in about 10 seconds instead of 30+ minutes. I also noticed that performance still drops quite a bit when I double the cell count because of the bytes-to-string conversion, so that step is still the bottleneck.
References:
Uint8Array to string in Javascript
How to zlib compress a QByteArray?
How to use pako.js javascript? Pako is not defined