I'm writing a TCP server in Qt that will serve large files. The application logic is as follows:
- I've subclassed QTcpServer and reimplemented incomingConnection(int)
- In incomingConnection(), I create an instance of a "Streamer" class
- "Streamer" uses a QTcpSocket, which is initialized with setSocketDescriptor() from incomingConnection()
- When data from the client arrives, I send back an initial response from within the readyRead() slot, and then connect the socket's bytesWritten(qint64) signal to Streamer's bytesWritten() slot
bytesWritten looks something like:
Streamer.h:
...
private:
QFile *m_file;
char m_readBuffer[64 * 1024];
QTcpSocket *m_socket;
...
Streamer.cpp
...
void Streamer::bytesWritten() {
    if (m_socket->bytesToWrite() <= 0) {
        // QFile::read returns qint64, and returns 0 at EOF or -1 on error
        const qint64 bytesRead = m_file->read(m_readBuffer, sizeof(m_readBuffer));
        if (bytesRead > 0)
            m_socket->write(m_readBuffer, bytesRead);
    }
}
...
So basically I only write new data once all pending data has been fully written. I think that's the most asynchronous way of doing it.
And everything works correctly, except that it's pretty slow when there are lots of simultaneous clients.
With about 5 clients, I'm downloading from that server at around 1 MB/s (the max of my home internet connection).
With about 140 clients, the download speed drops to around 100-200 KB/s.
Server's internet connection is 10 Gbps and with 140 clients its use is around 100 Mbps, so I don't think that is the problem.
Server's memory usage with 140 clients - 100 MB of 2GB available
Server's CPU usage - max 20%
I'm using port 800.
When there were 140 clients on port 800 and the download speed through it was around 100-200 KB/s, I ran a second copy of the server on port 801 and downloaded from it at 1 MB/s without problems.
My guess is that Qt's event dispatching (or the socket notifiers?) is somehow too slow to handle all those events.
I've tried:
- Compiling the whole of Qt and my app with -O3
- Installing libglib2.0-dev and recompiling Qt (QCoreApplication uses either QEventDispatcherGlib or QEventDispatcherUNIX, so I wanted to see if there was any difference)
- Spawning a few threads and, in incomingConnection(int), using streamer->moveToThread() depending on how many clients are currently in a particular thread - that didn't make any difference (though I did observe that speeds varied much more)
- Spawning worker processes using
Code:
main.cpp:
#include <sched.h>
int startWorker(void *argv) {
    int argc = 1;
    QCoreApplication a(argc, (char **)argv);
    Worker worker;
    worker.Start();
    return a.exec();
}
in main():
...
long stack[16 * 1024];
clone(startWorker, (char *)stack + sizeof(stack) - 64, CLONE_FILES, (void *)argv);
and then starting a QLocalServer in the main process and passing socket descriptors from incomingConnection(int socketDescriptor) to the worker processes. It worked correctly, but download speeds were still slow.
Also tried:
- fork()-ing the process in incomingConnection() - that nearly killed the server :)
- Creating separate thread for each client - speeds dropped to 50-100 KB/s
- Using QThreadPool with QRunnable - no difference
I'm using Qt 4.8.1
I ran out of ideas.
Is this Qt-related, or could it be something in the server configuration?
Or should I maybe use a different language/framework/server? I need a TCP server that serves files, but I also need to perform some specific tasks between packets, so I have to implement that part myself.