
I am developing a download server in C++/Qt and I am facing a memory-growth problem. Below is a sample server application that demonstrates the issue.

When a client connects, the server starts sending it 10 KB data chunks every second. When a client disconnects, the socket is deleted.

#include <QCoreApplication>
#include <QtNetwork>

class Client: public QObject
{
    Q_OBJECT
public:
    Client(QSslSocket *sock)
    {
        this->timer = new QTimer(this);
        this->timer->setInterval(1000);
        connect(this->timer, SIGNAL(timeout()), this, SLOT(sendData()));

        this->sock = sock;
        connect(sock, SIGNAL(disconnected()), this, SIGNAL(disconnected()));         
        connect(sock, SIGNAL(encrypted()), this->timer, SLOT(start()));
        connect(sock, SIGNAL(readyRead()), this, SLOT(readData()));
    }

    ~Client()
    {
        delete this->sock;
    }

    void start()
    {
        this->sock->startServerEncryption();
    }

signals:
    void disconnected();

private slots:
    void sendData()
    {             
        qDebug() << "sending data via socket: " << sock->socketDescriptor();

        if (this->sock->bytesToWrite())
            return;

        QByteArray ba(10*1024, '1');
        this->sock->write(ba);
    }

    void readData()
    {
        this->sock->readAll();
    }

private:
    QSslSocket *sock;
    QTimer *timer;
}; // Client


class Server: public QTcpServer
{
    Q_OBJECT
public:
    Server()
    {     
        this->totalConnected = 0;
        this->instanceCounter = 0;
    }

protected:
    virtual void incomingConnection(int d);

private:
    int totalConnected;
    int instanceCounter;

private slots:
    void handleClientDisconnected();
    void handleDestroyed();
}; // Server

void Server::incomingConnection(int d)
{    
    QSslSocket *sock = new QSslSocket(this);

    if (!sock->setSocketDescriptor(d))
    {
        delete sock;
        return;
    }

    ++this->instanceCounter;

    qDebug() << "socket " << d << "connected, total: " << ++this->totalConnected;

    sock->setLocalCertificate(":/ssl/resources/my.crt");
    sock->setPrivateKey(":/ssl/resources/my.key", QSsl::Rsa, QSsl::Pem, "my.pass");

    Client *client = new Client(sock);
    connect(client, SIGNAL(disconnected()), this, SLOT(handleClientDisconnected()));
    connect(client, SIGNAL(destroyed()), this, SLOT(handleDestroyed()));

    client->start();
}

void Server::handleClientDisconnected()
{
    qDebug() << "client disconnected, total: " << --this->totalConnected;
    sender()->deleteLater();
}

void Server::handleDestroyed()
{
    qDebug() << "destroyed: " << --this->instanceCounter;
}

int main(int argc, char *argv[])
{
    QCoreApplication a(argc, argv);

    Server server;
    if (server.listen(QHostAddress::Any, 563))
        qDebug() << "listen started";
    else qDebug() << "listen failed";

    return a.exec();
}

#include "main.moc"

I have two questions about it:

1) Why does the memory grow continuously while downloading?

I was testing with 200-300 connections. Memory usage reached 400 MB in a few minutes and showed no sign of stopping.

In Client::sendData I check this->sock->bytesToWrite() to see whether anything is still waiting to be written, so new data is never queued until everything has been written. All data chunks have the same size, so it can't be that more memory has to be allocated for each new piece of data.

2) Why is not all of the memory returned when all connections are closed?

Although the memory used while downloading drops when clients disconnect, it looks like not all of it is returned. After several tests establishing 200-700 connections, usage reached 80 MB and stays constantly at that level even when there are no clients at all (the client objects are all reported destroyed, and the instance counter drops to zero).

I was reading about the difference between object deletion and memory deallocation. As far as I understand, the system may keep the memory reserved for future needs (a kind of optimization), but it surely must return that memory when something else needs it. I decided to write a small program that allocates a large amount of memory (namely, a large array) to see whether it would make the system reclaim the memory held by the server. It didn't: that program crashes because there is not enough memory (it works fine when the server has just started).

So it looks like something is wrong. I suspected a leak, but memory leak detectors didn't notice any serious problems. Visual Leak Detector reported no memory leaks. Valgrind reported some issues, but they pointed into the Qt libraries, and I have read that those are mostly false alarms, generally just a side effect of lower-level libraries deallocating memory after Valgrind. In any case, the total size of the reportedly lost data is very small compared to 80 MB.

mentalmushroom

1 Answer


Memory allocators can be configured to return blocks of unused memory to the system, true, but that doesn't mean they will, or that they are physically able to.

First, look at your specific memory allocator and see whether it is configured to return memory to the system. This depends on your OS and your compiler, neither of which you mention, but this SO question should answer that for you:

Will malloc implementations return free-ed memory back to the system?

Whether it is able to depends on the fragmentation of the heap. Only whole free memory regions can be returned to the system, so tiny live allocations dispersed throughout the heap will prevent this (though allocators typically try to avoid such fragmentation).

Brian White