
I'm having an error in my Laravel 8 app which uses yajra/laravel-datatables-oracle with server-side rendering, and I can't figure it out so far. What makes it frustrating is that this has been working just fine for the last couple of months, and I suddenly got a report about this error just this morning.

So I have a page which shows a table. The table itself is pretty large: more than 25 columns shown, thousands of total records, and some nested relationships as well. For this I use the yajra/laravel-datatables-oracle package with ajax/server-side rendering, as mentioned above. The page has pagination enabled and is set to show 50 records by default. The page functions correctly when opened in its default state.

However, an error occurs when the user changes the table's page length to "All". After checking the route that handles the table's ajax rendering, it turns out the error happens because the returned JSON is not valid. So instead of getting valid JSON like this:

"data": [
    {"id": 1, "name": "Foo"},

    // ....

    {"id": 999, "name": "Bar"}
]

I get:

"data": [
    {"id": 1, "name": "Foo"},

    // ....

    {"id": 500,
]

// or...

"data": [
    {"id": 1, "name": "Foo"},

    // ....

    {"id": 500, "name": "Ja
]

The point is that the returned response is incomplete, which makes it invalid as JSON, which makes DataTables throw a JSON parsing error.

Here are the things you should know:

  • Again, this has been working fine for the last couple of months.
  • As in the examples above, the (incomplete) JSON response changes every time I refresh the URL, which means it's likely not related to an "abnormal" record.
  • The error occurs only when the selected page length passes a certain threshold:
    • It works when selecting up to 180 items per page (each item has a total of >100 fields).
    • Above 180, the error occurs. However, the truncation could be anywhere, not at the 181st item. It could be around the 70th, the 90th, etc. In fact, it never returns more than 100 records when the error occurs.
  • I thought this might be caused by PHP memory exhaustion. I tried setting PHP's memory_limit to -1 and to 999999M, but nothing changed.
  • This is actually happening on many pages, not just one.
  • However, there are other pages that work totally fine without this error, with roughly the same amount of data.
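For reference, the PHP settings most often implicated in truncated responses can be dumped with a quick probe like this (plain PHP, my own sketch; FPM values can differ from CLI values, so it should run through the same SAPI that serves the page, e.g. via a temporary route):

```php
<?php

// Dump the runtime values of the settings most often implicated in
// truncated/partial HTTP responses. ini_get() returns the effective
// value for the SAPI this script runs under.
foreach ([
    'memory_limit',
    'output_buffering',
    'max_execution_time',
    'zlib.output_compression',
] as $key) {
    echo $key, ' = ', var_export(ini_get($key), true), PHP_EOL;
}
```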

I really have no idea what the cause is. Is it the server's PHP config, the database (MySQL), the ajax request, or something else?

Please help. Any pointers are much appreciated.

Additional Info

This is one of the controller methods where the error appears.

use App\Models\Instansi;
use App\Models\InstrukturKelasPeriodeTahapan;
use App\Models\Tahapan;
use App\Models\Tahun;
use Illuminate\Http\Request;
use Yajra\DataTables\Facades\DataTables;

public function datatable(Request $request) {
    // validate the request; $tahun and $periode are resolved here (omitted)

    $jadwals = InstrukturKelasPeriodeTahapan::query()
        ->withInstruktur()->withKelasPeriodeTahapan()
        ->withTarifInTahunAndPeriodeOf($tahun, $periode)
        ->withPphInTahunAndPeriodeOf($tahun, $periode)
        ->withCount([
            'pamongDalamKelasPeriodeTahapan as jumlah_pamong_yg_mengajar'
        ])
        ->whereHasTahunOf($tahun)
        ->whereHasPeriodeOf($periode)
        ->whereHasTahapanOf(Tahapan::PPP);

    if ($request->instansi) {
        if ($request->instansi == Instansi::NON_FOO) {
            $jadwals->whereHasInstansiOfNonFoo();
        } else {
            $jadwals->whereHasInstansiOf($request->instansi);
        }
    }

    return DataTables::of($jadwals)
        ->addIndexColumn()
        ->make(true);
}
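To narrow down whether the truncation happens while encoding or while the response is being written out, a diagnostic sketch like the following could be dropped into the method (this reuses the $jadwals query above; getData(true) and the Log facade are standard Laravel APIs):

```php
use Illuminate\Support\Facades\Log;

// Diagnostic sketch: build the same DataTables response, re-encode its
// payload manually, and log the size before sending. If json_encode()
// succeeds and strlen() matches the full expected payload, the JSON is
// complete when it leaves PHP, which points at the output layer
// (web server / FastCGI buffers) rather than the encoder.
$response = DataTables::of($jadwals)->addIndexColumn()->make(true);
$json = json_encode($response->getData(true));

if ($json === false) {
    Log::error('datatable encode failed: ' . json_last_error_msg());
} else {
    Log::info('datatable payload size: ' . strlen($json) . ' bytes');
}

return $response;
```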

This should return a total of 1134 records, and it used to work just fine. Now, when the error occurs, I get fewer than 100 records, and the last returned record is incomplete.

These different return syntaxes still give me incomplete results.

return response()->json(
    DataTables::of($jadwals)
        ->addIndexColumn()
        ->make(true)
);

return DataTables::of($jadwals)
    ->addIndexColumn()
    ->toJson();

return $jadwals->get();

return response()->json($jadwals->get());

However, these syntaxes return the complete 1134 records. But I can't use them, as they don't produce DataTables' response format.

return dd($jadwals->get());

return $jadwals->get()->toJson();
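One more thing worth probing (a sketch on my part, not a fix): sending the already-encoded DataTables string with an explicit Content-Length header. If the browser then receives fewer bytes than advertised, the body is being cut off somewhere between PHP and the client:

```php
// Probe sketch: grab the already-encoded JSON string from the DataTables
// response (getContent() is the standard Symfony/Laravel response method)
// and send it with an explicit Content-Length, so any client-side byte
// mismatch becomes visible in the browser's network tab.
$json = DataTables::of($jadwals)->addIndexColumn()->make(true)->getContent();

return response($json, 200)
    ->header('Content-Type', 'application/json')
    ->header('Content-Length', (string) strlen($json));
```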

And this is an example of how an item/record should look when returned completely, with 120 lines in total:

{
  "id": 9999,
  "kelas_periode_tahapan_id": 999,
  "instruktur_id": 999,
  "jumlah_pamong_yg_mengajar": 9,
  "instruktur": {
    "id": 9999,
    "id_ref": 9999,
    "nama": "Dr. Xxxxx",
    "email": "xxxxxxxx@gmail.com",
    "tgl_lahir": "95-09-9977",
    "nip": "999999999999999999",
    "nik": "99999999999999999",
    "npwp": "99999999999999999",
    "golongan_id": 9,
    "jab_fungsional_id": 9,
    "instansi_id": 9,
    "no_rek": "99999999999",
    "nama_bank": "BANK XXXXXXXX",
    "bank_cabang": "XXX",
    "nama_rek": "Xxxxxxxx Xxxxx Xxxxxxxxx",
    "is_active": 9,
    "golongan": {
      "id": 9,
      "description": "XXXXX",
      "periodes": [
        {
          "id": 9,
          "tahun_id": 9,
          "periode": 9,
          "pivot": {
            "golongan_id": 9,
            "periode_id": 9,
            "pph_npwp": 9,
            "pph_non_npwp": 9
          }
        }
      ]
    },
    "jab_fungsional": {
      "id": 9,
      "description": "Xxxxxxxx",
      "periodes": [
        {
          "id": 9,
          "tahun_id": 9,
          "periode": 9,
          "pivot": {
            "jab_fungsional_id": 9,
            "periode_id": 9,
            "tarif": 99999999
          }
        }
      ]
    },
    "instansi": {
      "id": 9,
      "description": "Xxxxxxxxxxx Xxxxxxxxxxxxx Xxxxxxxxxxxx"
    }
  },
  "kelas_periode_tahapan": {
    "id": 999,
    "kelas_id": 99,
    "periode_tahapan_id": 99,
    "is_verified": 9,
    "is_locked": 9,
    "kelas": {
      "id": 99,
      "periode_prodi_id": 9,
      "description": "Xxxxxxxxxxx Xxxxxxxxxxxxx Xxxxxxxxxxxx",
      "nama_grup_wa": "Xxxxxxxx Xxxxxxxxxxx",
      "link_grup_wa": "https://xxxxxxxxxxxxxxxxxxxxxxxxx",
      "periode_prodi": {
        "id": 9,
        "prodi_id": 9,
        "periode_id": 9,
        "prodi": {
          "id": 9,
          "nama": "Xxxxxxxxxxxxx Xxxxxxxxxxxxx",
          "kode": "999",
          "is_active": 9
        }
      }
    },
    "periode_tahapan": {
      "id": 99,
      "periode_id": 9,
      "tahapan_id": 9,
      "ket_hari_ke": 99,
      "jadwal": "09-08-9099",
      "materi": "Xxxxxxxxxxx Xxxxxxxx Xxxxxxxxxxxxxxxxxxxxx",
      "jp_full": 9,
      "jp_pamong": 9,
      "tahapan": {
        "id": 9,
        "description": "Xxxxxxxxxxx Xxxxxxxxxxxxx Xxxxxxxxxxxx",
        "akronim": "XXX",
        "satuan": "XXX"
      },
      "periode": {
        "id": 9,
        "tahun_id": 9,
        "periode": 9,
        "tahun": {
          "id": 9,
          "tahun": 9099
        },
        "persentase_honor": {
          "id": 9,
          "periode_id": 9,
          "full_dosen": 900,
          "full_gpamong": 900,
          "persentase_dosen": 99,
          "persentase_gpamong": 99
        }
      }
    }
  },
  "DT_RowIndex": 1
}
rifqy abdl

  • Hi, adjust your server's memory. I had a similar issue with the same datatables. The solution was just to up server resources, as the application would have run out of memory. You can go through your PHP or server error logs and you should see an error similar to "process... ran out of memory, trying to allocate.... bytes but only.... available" – Charles Jan 27 '23 at 15:10
  • Hi @Charles, can you please explain what "memory" do you actually mean? Is it not the PHP `memory_limit`? The app server is handled by other people and I would like to present a clear explanation to them if possible. – rifqy abdl Jan 27 '23 at 15:14
  • 1
    In my scenario, we bumped up PHP memory_limit but the application was always breaking when we tried loading all the records via datatables. We tried changing values on .htaccess, still, the applicaation would load a maximum of 100 rows, anything above that, it would break. We then asked service provider to increase the "server memory". It was a complete upgrade of our server from e.g a 2GB Ram server to a 4GB ram server. That resolved this issue. So, in some instances, Datatables use a lot of server resources depending on how you implemented them. – Charles Jan 27 '23 at 15:18
  • What's your returned JSON **response** size from `yajra/laravel-datatables-oracle`? – steven7mwesigwa Jan 27 '23 at 15:19
  • Hi @Charles, thanks for your clarification. One question though: how is that solution working now? I mean, as the database records increase in the future, do I need to add more RAM as well? – rifqy abdl Jan 27 '23 at 16:18
  • Hi @steven7mwesigwa. I'm using Firefox, and looking at the DevTools Network tab, the "size" is 17.78 kB for 10 records, and 170.90 kB for supposedly >1100 records (only 95 and a "half" were actually sent this time). – rifqy abdl Jan 27 '23 at 16:22
  • 1
    @rifqyabdl, the solution now works well. We also have chart.js running on the application and that also uses a lot of resources. I just saw your response regarding the size of the Json. It is very small. Before you ask your service provider to bump up your server resources, please allow other answers to be submitted as you shouldn't be facing such a challenge if your json response is that small. – Charles Jan 27 '23 at 16:31
  • 2
    Just an observation: If you're going to use `serverSide: true`, that should be because you have too much data for the browser and for datatables to handle in a single request. If you're going to allow the users to select "All" for the page length, then that contradicts the primary (typical) reason for needing to use `serverSide: true`. I suppose you could argue that you are relying on the users rarely choosing "all" - but still, that's not really the reason why server-side processing was created in DataTables. – andrewJames Jan 27 '23 at 16:43
  • Thanks for the suggestion, @Charles. But I'm running out of ideas right now, so maybe I'll try it just in case it works. Regarding the response size, I believe 170 kB is really just the size of the 95 records that were loaded, because it used to take at least 5 seconds to load "All" of the records. I never paid any attention to it, but it must be more than 1 MB. Do you still remember how big the response size was in your case? – rifqy abdl Jan 27 '23 at 16:48
  • 1
    Hi @andrewJames, thank you for your concern. I'm aware of that, but it was necessary as the client is asking for it for a couple of reason. I'm trying to find another solution for the matter as of now. In another hand, I simpy can't help myself but to find the root cause of this error as it really irritates me. – rifqy abdl Jan 27 '23 at 16:56
  • 1
    If 95 records amount to 170 kB, then 1100 records would amount to approximately 2 MB. Try bumping up your `php.ini` configuration file's *"output buffering value"* to something like 3 Megabytes. I.E: `output_buffering=3145728` – steven7mwesigwa Jan 27 '23 at 19:17
  • 1
    Hi @steven7mwesigwa, thank you for the suggestion. Will definitely try it out (probably on next monday). – rifqy abdl Jan 28 '23 at 03:31
  • Hi @steven7mwesigwa and Charles. Late update: today, somehow it's working just fine. Nobody has done anything yet, and it somehow works like before. Big thanks for your help. I'll implement your suggestions next time if the error comes back. – rifqy abdl Jan 30 '23 at 09:28
  • Hi @steven7mwesigwa. The error appears again today, and I've tried to change the `output_buffering` value, even to `1` (unlimited), but no luck. Any other ideas? – rifqy abdl Jan 31 '23 at 05:31
  • Hi @Charles. It turns out the server has 32 GB of RAM, and usage is very low. Do you have another suggestion? – rifqy abdl Jan 31 '23 at 05:34
  • 1
    @rifqyabdl The [`output_buffering` configuration](https://github.com/php/php-src/blob/b9cd1cdb4f236b7e336b688b16d58a913f4d5c69/php.ini-production#L223) value is measured in `bytes`. Setting it to `1` **doesn't** mean *'unlimited'*.‍♂️ – steven7mwesigwa Jan 31 '23 at 07:21
  • @steven7mwesigwa haha, my bad for not reading [this article](https://globedrill.com/how-to-enable-disable-php-output-buffering-linux/) carefully. Still, I've tried to set it to `3145728` as you suggested and `10485760` (10 MB) as well, but the error persists. – rifqy abdl Jan 31 '23 at 07:56
  • @rifqyabdl Could you please [edit your question](https://stackoverflow.com/posts/75259821/edit) and share how you utilized the `yajra/laravel-datatables-oracle` library? That would provide more insight into the issue. – steven7mwesigwa Jan 31 '23 at 08:48
  • @rifqyabdl In addition, could you try sending all the `1100` records to the front-end **without** using the `yajra/laravel-datatables-oracle` library just to confirm if it still returns a truncated JSON response? Take a look at [JSON Responses](https://laravel.com/docs/9.x/responses#json-responses) for how to go about that. – steven7mwesigwa Jan 31 '23 at 08:55
  • @rifqyabdl, 32 GB of RAM is more than enough. That means memory is not the issue. Have you tried the other suggestions from the other devs here? – Charles Jan 31 '23 at 09:26
  • @steven7mwesigwa I've updated the question. Say if you need to see any other things. – rifqy abdl Jan 31 '23 at 10:44
  • Yes @Charles, I have. But still no luck. If any, please share of what you have in mind. – rifqy abdl Jan 31 '23 at 10:45
  • 1
    Will definitely share. – Charles Jan 31 '23 at 11:02
  • 1
    You can tell from the number of comments in this question that debugging this type of problem via a Stack Overflow question is very inefficient - and may well fail. S.O. is not really geared for this, unfortunately. If you cannot reliably recreate the issue, then you cannot show us how to reliably recreate the issue - and therefore there will be just a lot of guesswork... _try this, try that_. **You may get lucky.** I recommend: Starting from no code. Write the least amount of code which _can_ generate the problem. Don't write the code you need as your client-facing solution. Start afresh. – andrewJames Jan 31 '23 at 13:15
  • Also, take a much closer look at the data. (I'm not saying you haven't done that already, but...). – andrewJames Jan 31 '23 at 13:17
  • Hi @andrewJames. Thank you for the suggestion, and I'll definitely keep that in mind. My number one option is to rewrite the code, but I'll try to make the error reproducible (if possible) to find its root cause. Also, the content of the data example I wrote is faked on purpose, as I only want to show its structure - so I don't really understand what you mean by "take a closer look". Would you explain it? – rifqy abdl Jan 31 '23 at 13:54
  • 1
    "closer look" - Sorry - to clarify: It can sometimes happen that there is a problem in production data, which causes unexpected issues. That problematic data may be rare (making the problem seem to be intermittent/unpredictable. In the past (but not with this type of JSON) I have tracked down such problems by chopping the data set in half, then re-testing each half separately - and throwing away the half which does not cause the error - and repeat until I have "binary chopped" my way to the one bad record. But... this is more guesswork. Just something to bear in mind. It really may not apply. – andrewJames Jan 31 '23 at 14:13
  • 1
    @andrewJames that is the one of the initial suspect that I had. But I've checked the data and it's all structurally good and normal. Also, if I try to open the table with 100 items per page, all the 12 pages (total is <1200 records) can be viewed without any error. – rifqy abdl Feb 01 '23 at 04:09
