
Below is the HTML, which loads 5,000 records; the export works perfectly fine. However, when the record count is increased to 16,000, every export fails with a network failure error, and no error is shown in the console. I am not sure of the reason. Tested in Chrome.

<html>

<head>
  <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.css" rel="stylesheet" />

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/extensions/export/bootstrap-table-export.min.js"></script>
</head>

<body>
  <table data-toggle="table" data-search="true" data-show-refresh="true" data-show-toggle="true" data-show-columns="true" data-show-export="true" data-minimum-count-columns="2" data-show-pagination-switch="true" data-pagination="true" data-id-field="id"
    data-page-list="[10, 25, 50, 100, ALL]" data-show-footer="false" data-side-pagination="client" data-url="https://jsonplaceholder.typicode.com/photos">
    <thead>
      <tr>
        <th data-field="id">Id</th>
        <th data-field="title">Title</th>
        <th data-field="url">URL</th>
        <th data-field="thumbnailUrl">Thumbnail URL</th>
      </tr>
    </thead>
  </table>
</body>

</html>

With more than 15,000 records (the same markup, pointed at a larger dataset):

<html>

<head>
  <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.css" rel="stylesheet" />

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
  <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/bootstrap-table.min.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/bootstrap-table/1.11.1/extensions/export/bootstrap-table-export.min.js"></script>
</head>

<body>
  <table data-toggle="table" data-search="true" data-show-refresh="true" data-show-toggle="true" data-show-columns="true" data-show-export="true" data-minimum-count-columns="2" data-show-pagination-switch="true" data-pagination="true" data-id-field="id"
    data-page-list="[10, 25, 50, 100, ALL]" data-show-footer="false" data-side-pagination="client" data-url="https://fd-files-production.s3.amazonaws.com/226483/16h4Vwxe1Wz9PZ5Gublomg?X-Amz-Expires=300&X-Amz-Date=20170906T130107Z&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIA2QBI5WP5HA3ZEA/20170906/us-east-1/s3/aws4_request&X-Amz-SignedHeaders=host&X-Amz-Signature=5d705bfd19579c8a93ff81ee076363b2f36d1f5e4540b85f7c86de7643c17055">
    <thead>
      <tr>
        <th data-field="id">Id</th>
        <th data-field="title">Title</th>
        <th data-field="url">URL</th>
        <th data-field="thumbnailUrl">Thumbnail URL</th>
      </tr>
    </thead>
  </table>
</body>

</html>
  • check this https://stackoverflow.com/questions/19401638/export-as-xls-file-not-work-when-large-data – RaJesh RiJo Aug 31 '17 at 16:56
  • This seems to be missing things from the question. For example, what do you mean by Bootstrap export options? What records are you talking about? I only see some HTML with some CSS and JavaScript loaded and an empty HTML table. – Jake Wilson Aug 31 '17 at 19:26
  • No, it has 5,000 records; you didn't notice `data-url="https://jsonplaceholder.typicode.com/photos"`. When you add more JSON data, i.e. 16,000 records, the export ends with a network failure error. – Kathir Sep 01 '17 at 01:22
  • Seems like a memory problem. If you have access to the server's settings, you may try tweaking memory and file size limits. The other obvious options are to create your exported file in several steps (*paginate* the export, if you will; maybe in increments of 5,000) or use a swap file on the server. Maybe you could optimize this by exporting from the database instead of the view? – Sumi Straessle Sep 04 '17 at 22:56
  • It is running on a local computer with static data and static HTML. I have enough memory: 8 GB. – Kathir Sep 05 '17 at 02:17
  • @Kathir Please provide a URL fit to be placed in `data-url="..."` that reproduces the issue you are reporting. The current URL in `data-url` does not reproduce the problem. – Louis Sep 05 '17 at 11:01
  • I checked this in Chrome and Firefox; it seems fine. Can't reproduce the problem. – TheChetan Sep 05 '17 at 11:05
  • Louis, you can use any data-url; I just put that one as a reference. Anything with 16,000 records will fail. – Kathir Sep 05 '17 at 17:20
  • @Kathir It is up to *you* to provide the conditions that reproduce the problem. Not up to readers to fill in the blanks. – Louis Sep 05 '17 at 23:38
  • @Kathir The server for your > 15,000 example responds with a 403 (Forbidden) status code. – Louis Sep 06 '17 at 15:23
  • @Kathir This site has a service for serving JSON files: https://my-json-server.typicode.com/ The front page has instructions on how to create a GitHub repo to serve JSON data. – Louis Sep 07 '17 at 09:50
  • I am not sure how to do it; I failed. Can you please do it? I tried several times. – Kathir Sep 10 '17 at 02:48
  • However, I have uploaded the file to https://ufile.io/2pzyd for your reference. – Kathir Sep 10 '17 at 02:50
  • @Kathir The data URL returns a 403 Forbidden. – TheChetan May 30 '18 at 17:26
  • @Kathir I think the main problem is with the tableExport plugin; the code has a lot of loops in it. It seriously is not fit for a large dataset. Can you tell me what type of output you are expecting? JSON and CSV should be pretty simple. If you want, I can post that as an answer. – karthick May 30 '18 at 21:20
  • As Louis already pointed out, the problem is not the number of items, but simply a different source of data: AWS S3 just blocks access. Provide two links to your AWS data, one with even 100 rows and another with 16K or whatever you want, but provide links that work rather than return `permission denied`. – Alex May 31 '18 at 16:50
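
To address the reproducibility questions in the comments without depending on an expiring S3 link, the same client-side setup can be driven from locally generated data. This is only a sketch, assuming the question's <head> includes; the table id, the 16,000-row count, and the dummy row values are made up.

<table id="repro-table" data-pagination="true" data-search="true"
  data-page-list="[10, 25, 50, 100, ALL]" data-show-export="true">
  <thead>
    <tr>
      <th data-field="id">Id</th>
      <th data-field="title">Title</th>
      <th data-field="url">URL</th>
      <th data-field="thumbnailUrl">Thumbnail URL</th>
    </tr>
  </thead>
</table>

<script>
  // Build 16,000 dummy rows and initialise bootstrap-table with them.
  var rows = [];
  for (var i = 1; i <= 16000; i++) {
    rows.push({
      id: i,
      title: 'title ' + i,
      url: 'https://example.com/photos/' + i,
      thumbnailUrl: 'https://example.com/thumbnails/' + i
    });
  }
  $(function () {
    $('#repro-table').bootstrapTable({ data: rows });
  });
</script>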

2 Answers


Try doing the following:

1.) Download the library files instead of using a CDN.

2.) Increase the page time-out on your AWS server. It's possible that you don't have enough time to process all those records.

3.) It's possible that you're hitting a client-side restriction, such as javascript.options.mem.max being capped at 128 MB (16k records may hit that).

4.) Try another server. There might be restrictions on AWS that you can't control (e.g. memory or "time-to-live" for your connection), but if you set up your own personal dedicated server for testing, you could rule that out.

5.) Disable your "ALL" option. Do you really want people to pull 16k records at once?

6.) As a last resort, try making a server-side pagination script (a client-side sketch follows this list).
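
A minimal client-side sketch of points 5 and 6, assuming a hypothetical /api/photos endpoint (the endpoint name and page sizes are made up): with data-side-pagination="server", bootstrap-table requests one page at a time and expects the server to reply with a JSON object containing total and rows, so the browser never holds all 16,000 records at once.

<!-- Server-side paging sketch; /api/photos is a placeholder endpoint.
     With data-side-pagination="server", bootstrap-table sends limit, offset,
     search, sort and order as query parameters and expects a response of the
     form { "total": <count>, "rows": [ ... ] }. The "ALL" page size has been
     dropped, per point 5. -->
<table data-toggle="table"
  data-url="/api/photos"
  data-pagination="true"
  data-side-pagination="server"
  data-page-size="100"
  data-page-list="[10, 25, 50, 100]"
  data-show-export="true">
  <thead>
    <tr>
      <th data-field="id">Id</th>
      <th data-field="title">Title</th>
      <th data-field="url">URL</th>
      <th data-field="thumbnailUrl">Thumbnail URL</th>
    </tr>
  </thead>
</table>

With server-side paging, the export button also only sees the rows currently loaded in the page, which keeps the exported file small.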

– Ryan Battistone

This looks to be a problem with the S3 request expiration:

<?xml version="1.0" encoding="UTF-8"?>
<Error>
   <Code>AccessDenied</Code>
   <Message>Request has expired</Message>
   <X-Amz-Expires>300</X-Amz-Expires>
   <Expires>2017-09-06T13:06:07Z</Expires>
   <ServerTime>2018-06-02T00:00:15Z</ServerTime>
   <RequestId>396C37F87B33C933</RequestId>
   <HostId>pg4uY75WW5p07yvAtqhEFvvKi0FreyHlNo/gJ329aRYHP9/KgzkVxRVkH4lZkwPtw7bLET+HPl8=</HostId>
</Error>
– ic3b3rg
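
If the file must stay on S3, a fresh presigned URL with a longer expiry avoids the expiration shown above. A minimal sketch with the AWS SDK for JavaScript v2, reusing the bucket and key from the URL in the question (the region is taken from the signed URL's credential scope; the expiry value is an assumption):

// Sketch only: regenerate the presigned URL so it does not expire after
// 300 seconds like the one in the question.
const AWS = require('aws-sdk');                 // AWS SDK for JavaScript v2
const s3 = new AWS.S3({ region: 'us-east-1' });

const url = s3.getSignedUrl('getObject', {
  Bucket: 'fd-files-production',                // bucket from the question's URL
  Key: '226483/16h4Vwxe1Wz9PZ5Gublomg',         // key from the question's URL
  Expires: 3600                                 // seconds; the failing URL used 300
});

console.log(url); // paste the result into data-url="..."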