
I want to export a large dataset from a MySQL database to an Excel file in PHP, in such a way that the file downloads directly and the progress is shown on the client side while it is being generated. The HTTP response should not have to wait for the query to finish and the Excel file to be built on the server, because if that takes too long the request times out.

I want the whole thing to happen on the fly, as client-side download progress, until the download reaches 100%. Is that possible or not? How can I implement this concept?

  • Try a library that does it ... JavaScript to CSV https://github.com/mholt/PapaParse/issues/175 – JonoJames Mar 24 '22 at 02:28
  • @JonoJames I'm sorry, but there is no example code on that site for what I want. I need example code in PHP, specifically PHP CodeIgniter, showing an example query to MySQL. – Adam Satria Gunadi Mar 24 '22 at 02:34
  • No probs I dropped a little sample of what I use ... no libraries needed – JonoJames Mar 24 '22 at 02:36

3 Answers

1

You could fetch the data into the browser, then use a little JavaScript to put it into a CSV file dynamically.

const data = [
  ["rahul", "delhi", "accounts dept"],
  ["rajeev", "UP", "sales dept"]
];

let csvContent = "data:text/csv;charset=utf-8,";

data.forEach(function(rowArray) {
  let row = rowArray.join(",");
  csvContent += row + "\r\n";
});

var encodedUri = encodeURI(csvContent);
window.open(encodedUri);

So you would take your PHP MySQL result array and

json_encode($result);

Then, once you have fetched it in JavaScript using XHR / fetch (AJAX), parse it with

JSON.parse(result);

Then use the code above ...
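For large result sets a `data:` URI can hit browser URL-length limits, so a Blob-based variant of the code above is safer. A sketch: the quoting rule is standard CSV escaping, and `downloadCsv` with its file name argument is illustrative, not an existing helper:

```javascript
// quote a field if it contains a comma, quote, or newline (standard CSV escaping)
function csvEscape(field) {
  const s = String(field);
  return /[",\r\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
}

function buildCsv(rows) {
  return rows.map(r => r.map(csvEscape).join(",")).join("\r\n") + "\r\n";
}

// browser-side download via a Blob and a temporary object URL,
// which avoids the length limit of data: URIs
function downloadCsv(rows, filename) {
  const blob = new Blob([buildCsv(rows)], { type: "text/csv;charset=utf-8" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}

// usage: downloadCsv([["rahul", "delhi"], ["rajeev", "UP"]], "export.csv");
```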

JonoJames
  • 1,117
  • 8
  • 22
  • It needs to fetch the data to the client side first? No, that's not what I want. What about 50,000 rows? As I said in the post, I need the query and the Excel generation to happen during the download progress, in parallel, until the download completes. – Adam Satria Gunadi Mar 24 '22 at 02:39
  • The first problem you will have with that is XLS files are different depending on what version of Excel the client is using ... The second problem is you would have to generate the file on the server first .... imagine ten clients wanting that data at the same time ... then you would need a dynamic URL sent to the PHP page for the download ... Trust me, send the data via JSON as it's fast, and let his machine do the conversion to CSV (Excel lite) – JonoJames Mar 24 '22 at 02:55
  • Yes, but that concept is already familiar to me: query first until the process completes, and only then send the response to the client. What if the query covers 1,000,000 rows? The HTTP response will time out before the query completes. That's why I'm asking whether this is possible or not. But thanks for your answer. – Adam Satria Gunadi Mar 24 '22 at 03:06
  • Yes, for sure a PHP page will hang his browser waiting for that data and time out ... but an asynchronous fetch request from AJAX can give him a loading bar etc. that won't hang his machine ... when the request is finished and the data can be downloaded, the file starts downloading automatically and quickly, because the data is already on his machine ... there's no timeout on asynchronous XHR requests, because when the server is ready and has sent the data, it resolves a promise to tell the browser the process is complete. You can send a lot more than 10,000,000,000 records this way ... – JonoJames Mar 24 '22 at 03:11
  • "but an asynchronous fetch request from ajax can give him loading bar etc that wont hang his machine" << Maybe I need example code for this, with a loading bar, making sure the HTTP response doesn't time out. – Adam Satria Gunadi Mar 24 '22 at 03:17
  • Ok, I'll post another answer with the async XHR request – JonoJames Mar 24 '22 at 03:20
  • I have the same issue. PhpExcel stores the whole file in memory before saving it. So this solution as a concept is the best if you fetch the data from the FE in chunks until you finish. The xls will be stored only on the client side before being saved. – marius-ciclistu Apr 23 '23 at 19:00
  • ExcelJs can stream data from browser memory to disk. – marius-ciclistu May 06 '23 at 22:10
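The chunked approach described in the comments above could be sketched as follows. The endpoint name `export.php` and its `?offset=&limit=` / `{ rows, done }` contract are assumptions for illustration, not an existing API; each small request finishes well within any timeout, and the progress callback fires as rows arrive:

```javascript
// fetch the full result set in pages so no single request runs long enough
// to time out; onProgress receives the number of rows received so far
async function fetchAllRows(baseUrl, pageSize, onProgress) {
  const all = [];
  let offset = 0;
  for (;;) {
    // hypothetical endpoint contract: ?offset=&limit= returns { rows: [...], done: bool }
    const res = await fetch(`${baseUrl}?offset=${offset}&limit=${pageSize}`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const page = await res.json();
    all.push(...page.rows);
    offset += page.rows.length;
    if (onProgress) onProgress(offset); // rows received so far
    if (page.done || page.rows.length === 0) break;
  }
  return all;
}

// usage:
// fetchAllRows("/export.php", 5000, n => console.log(n + " rows loaded"))
//   .then(rows => console.log("finished", rows.length));
```

On the PHP side this only requires adding `LIMIT ?, ?` to the query, so the server never builds the whole export in one request.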
0

You can use tableToExcel.js to export a table to an Excel file. It works in the following way:

1) Include this CDN script in your project/file:

<script src="https://cdn.jsdelivr.net/gh/linways/table-to-excel@v1.0.4/dist/tableToExcel.js"></script>

2) Then trigger the export using JavaScript:

<button id="btnExport" onclick="exportReportToExcel(this)">EXPORT REPORT</button>

function exportReportToExcel() {
  let table = document.getElementsByTagName("table"); // or use document.getElementById('tableId') by giving the table an id
  TableToExcel.convert(table[0], { // the page may contain multiple tables; this refers to the first one
    name: `export.xlsx`, // file name; you can use any name
    sheet: {
      name: 'Sheet 1' // sheet name
    }
  });
}

In this way, you can download directly in Excel format, which is a fast process.
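Since the data comes from MySQL as JSON rather than an existing HTML table, a small helper can build the table markup the library converts. A sketch; the `exportTable` id and `rowsToTableHtml` name are illustrative:

```javascript
// build the <table> markup that tableToExcel.js expects from a plain rows array
function rowsToTableHtml(headers, rows) {
  // minimal HTML escaping so cell values cannot break the markup
  const esc = v => String(v).replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
  const tr = (cells, tag) => "<tr>" + cells.map(c => "<" + tag + ">" + esc(c) + "</" + tag + ">").join("") + "</tr>";
  return '<table id="exportTable">' +
    "<thead>" + tr(headers, "th") + "</thead>" +
    "<tbody>" + rows.map(r => tr(r, "td")).join("") + "</tbody>" +
    "</table>";
}

// usage in the browser (assumes the CDN script above is loaded):
// document.getElementById("holder").innerHTML =
//   rowsToTableHtml(["name", "city"], [["rahul", "delhi"], ["rajeev", "UP"]]);
// TableToExcel.convert(document.getElementById("exportTable"), { name: "export.xlsx" });
```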

Babo
  • 357
  • 3
  • 13
  • I'm sorry, but it's still the same: it's basically an Excel export, which I'm already familiar with. The concept I want is like STREAMING: imagine watching a video file that isn't fully downloaded yet, but buffering continues until it completes. Or downloading a 2 GB file and waiting for the progress to go from 1% to 100%. I want the database query and the Excel generation to happen during that download progress, from 1% through 10% until 100% completed. – Adam Satria Gunadi Mar 24 '22 at 03:00
0

// replace with your own function ...
var i = 0;
function move() {
  if (i == 0) {
    i = 1;
    var elem = document.getElementById("myBar");
    var width = 1;
    var id = setInterval(frame, 10);
    function frame() {
      if (width >= 100) {
        clearInterval(id);
        i = 0;
      } else {
        width++;
        elem.style.width = width + "%";
      }
    }
  }
}

/* CSS */
#myProgress {
  width: 100%;
  background-color: #ddd;
}

#myBar {
  width: 1%;
  height: 30px;
  background-color: #04AA6D;
}

<!-- HTML -->
<h1>JavaScript Progress Bar</h1>

<div id="myProgress">
  <div id="myBar"></div>
</div>

<br>
<button onclick="move()">Click Me</button>
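The bar above animates on a fixed timer. To drive it from real download progress instead, the response body can be read as a stream. A sketch: it assumes the PHP endpoint sends a Content-Length header; without it the total size is unknown and a percentage cannot be shown:

```javascript
// read the response body chunk by chunk and report percentage progress
async function downloadWithProgress(url, onProgress) {
  const res = await fetch(url);
  const total = Number(res.headers.get("Content-Length")) || 0;
  const reader = res.body.getReader();
  const chunks = [];
  let received = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    received += value.length;
    if (total) onProgress(Math.round((received / total) * 100));
  }
  return new Blob(chunks); // hand to an <a download> link via URL.createObjectURL
}

// usage:
// downloadWithProgress("/export.php", p => {
//   document.getElementById("myBar").style.width = p + "%";
// });
```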

Here is how you do an async request:

function produceCSV(data) {
  let csvContent = "data:text/csv;charset=utf-8,";

  // data arrives already parsed (res.json() below does the JSON.parse),
  // so it can be iterated directly
  data.forEach(function(rowArray) {
    let row = rowArray.join(",");
    csvContent += row + "\r\n";
  });

  var encodedUri = encodeURI(csvContent);
  window.open(encodedUri);
}

// simple async fetch request
fetch('http://time.jsontest.com/mydatafile.php')
  .then(res => res.json())
  .then((data) => {
    console.log(data);
    produceCSV(data);
  })
  .catch(err => console.error(err));

Just make sure your php file returns json_encode($data);

... And you might want to place some headers in that PHP file ... like

<?php
// Headers
header('Access-Control-Allow-Origin: *');
header('Content-Type: application/json');
header('Access-Control-Allow-Methods: GET, POST, OPTIONS');
header('Access-Control-Allow-Headers: Access-Control-Allow-Headers, Content-Type, Access-Control-Allow-Methods, Authorization, X-Requested-With');

include_once 'connection.php';
include_once 'models.php';

// MYSQL CALL
$result = mysqlQuery ....

// convert the array to a json string
$result = json_encode($result);

// echoing the data sends it to the client's browser in the HTTP response
echo $result;

If you ever need to find anything out about a JavaScript variable

console.log(yourJSvariable)

is your friend

JonoJames
  • 1,117
  • 8
  • 22
  • That fetch is like an AJAX GET request, right? I tried that with a query for 10,000,000 rows and it timed out. It's the same as trying to load 100,000,000 rows into a DataTable without pagination: it must time out. – Adam Satria Gunadi Mar 24 '22 at 04:08
  • Key is to use the fetch command as it's asynchronous .... a plain GET request is not, unless you use async/await and return a promise – JonoJames Mar 24 '22 at 04:10
  • https://levelup.gitconnected.com/async-await-vs-promises-4fe98d11038f – JonoJames Mar 24 '22 at 04:12
  • This is an AJAX request: $.get("demo_test.asp", function(data, status){ alert("Data: " + data + "\nStatus: " + status); }); So basically fetch is not the same concept as an AJAX GET request? – Adam Satria Gunadi Mar 24 '22 at 04:17
  • https://stackoverflow.com/questions/46946380/fetch-api-request-timeout I think fetch also has a timeout procedure, same as an AJAX request. And I don't know how long the query will take. – Adam Satria Gunadi Mar 24 '22 at 04:21
  • Also check this setting in php.ini ..... max_execution_time .... The max_execution_time directive defines the maximum execution time for your PHP scripts and applications. Maybe the MySQL query takes too long ... There are two types of AJAX request: synchronous and asynchronous ... – JonoJames Mar 24 '22 at 04:27
  • Synchronous, in which the script stops and waits for the server to send back a reply before continuing. Asynchronous, in which the script allows the page to continue to be processed and handles the reply if and when it arrives. – JonoJames Mar 24 '22 at 04:31
  • Key is to find out what is timing out ... if it's the PHP, use ------------- ini_set('max_execution_time', 300); ----------------- or if it's the web page, fetch should solve it ... fetch does not have a default timeout ... you can put your own timeout in if you need to ... but there is no default timeout in the specification ... – JonoJames Mar 24 '22 at 04:48
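The last comment's point, that fetch has no built-in timeout but one can be added, can be sketched with AbortController (`fetchWithTimeout` is an illustrative name, not a library function):

```javascript
// wrap fetch with an explicit timeout using AbortController;
// the request is aborted if the server has not answered within `ms`
function fetchWithTimeout(url, ms) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  return fetch(url, { signal: controller.signal })
    .finally(() => clearTimeout(timer));
}

// usage:
// fetchWithTimeout("/export.php", 30000)
//   .then(res => res.json())
//   .catch(err => console.error("timed out or failed:", err));
```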