
For some reason I cannot sync anymore after row 2039; every time I try, I get the following error (from the execution transcript):

[16-05-06 15:01:13:134 PDT] Blob.getBytes() [0.999 seconds]
[16-05-06 15:01:13:146 PDT] Blob.getContentType() [0 seconds]
[16-05-06 15:01:13:350 PDT] Execution failed: Empty response (line 30, file "Code") [8.735 seconds total runtime]

There is no log entry for this failure.

I don't think there is a problem with the data after row 2039, because I created another Google Spreadsheet and Fusion Table with the same data and it syncs just fine.

EDIT: I am not sure whether the technical limitations mentioned in this question apply to the sync process I run from Google Spreadsheet -> Script Editor.

Here is the code I use in the Script Editor:

/**
 * AppsScript script to run in a Google Spreadsheet that synchronizes its
 * contents with a Fusion Table by replacing all rows.
 */

// Replace with your Fusion Table's ID (from File > About this table)
var TABLE_ID = 'YOUR TABLE ID';

// First row that has data, as opposed to header information
var FIRST_DATA_ROW = 2;

// True means the spreadsheet and table must have the same column count
var REQUIRE_SAME_COLUMNS = true;

/**
 * Replaces all rows in the Fusion Table identified by TABLE_ID with the
 * current sheet's data, starting at FIRST_DATA_ROW.
 */
function sync() {
  var tasks = FusionTables.Task.list(TABLE_ID);
  // Only run if there are no outstanding deletions or schema changes.
  if (tasks.totalItems == 0) {
    var sheet = SpreadsheetApp.getActiveSheet();
    var wholeSheet = sheet.getRange(1, 1, sheet.getLastRow(),
        sheet.getLastColumn());
    var values = wholeSheet.getValues();
    if (values.length > 1) {
      var csvBlob = Utilities.newBlob(convertToCsv_(values),
          'application/octet-stream');
      FusionTables.Table.replaceRows(TABLE_ID, csvBlob,
         { isStrict: REQUIRE_SAME_COLUMNS, startLine: FIRST_DATA_ROW - 1 });
      Logger.log('Replaced ' + values.length + ' rows');
    }
  } else {
    Logger.log('Skipping row replacement because of ' + tasks.totalItems +
        ' active background task(s)');
  }
}


/**
 * Converts the spreadsheet values to a CSV string.
 * @param {Array} data The spreadsheet values.
 * @return {string} The CSV string.
 */
function convertToCsv_(data) {
  // See https://developers.google.com/apps-script/articles/docslist_tutorial#section3
  var csv = '';
  for (var row = 0; row < data.length; row++) {
    for (var col = 0; col < data[row].length; col++) {
      var value = data[row][col].toString();
      if (value.indexOf(',') != -1 ||
          value.indexOf('\n') != -1 ||
          value.indexOf('"') != -1) {
        // Double-quote values with commas, double quotes, or newlines
        value = '"' + value.replace(/"/g, '""') + '"';
        data[row][col] = value;
      }
    }
    // Join each row's columns and add a carriage return to the end of each
    // row except the last
    if (row < data.length - 1) {
      csv += data[row].join(',') + '\r\n';
    } else {
      csv += data[row].join(',');
    }
  }
  return csv;
}
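
To check whether I am simply hitting an upload size limit, I could log the size of the CSV blob before calling replaceRows. This is only a diagnostic sketch (the helper name logCsvSize_ is made up; it reuses convertToCsv_ from above):

/**
 * Logs the byte size of the CSV payload that sync() would upload.
 */
function logCsvSize_() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var values = sheet.getRange(1, 1, sheet.getLastRow(),
      sheet.getLastColumn()).getValues();
  var csvBlob = Utilities.newBlob(convertToCsv_(values),
      'application/octet-stream');
  // Blob.getBytes() returns the raw byte array, so its length is the
  // number of bytes replaceRows would have to upload.
  Logger.log('CSV payload size: ' + csvBlob.getBytes().length + ' bytes');
}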

Is there any way to update only certain rows or columns instead of replacing the whole spreadsheet, or to increase the limit?
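
For example, if the Fusion Tables SQL API accepts UPDATE statements by ROWID through the advanced service, something like the untested sketch below could change a single cell instead of replacing every row (the column name Name, the rowId value, and the naive quoting are all placeholders of mine):

/**
 * Untested sketch: updates one column of one row identified by ROWID.
 */
function updateSingleCell(rowId, newValue) {
  var sql = "UPDATE " + TABLE_ID +
      " SET Name = '" + newValue + "'" +
      " WHERE ROWID = '" + rowId + "'";
  FusionTables.Query.sql(sql);
}

If something along those lines works, I could update only the rows that actually changed and avoid re-uploading the whole sheet.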

  • You've asked a question that _should_ be answered in documentation - that's not really such a good question here, because how would we _know_? We could guess, but that's about it. Maybe a Google Engineer with inside knowledge will speak up - it sometimes happens. But, I think that question isn't really your problem... you probably want to know _why is this code failing this way?_ That would involve sharing a [mcve] and the necessary data to reliably reproduce the problem. – Mogsdad May 09 '16 at 15:28
  • Thanks for providing these tips @Mogsdad, I edited my question by adding the code I use to sync between Google Spreadsheet and Fusion Tables. I am honestly not sure if the technical limitations you mentioned in that link apply to this sync process. – Tirux May 12 '16 at 16:26
  • What's the size of the CSV blob you created in the sync() function? It wraps a byte array, so you can determine the length of the blob by logging the value csvBlob.getBytes().length. If that value exceeds 262,144,000 bytes (250 MB) then you've passed the quota limit, and that could be the reason why your application is failing. – TheAddonDepot Jul 23 '16 at 01:08
