
My case is the following:

I am using Google Charts in my web application to present some data in charts. The number of data points I present varies from a few thousand to a few hundred thousand.

I observed that beyond a certain number of data points (~10,000) the Google Charts rendering script (the one that is downloaded from Google) starts becoming unresponsive. As I increase the data, the script fails to build the charts and my browser hangs. If I do something else with the same data that does not involve Google Charts, the browser behaves as expected.

My computer has 8 GB of RAM, 4 GB of which were free while I was running those test cases.

So, is there any limit on the amount of data that the Google Charts API can handle? Or is this a browser/computer-specific problem?
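
For reference, here is a minimal sketch of roughly what I am doing; the container id and the synthetic data are placeholders, my real data comes from the server:

```
<div id="chart_div"></div>
<script src="https://www.gstatic.com/charts/loader.js"></script>
<script>
  // Minimal sketch: draw a line chart with N points. N, the container
  // id and the generated values are placeholders, not my real data.
  google.charts.load('current', { packages: ['corechart'] });
  google.charts.setOnLoadCallback(function () {
    var N = 10000; // around this size the page starts to hang
    var rows = [];
    for (var i = 0; i < N; i++) {
      rows.push([i, Math.sin(i / 100)]);
    }
    var data = new google.visualization.DataTable();
    data.addColumn('number', 'X');
    data.addColumn('number', 'Y');
    data.addRows(rows);
    new google.visualization.LineChart(document.getElementById('chart_div'))
      .draw(data, { legend: 'none' });
  });
</script>
```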

This question is not a duplicate of: Google Chart Line, limit on the size or length of the array or data

Athafoud

2 Answers


I saw a question similar to this on one of the "Google Visualization API" forums:

Google Visualization API › What is the data size limitations

There is no limit on the size of the data; it is how much drawing is needed to represent that data that is the issue.
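
If drawing volume is the bottleneck, cutting the per-point work may help. This is my own sketch, not something from that forum thread; the options below are standard LineChart options, and the actual gains are an assumption worth measuring on your data:

```
// Sketch: standard LineChart options that reduce per-point drawing
// and interaction work. Measure the effect on your own data.
var options = {
  enableInteractivity: false, // skip per-point hover/tooltip tracking
  pointSize: 0,               // draw the line only, no point markers
  curveType: 'none',          // no curve smoothing
  legend: 'none'
};
chart.draw(data, options);
```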

danabnormal
  • Nice catch! But I am still trying to find something more concrete, in order to identify in advance whether the charts will 'crash'. – Athafoud May 20 '15 at 15:47
  • Out of interest, have you tried the data with different chart types? I had a stacked bar chart which overloaded the browser, as it was many, many rectangles being drawn to visualize a year's worth of data. Switching to an Area chart saw a great improvement in responsiveness. – danabnormal May 20 '15 at 16:50
  • Since it is time-series data, it is essential for me to use line charts. I have also used an Area chart, but not with that amount of data. I can try it out to see if it works (a sketch of the switch is below)! – Athafoud May 20 '15 at 17:19
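
A sketch of the chart-type switch suggested in the comments above; the same DataTable is reused and only the constructor changes (chart_div and data are assumed from the surrounding code):

```
// Sketch: swap the chart class, keep the DataTable. AreaChart is the
// type reported above as more responsive than a stacked bar chart.
var chart = new google.visualization.AreaChart(
    document.getElementById('chart_div'));
chart.draw(data, { legend: 'none' });
```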

I am currently testing this. I am at 61,000 points on a line chart, loaded in 7 seconds, but I am looking to make my code as efficient as possible. The data also needs to be downloaded, so caching is important. My aim is to be able to hold 500,000 to 1,000,000 points and have them download reasonably quickly.

I think the limit depends on the server, the client PC, and the code that makes it all happen. If you have code that loops all over the place, or if you use PHP to pass the data to the client, you can reach your limit fast.

I have found that reading the .json file directly greatly increases the limit; a rough sketch of what I mean is below. PHP died at around 30,000 points.
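
Roughly what I mean by reading the .json directly; the file name, the [x, y] row layout, and the caching choice here are assumptions, not my exact code:

```
// Sketch: fetch a pre-generated .json file straight from the server
// instead of building the payload in PHP on every request. The file
// name and the row format ([x, y] pairs) are assumptions.
fetch('points.json', { cache: 'force-cache' }) // let the browser cache it
  .then(function (resp) { return resp.json(); })
  .then(function (rows) {
    rows.unshift(['X', 'Y']); // arrayToDataTable expects a header row
    var data = google.visualization.arrayToDataTable(rows);
    new google.visualization.LineChart(document.getElementById('chart_div'))
      .draw(data, { legend: 'none' });
  });
```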

PMax
  • Overnight testing led me to find that my limit is really 50k rows with 10 columns before refreshing the chart can cause the code to crash. Over 50k the chart is also laggy on my laptop (i7), but I need it to run on mobile, so I will cap below 50k. I did manage to render 150k points and could have gone further, but as I require the chart to refresh, displaying and re-rendering another chart just didn't play ball. I am now focusing on filtering the data to remove results that are the same over and over (a sketch of that idea is below). – PMax Jan 19 '19 at 10:55
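
A minimal sketch of that filtering idea, assuming rows of [x, y] pairs; the "same as previous" rule here is the simplest possible variant:

```
// Sketch: keep a row only when its value differs from the previously
// kept row, dropping runs of identical readings. Row layout is assumed.
function dropRepeats(rows) {
  var kept = [];
  for (var i = 0; i < rows.length; i++) {
    if (kept.length === 0 || rows[i][1] !== kept[kept.length - 1][1]) {
      kept.push(rows[i]);
    }
  }
  return kept;
}
```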