
My application generates "score" values for a particular use case. These scores can fall anywhere in the range of 0-120, but most cluster in the range of 60-95.

I currently have a stat chart of bucketed counts, e.g., 0, 1-12, 13-24, 25-36, ..., 97-108, and 109+.

I'd like to instead create a percentile chart with time-series lines showing percentile scores in increments of 10%, i.e., a 10% score line, a 20% score line, and so on up to a 90% score line.

Is that even possible? How do I do that, beginning with recording the stat using OpenCensus Java?

jacob

1 Answer


Cloud Monitoring doesn't really have the ability to calculate percentiles at display time the way you're looking for. You can use OpenCensus to write a distribution metric with buckets, and you can then query the bucket boundaries and counts. Here's an example:

https://cloud.google.com/solutions/identifying-causes-of-app-latency-with-stackdriver-and-opencensus
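For the recording side specifically, here's a minimal sketch of what that could look like with the OpenCensus Java stats API and the Stackdriver stats exporter. The measure name, view name, and bucket boundaries below are placeholders I picked to cover the 0-120 range, not something the question or that article prescribes:

```java
import io.opencensus.exporter.stats.stackdriver.StackdriverStatsExporter;
import io.opencensus.stats.Aggregation;
import io.opencensus.stats.BucketBoundaries;
import io.opencensus.stats.Measure.MeasureDouble;
import io.opencensus.stats.Stats;
import io.opencensus.stats.StatsRecorder;
import io.opencensus.stats.View;
import io.opencensus.stats.ViewManager;
import io.opencensus.tags.TagKey;

import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;

public class ScoreStats {
  // Placeholder measure name/description/unit - adjust to your application.
  private static final MeasureDouble SCORE =
      MeasureDouble.create("my_app/score", "Application score", "1");

  private static final StatsRecorder STATS_RECORDER = Stats.getStatsRecorder();

  public static void setup() throws IOException {
    // Explicit bucket boundaries covering 0-120, with finer buckets around
    // 60-95 where most scores cluster; Monitoring can only estimate
    // percentiles as precisely as these boundaries allow.
    Aggregation scoreDistribution = Aggregation.Distribution.create(
        BucketBoundaries.create(Arrays.asList(
            0.0, 20.0, 40.0, 60.0, 65.0, 70.0, 75.0, 80.0,
            85.0, 90.0, 95.0, 100.0, 110.0, 120.0)));

    View scoreView = View.create(
        View.Name.create("my_app/score_distribution"),
        "Distribution of application scores",
        SCORE,
        scoreDistribution,
        Collections.<TagKey>emptyList());

    ViewManager viewManager = Stats.getViewManager();
    viewManager.registerView(scoreView);

    // Exports registered views to Cloud Monitoring as distribution metrics.
    StackdriverStatsExporter.createAndRegister();
  }

  public static void recordScore(double score) {
    STATS_RECORDER.newMeasureMap().put(SCORE, score).record();
  }
}
```

Whatever percentiles Monitoring later estimates are only as fine-grained as the bucket boundaries you register here, which is why the sketch uses denser buckets around the 60-95 range.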

Specifically, I'm quoting from the Accuracy section of that article:

Monitoring computes latency percentiles for distribution metrics based on bucket boundaries at numeric intervals. This is the common method used by Monitoring and OpenCensus, where OpenCensus represents the metrics data and exports it to Monitoring. The TimeSeries.list method of the Cloud Monitoring API returns the bucket counts and boundaries for your project and metric types. You can retrieve the bucket boundaries in the Cloud Monitoring API BucketOptions object, which you can experiment with in the API Explorer for TimeSeries.list.
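If you really want percentile lines at exact 10% increments, one option is to pull the raw bucket data yourself via timeSeries.list and interpolate from the boundaries and counts. Here's a rough sketch using the Java client library for the Monitoring API; the project ID is a placeholder, and the metric type assumes the exporter's default custom.googleapis.com/opencensus/ prefix for the view name used above:

```java
import com.google.cloud.monitoring.v3.MetricServiceClient;
import com.google.monitoring.v3.ListTimeSeriesRequest;
import com.google.monitoring.v3.Point;
import com.google.monitoring.v3.ProjectName;
import com.google.monitoring.v3.TimeInterval;
import com.google.monitoring.v3.TimeSeries;
import com.google.protobuf.util.Timestamps;

public class ListScoreDistribution {
  public static void main(String[] args) throws Exception {
    String projectId = "my-project";  // placeholder project ID

    try (MetricServiceClient client = MetricServiceClient.create()) {
      long nowMillis = System.currentTimeMillis();
      TimeInterval interval = TimeInterval.newBuilder()
          .setStartTime(Timestamps.fromMillis(nowMillis - 3_600_000L))  // last hour
          .setEndTime(Timestamps.fromMillis(nowMillis))
          .build();

      ListTimeSeriesRequest request = ListTimeSeriesRequest.newBuilder()
          .setName(ProjectName.of(projectId).toString())
          // Metric type the OpenCensus Stackdriver exporter writes the view to;
          // adjust if you configured a different metric name prefix.
          .setFilter("metric.type=\"custom.googleapis.com/opencensus/my_app/score_distribution\"")
          .setInterval(interval)
          .setView(ListTimeSeriesRequest.TimeSeriesView.FULL)
          .build();

      for (TimeSeries series : client.listTimeSeries(request).iterateAll()) {
        for (Point point : series.getPointsList()) {
          // Each point's distribution value carries the bucket boundaries
          // (BucketOptions) and the per-bucket counts you'd interpolate over.
          System.out.println(point.getValue().getDistributionValue().getBucketOptions());
          System.out.println(point.getValue().getDistributionValue().getBucketCountsList());
        }
      }
    }
  }
}
```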

Yuri Grinshteyn