
I'm running into an interesting problem with LightningChart where it seems to be corrupting or otherwise decimating my data depending on how far it is from the DateOrigin of the chart. My data is 1000 samples per second, and I am trying to display 1-2 weeks of data at a time. I am using the ChartXY class with the x-axis type set to "linear-highPrecision", which should have 1 ms accuracy; that is all I need, nothing finer. I am creating a LineSeries whose data pattern is 'ProgressiveX' with regularProgressiveStep: true.

Here's what it looks like when the data is plotted near the DateOrigin.

[screenshot: data plotted near the DateOrigin]

Here's what it looks like zoomed in on the data near the DateOrigin.

[screenshot: zoomed in on the data near the DateOrigin]

That looks fantastic! LightningChart is doing exactly what I want!

However, I would like this data to be offset correctly to show its true absolute time.

Here's what it looks like when I offset this data by 14 days. My code to set the relative offset looks like this.

ds.addArrayY(curve.data, 1, 14 * 24 * 60 * 60 * 1000) // step: 1 ms, start: +14 days

[screenshot: data offset by 14 days, zoomed out]

Ok, it looks alright zoomed out, but what if we zoom in?

[screenshot: data offset by 14 days, zoomed in]

It's gone haywire! It looks like the X values are being coerced to some larger step, and it gets worse the further the data is from the DateOrigin. My fear is that this is some built-in behavior of the engine and I am expecting too much; however, the axis claims 1 ms resolution, so I expect that to be respected.

Here's how I create the chart.

// Create an XY Chart.
const PumpsChart = lightningChart().ChartXY({
    // Set the chart into a div with id, 'target'. 
    // Chart's size will automatically adjust to div's size. 
    theme: Themes.lightGradient,
    container: 'PumpsChart',
    defaultAxisX: {
        type: 'linear-highPrecision'
    }
}).setTitle('') // Set chart title
.setTitleFont(new FontSettings({
    family: waveChartFontFamily,
    size: 20
}))
.setMouseInteractionWheelZoom(false)

axisPumpsChartDateTime = PumpsChart.getDefaultAxisX()
.setTickStrategy(
    AxisTickStrategies.DateTime, 
    (tickStrategy) => tickStrategy.setDateOrigin(waveDateOrigin))

axisPumpsChartPressurePSI = PumpsChart.getDefaultAxisY()
    .setTitle("Pressure (PSI)")
    .setInterval(0,10000,0,true)

Here's how I create the LineSeries

newDataSeries = targetChart.chart.addLineSeries(
                                    { 
                                        yAxis: targetChart.axis,
                                        dataPattern: {
                                            pattern: 'ProgressiveX',
                                            regularProgressiveStep: true,
                                        }
                                     }
                                );

Here's how I add data to the chart:

ds.addArrayY(curve.data, 1, 14 * 24 * 60 * 60 * 1000) // step: 1 ms, start: +14 days

I would prefer AxisTickStrategies.DateTime over AxisTickStrategies.Time for a few reasons: my data spans weeks and Time's 100-hour range is too little; millisecond resolution is all I need, nothing finer; and I need to present my data in absolute rather than relative time.

Hopefully there's some parameter that I'm missing that I can adjust to achieve this.

EDIT

Well, this corruption also happens with the Time tick strategy when the data is offset -636 hours relative to the origin. I tried this with and without ProgressiveX set as the DataPattern.pattern.

[screenshot: the same corruption with the Time tick strategy]

EDIT 2

Well, I even tried downsampling to 20 samples per second and changed back to AxisTickStrategies.DateTime; it's still "squishing" all the points to this magic 0.25 second interval for some reason.

[screenshot: points squished to 0.25 s intervals]

  • There's something a bit confusing here that I'd like to clear up. Especially since you have two questions that seem to be around the use case, but just with different time presentation (?) https://stackoverflow.com/questions/73657257/custom-tick-strategy-to-convert-relative-to-absolute-time-for-axistickstrategies Could you clarify what you want on this topic right here? You want to present data in RELATIVE time, meaning in same way as Time ticks display it, but the problem is that 100 hours is too little? – Niilo Keinänen Sep 09 '22 at 05:10
  • I suppose I am just looking for a clear description of the expected result. There's a lot of information in these questions which I appreciate but the expected visual result seems a bit ambiguous. – Niilo Keinänen Sep 09 '22 at 05:13
  • Thanks for your comments, sorry for the confusion! It is my goal to present these waveforms in absolute time, IE with time stamps from the DateTime. – Kenneth Miller Sep 09 '22 at 12:49
  • I think I was misunderstanding the limitation of the Time strategy. I am OK with the maximum level of zoom out being 100 hours. I was misinterpreting the 100 hours as “the total amount of data allowed on the chart” as long as I can pan left and right for weeks then I’m golden. What would keep me from using the Time versus DateTime strategy now would be figuring out how to introduce custom ticks to apply a time offset to the Time Strategy to display it as the year/month/day as opposed to hours from the origin. – Kenneth Miller Sep 09 '22 at 12:50
  • My two separate questions were trying to attack the problem from different angles. – Kenneth Miller Sep 09 '22 at 12:52
  • If I can fix the decimation that’s happening on the DateTime strategy I would strongly prefer to use the DateTime strategy. Otherwise I’ll figure out how to make the Time Strategy work for me. – Kenneth Miller Sep 09 '22 at 12:52
  • The main thing I want is to be able to display this 1000hz data, at the time that it occurred without it being decimated/corrupted. – Kenneth Miller Sep 09 '22 at 13:56
  • Check out setDateOrigin functionality. https://lightningchart.com/lightningchart-js-interactive-examples/examples/lcjs-example-0021-dateTimeAxisOrigin.html – Snekw Sep 09 '22 at 14:36
  • I am using this function. I have experimented with this functionality on and off. The further data gets away from the origin the more it is corrupted/decimated. – Kenneth Miller Sep 09 '22 at 15:39
  • One more observation, it seems to be decimating the data down to 0.250 (quarter second) intervals. Is there anything significant about this? Does it indicate that I may be setting something up incorrectly? – Kenneth Miller Sep 09 '22 at 17:50

2 Answers


Could it be 32-bit floating point precision loss? Two weeks in seconds amounts to 1,209,600, and log2(1,209,600) ≈ 20.2, so about 21 bits of storage space. The mantissa of 32-bit binary floats is just 23 bits, leaving you with 2 bits for the fractional part, which would explain the 0.25 increments.
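You can check this hypothesis outside of LightningChart by round-tripping X values through 32-bit floats with Math.fround. Two weeks (in seconds) away from the origin, values snap to a grid of eighths of a second; the exact granularity the chart exhibits may differ by a small factor depending on how the engine stores and scales X internally, but the order of magnitude matches:

```javascript
// Round-trip X values through binary32 to expose the precision loss.
const TWO_WEEKS_S = 14 * 24 * 60 * 60; // 1,209,600 s, as estimated above

for (const frac of [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]) {
  const exact = TWO_WEEKS_S + frac;
  const stored = Math.fround(exact); // what a 32-bit float actually holds
  console.log(exact, '->', stored);
}
```

Near the origin the same round-trip preserves sub-millisecond detail, which is consistent with the clean plots in the question.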

So if I'm correct, you would need the X position precision bumped to 64-bit floats. You're already using the "linear-highPrecision" axis mode, though, which would have been my first guess for a solution; so as far as I can tell it doesn't actually increase the data precision to 64 bits, only the axis. Unless there's another solution I've missed, you would probably need to split your data into separate series.
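One way to read "split your data into separate series" is to re-base each chunk of data onto its own local origin, so every stored X value stays small enough for float32. A minimal data-side sketch of that idea (the chunkData helper is hypothetical, not a LightningChart API):

```javascript
// Split a long {x, y} array into windows whose X values are re-based to a
// per-chunk origin, keeping every stored X small enough for float32.
function chunkData(points, windowMs) {
  const chunks = [];
  for (let i = 0; i < points.length; ) {
    const origin = points[i].x;
    const chunk = { origin, points: [] };
    while (i < points.length && points[i].x - origin < windowMs) {
      chunk.points.push({ x: points[i].x - origin, y: points[i].y });
      i++;
    }
    chunks.push(chunk);
  }
  return chunks;
}
```

Each chunk would then go to its own series with its origin as the start offset; whether that actually avoids the precision loss depends on how the engine transforms coordinates internally, so treat this as a sketch of the idea only.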

EDIT: I'm not actually sure that's a problem on LightningChart's end, now that I've looked it up. OpenGL ES 3.0 (upon which WebGL 2 relies, and in turn LightningChart) requires that "high-precision" float parameters be stored in binary32 IEEE 754 standard floats.

Official spec, p. 53:

highp floating point values are stored in IEEE 754 single precision floating point format. Mediump and lowp floating point values have minimum range and precision requirements as detailed below and have maximum range and precision as defined by IEEE 754.

So given that information, it seems to be less a bug in LCjs than a WebGL technical limitation.

VLRoyrenn
  • FYI SciChart is an alternative library which uses 64-bit precision for data & calculations prior to rendering. Final coordinates (vertices) must be 32-bit as per WebGL but all conversion to coordinate and calcs before that are done in Float64. As a disclosure (required by StackOverflow rules) I am the founder of this library. – Dr. Andrew Burnett-Thompson Jun 21 '23 at 11:27
  • @Dr.AndrewBurnett-Thompson: Do you happen to have a demo handling this kind of precision over long time series? I don't have a use case for it, but I'd be curious to see how much better SciChart handles this with the differences you're bringing forward. – VLRoyrenn Jun 21 '23 at 14:58
  • The closest demo is this one. Not a time series, but candlestick, loading 1-year of 1-minute BTC/USD data. https://github.com/ABTSoftware/SciChart.JS.Examples/tree/master/Sandbox/WebsiteDemos/BigDataCandlestickCharts Can you send us the dataset? Use the contact form on the scichart website & do mention this post as someone else may pick it up! – Dr. Andrew Burnett-Thompson Jun 21 '23 at 18:03
  • @Dr.AndrewBurnett-Thompson: Oh, I'm not working on any dataset that spans such long periods of time, I just saw the question and ended up looking into it out of curiosity. You would have to ask Kenneth Miller as a comment to his initial question if you want to run tests on data from the OP. – VLRoyrenn Jun 22 '23 at 17:26

I tried to produce a reference application for a similar situation: using DateTime ticks with high-resolution time data (1000 Hz). The snippet below should run right here in Stack Overflow, generating a fairly large set of test data (it might take a while) and using a date origin to show it with a high zoom range.

Does this work for you? If yes, then maybe there is something off in your application which you could spot with the help of this reference application.

const { lightningChart, AxisTickStrategies } = lcjs
const { createProgressiveTraceGenerator } = xydata

createProgressiveTraceGenerator()
    .setNumberOfPoints(1000 * 60 * 60 * 4)
    .generate()
    .toPromise()
    .then((data) => {
        // Just for generating data set
        const dataStart = Date.now()
        return data.map((p) => ({ x: dataStart - 1000 * 60 * 60 * 4 + p.x, y: p.y }))
    })
    .then((data) => {
        // 1. Offset data X by date origin
        const dateOrigin = new Date()
        const tDateOrigin = dateOrigin.getTime()
        data = data.map((p) => ({ x: p.x - tDateOrigin, y: p.y }))

        const chart = lightningChart().ChartXY({ disableAnimations: true })
        const axisX = chart.getDefaultAxisX().setTickStrategy(AxisTickStrategies.DateTime, (ticks) =>
            ticks
                // 2. Inform Tick Strategy of date origin
                .setDateOrigin(dateOrigin),
        )

        const series = chart.addLineSeries({
            dataPattern: {
                pattern: 'ProgressiveX',
                regularProgressiveStep: true,
            },
        })

        series.add(data)
        chart.addLegendBox().add(chart)
    })
<script src="https://unpkg.com/@arction/lcjs@3.4.0/dist/lcjs.iife.js"></script>
<script src="https://unpkg.com/@arction/xydata@1.2.1/dist/xydata.iife.js"></script>

Looking back at your question, perhaps the original issue was in the tick formatting? When you zoom in on this example, it ultimately shows just seconds on each tick rather than a full timestamp. Was this the problem? If so, the issue is about formatting axis ticks, and I'll have to see what can be done about that.

Niilo Keinänen
  • I might have found the cause of this issue, see my Answer to this Question. Is this precision-loss problem on long time series something the LC team is currently aware of? – VLRoyrenn Feb 03 '23 at 19:42
  • Yes, this is a known issue. The current suggested workaround on it is the so called "date origin" approach (shifting data to better utilize 32 float precision). Most likely sometime this year 2023 there will be a built-in solution introduced to hide this complexity from users. – Niilo Keinänen Feb 06 '23 at 07:03
  • I don't think dateOrigin would solve OP's issues simply because his time series stretches on for too long ("I am trying to display 1-2 weeks of data at a time") for that to work by just anchoring the date origin to the start of the series. Fixing this without increasing float precision on a series spanning such a long span of time at this precision would require moving the series' origin around as the graph gets panned and zoomed to keep the visible datapoints in a zone of high enough precision around zero, and the answer above doesn't do that. – VLRoyrenn Feb 23 '23 at 21:07
  • To my knowledge 1-2 weeks of data when using date origin shouldn't be a problem. Getting zoom range up to seconds should be feasible. – Niilo Keinänen Feb 24 '23 at 06:48
  • Seconds, yes, but OP is asking for millisecond resolution, while the limited float32 precision causes his datapoints to snap to quarter-second intervals after 2 weeks, half-second after 24 days, and second-only precision after 49 days. Granted, datasets that both cover such a long span of time while requiring such a fine precision are few and far between, but it's not something that setting dateOrigin at the start of the series can fix alone. – VLRoyrenn Feb 24 '23 at 19:16
  • We'll consider this use case (both long time span and fine precision) when investigating a built-in solution regarding using time series data easier. – Niilo Keinänen Feb 27 '23 at 07:25