
I am trying to manually identify/correct trees using LiDAR data (a 1.7 GB object) and a tree-tops object via the locate_trees function. The problems are:

  1. rgl renders very slowly, even though the 4 GB NVIDIA 3050 should be able to handle it.
  2. The tree tops (red 3D dots) are not showing in the rgl window at all. When I close the rgl window, the tree tops start popping up in a new rgl window (red dots appear and disappear, leaving a blank white window). If I close that window, yet another tree-top window opens, so I stop the process to prevent this from happening.

Does rgl automatically use the GPU, or does it default to the integrated graphics on the motherboard? Is there a way to speed up the rendering?

My other system specs are a Core i9 (14 threads) and 64 GB RAM. I am using R 4.2.1.

Code:

library(lidR)

# Import LiDAR data
LiDAR_File = readLAS("path/file_name.las")

# Find tree tops
TTops = find_trees(LiDAR_File, lmf(ws = 15, hmin = 5))

# Manually correct tree identification.
# This is where rgl rendering becomes too slow when there are too many points involved.
TTops_Manual = locate_trees(LiDAR_File, manual(TTops))
Ed_Gravy
  • You should post reproducible code if you want useful help. – user2554330 Aug 03 '22 at 23:14
  • I am sorry but I can't share the LiDAR data here since I don't own it. Otherwise, the code is quite simple. – Ed_Gravy Aug 03 '22 at 23:33
  • Post fake data then. See https://stackoverflow.com/q/5963269/2554330 . – user2554330 Aug 03 '22 at 23:37
  • Maybe a feature like the draw distance used in games might help: when the user zooms in to a certain level, all the points in that window are rendered; otherwise not every point is rendered. This is just a rough idea. – Ed_Gravy Aug 03 '22 at 23:37
  • Alright, let me look into creating a fake LiDAR point cloud. However, I am not sure if it's possible. Otherwise, the only other solution is to provide you with a link to download publicly available LiDAR data. – Ed_Gravy Aug 03 '22 at 23:41
  • More generally, your question is not very clear right now. Do you want to know whether it uses the GPU? Or do you want to know how to speed it up? Or do you have a question about the issue in 2 (which may or may not be the same issue)? – socialscientist Aug 04 '22 at 09:06
  • The issue described in 2 is most likely due to inefficient rendering in the code, which we haven't seen. Use `save <- par3d(skipRedraw=TRUE)` before calling lots of rendering commands, and then `par3d(save)` afterwards for much better speed. – user2554330 Aug 04 '22 at 09:48
  • The problem is in the `lidR::manual` function. It overrides any setting you make and redraws the plot for every tree. – user2554330 Aug 04 '22 at 18:55
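
A minimal sketch of the `par3d(skipRedraw = TRUE)` pattern suggested in the comments above, using made-up coordinates rather than the actual LiDAR data:

library(rgl)

# Fake tree-top coordinates, just for illustration
set.seed(1)
tops <- data.frame(x = runif(500, 0, 100),
                   y = runif(500, 0, 100),
                   z = runif(500, 5, 30))

open3d()

# Suppress intermediate redraws while adding many objects one at a time
save <- par3d(skipRedraw = TRUE)
for (i in seq_len(nrow(tops))) {
  spheres3d(tops$x[i], tops$y[i], tops$z[i], radius = 0.5, color = "red")
}
par3d(save)  # restore the old setting; the scene is redrawn once, at the end

With the redraws skipped, the 500 spheres appear in a single final redraw instead of 500 separate ones.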

2 Answers

1

rgl has trouble displaying this many points. The plot function in lidR is convenient and lets you produce ready-to-publish illustrations, but it cannot replace a real point-cloud viewer for big point clouds. I don't have a GPU on my computer, and I don't know if or how rgl can take advantage of one.

In the documentation of the lidR function you are talking about, you can see:

This is only suitable for small-sized plots
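
In that spirit, here is a sketch of one way to keep the manual step small enough for rgl, assuming lidR's decimate_points() and clip_rectangle() fit your workflow (the density and tile coordinates below are placeholders):

library(lidR)

las <- readLAS("path/file_name.las")

# Thin the cloud to a fixed density (points per m^2) so rgl has fewer points to draw
las_thin <- decimate_points(las, random(density = 4))

# ...or work on one smaller tile at a time (coordinates are placeholders)
las_tile <- clip_rectangle(las, xleft = 0, ybottom = 0, xright = 250, ytop = 250)

TTops        <- find_trees(las_thin, lmf(ws = 15, hmin = 5))
TTops_Manual <- locate_trees(las_thin, manual(TTops))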

JRR
  • How many points are you talking about? I just plotted 10^7 points without any trouble, on a fairly low powered laptop. – user2554330 Aug 03 '22 at 23:17
  • 10⁷ is *only* 10 million points, which means about 240 MB. That is not a lot. On my computer, without a GPU, it starts to be slow: usable, but not smooth, and lagging. I think 10⁷ is the order of magnitude of the limit. 10⁸ is impossible (on my computer it is), and 10⁶ is very smooth. The OP was talking about 1.7 GB, so roughly 7×10⁷ points plus all the spheres for the tree tops, and that was probably the actual issue. – JRR Aug 04 '22 at 00:03
  • Most displays have fewer than 10^7 pixels, so it doesn't usually make sense to plot that many objects. Typically it's more efficient to use textures in that case, where you don't really care about seeing all the detail. But without seeing some code, I can't say if that's feasible here or not. – user2554330 Aug 04 '22 at 09:53
  • I do agree, but as far as I know it is not doable with `rgl`. `rgl` is not OpenGL; it is a high-level interface to OpenGL for R. It takes three vectors (x, y, z) as input and processes all the points whether they are visible or not. We can feed rgl a subset of the points, but then zooming in won't show the detail, since points are missing. – JRR Aug 04 '22 at 10:22
  • 2D textures are certainly doable with `rgl`. – user2554330 Aug 04 '22 at 11:06
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/247028/discussion-between-user2554330-and-jrr). – user2554330 Aug 04 '22 at 11:45
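
To illustrate the "feed rgl a subset of the points" idea from this discussion, a minimal sketch on random data (the sizes are arbitrary, not taken from the question):

library(rgl)

# Fake point cloud standing in for the LAS coordinates
n   <- 5e6
xyz <- cbind(runif(n), runif(n), runif(n))

# Plot only a random subset; as noted above, zoomed-in detail is lost this way
keep <- sample(n, 1e6)
open3d()
points3d(xyz[keep, 1], xyz[keep, 2], xyz[keep, 3], size = 1)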
1

There were two problems here. First, the lidR::manual() function, which is used to select trees, has a loop that draws one sphere for each tree. By default rgl redraws the whole scene after each change; this should be suppressed. The patch in https://github.com/r-lidar/lidR/pull/611 fixes this. You can install a version with this fix via

remotes::install_github("r-lidar/lidR")

Second, rgl was somewhat inefficient in drawing the initial point cloud, duplicating the data unnecessarily. When you have tens of millions of points, this can exhaust all of R's memory, and things slow to a crawl. The development version of rgl fixes this. It's available via

remotes::install_github("dmurdoch/rgl")

The LiDAR point clouds are very big, so you might find you still have problems even with these changes. Getting more regular RAM will help R: you may need this if the time to the first display is too long. After the first display, almost all the work is done in the graphics system; if things are still too slow, you may need a faster graphics card (or more memory for it).

user2554330