
I am working on a project that involves a large .las file (a 3D point cloud), which I process in a variety of ways using the lidR package in R. Because of the size of the file and my hardware limitations, I decided to give cloud computing a try, so this is my first foray into this world; apologies for my lack of experience. I have successfully set up an RStudio Server instance on GCP and added the file for processing to a Google Cloud Storage bucket.

With the code below I'm able to call the file I need and load it into memory, but I am stuck there because the file no longer has its original structure. As I understand it, Google stores the file in some sort of "blob" structure, and what comes back is a stream of bytes that needs to be parsed. The R environment lists it as "Large raw (52136558042 elements, 52.1 GB)".

I have been searching online for how to parse this but haven't yet found an answer. Any help that could point me in the right direction would be much appreciated.

Setup

library(googleCloudStorageR)

options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/devstorage.full_control")

Authenticate ClientID

googleAuthR::gar_set_client("auth/client_ID.json")

Authenticate service account

Sys.setenv("GCS_AUTH_FILE" = "auth/service_key.json")

googleCloudStorageR::gcs_auth()

Get bucket info

bucket <- gcs_list_buckets(projectId = "my_project")

bucket <- gcs_global_bucket(bucket$name)

objects <- gcs_list_objects("my_lidar_project123")

wholeasslas <- gcs_get_object(objects$name[[1]],
                              parseObject = TRUE)

httr::content(wholeasslas, type = "application/vnd.las")
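
My best guess so far is that the raw vector just needs to be written back out to disk byte-for-byte and then read with lidR. A minimal sketch of what I mean (untested, and pointcloud.las is just a placeholder filename):

library(lidR)

# dump the raw byte vector back out as a .las file, byte for byte
# (note: on R versions before 4.0.0, writeBin() may refuse vectors
# longer than 2^31 - 1 bytes, and this one is ~52 GB)
writeBin(wholeasslas, "pointcloud.las")

# read the file back in as a LAS object for normal lidR processing
las <- readLAS("pointcloud.las")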

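Alternatively, if I'm reading the googleCloudStorageR docs correctly, gcs_get_object() has a saveToDisk argument that streams the object straight to a local file, which would avoid holding the 52 GB raw vector in memory at all:

# download directly to disk instead of into memory
gcs_get_object(objects$name[[1]],
               saveToDisk = "pointcloud.las",
               overwrite = TRUE)

las <- lidR::readLAS("pointcloud.las")

Is either of these the right direction, or is there a proper way to parse the raw blob directly?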