
I am attempting to build a JSON file from multiple requests, but it already seems to go wrong at the second one. I am using the Meetup API, and since the total count of results is 600+, I need more calls than just the one that returns 200 results.

I think the results do get stored in the file, but creating a data frame from them gives an error:

Error in feed_push_parser(readBin(con, raw(), n), reset = TRUE) : 
  parse error: trailing garbage
          ,"lat":52.36}}  {"results":[{"utc_offset":36000
                     (right here) ------^
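
My guess is that each write() appends another complete JSON document, so the file ends up containing several top-level JSON values back to back, which the parser does not accept as a single document. A minimal sketch that seems to reproduce the same parse error (the field names are just placeholders taken from the error message above):

library(jsonlite)

# Two complete JSON documents written one after the other, like the appended file:
tmp <- tempfile(fileext = ".json")
write('{"results":[], "meta":{"lat":52.36}}', tmp, append = TRUE)
write('{"results":[{"utc_offset":36000}]}', tmp, append = TRUE)

fromJSON(tmp)  # parse error: trailing garbage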

The function that puts the data into a data frame:

json_to_df <- function(){
  # read the whole file as one JSON document and flatten nested fields
  df <- jsonlite::fromJSON('Meetup/meetupdata.json', flatten=TRUE)
  df
}
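
Since write() appends each response as its own line, the file looks like newline-delimited JSON (one complete object per line) rather than a single document. Would reading it that way be the better approach? A minimal sketch, assuming every response really is written as a single line (json_lines_to_df is just a name for illustration):

json_lines_to_df <- function(){
  # stream_in() reads newline-delimited JSON: one complete object per line
  con <- file('Meetup/meetupdata.json')
  pages <- jsonlite::stream_in(con, verbose = FALSE)
  # each line is one API response, so bind the per-page "results" data frames together
  jsonlite::rbind_pages(pages$results)
}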

How I collect the data:

store_open_events <- function(){
  events_url = sprintf('%s/2/open_events?&key=%s&sign=true&photo-host=public&lat=x.xx&country=x&city=Amsterdam&lon=x.xx&time=-24m,&status=past&page=200', API_BASE, API_KEY)
  k = GET(events_url)
  events = content(k, "text")
  file_location = 'Meetup/meetupdata.json'
  write(events, file_location, append=TRUE)
  for(i in 0:2){
    # parse the previous response to get the URL of the next page
    parsed = jsonlite::fromJSON(content(k, "text"), flatten=TRUE)
    k = GET(parsed$meta$`next`)
    events = content(k, "text")
    # append the next page's raw JSON to the same file
    write(events, file_location, append=TRUE)
  }
}
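
Or should I skip the intermediate file entirely? A minimal sketch of the same pagination loop that keeps the parsed pages in a list and binds them at the end (it reuses API_BASE, API_KEY and the query string from above, placeholders included; collect_open_events and the limit of four requests are just for illustration):

collect_open_events <- function(){
  events_url = sprintf('%s/2/open_events?&key=%s&sign=true&photo-host=public&lat=x.xx&country=x&city=Amsterdam&lon=x.xx&time=-24m,&status=past&page=200', API_BASE, API_KEY)
  pages = list()
  for(i in 1:4){                      # first request plus up to three follow-ups
    k = GET(events_url)
    parsed = jsonlite::fromJSON(content(k, "text"), flatten=TRUE)
    pages[[i]] = parsed$results       # keep only this page's results
    events_url = parsed$meta$`next`   # URL of the next page, if any
    if(is.null(events_url) || events_url == "") break
  }
  jsonlite::rbind_pages(pages)        # one data frame with all events
}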

I am wondering if appending to the JSON file like this is the way to go, and why the error occurs when putting this data into a data frame. Any help is appreciated!

  • Why would you not just build a data frame and save it out as an R data file, CSV or feather file? Keeping the original JSON seems like a waste. – hrbrmstr Dec 01 '16 at 14:03

0 Answers