The following code works:
library(rvest)
library(plyr)
alaska <- c(1:49)
for (i in alaska) {
  url <- "http://www.50states.com/facts/alaska.htm"
  nodespath <- paste('//*[@id="content"]/div[1]/div[4]/ol/li[', i, ']')
  alaskafacts <- data.frame(facts = url %>% html() %>%
    html_nodes(xpath = nodespath) %>% html_text())
  alaskafacts$nm <- i
  result <- rbind.fill(result, alaskafacts)
}
I'll get this as a result:
I know the loop is working because if I change the code to this:
alaska <- c(1:48)
I'll get this as a result:
The problem I'm running into is that the loop writes over itself. I'm expecting 49 rows of facts, but it looks like each iteration erases the previous fact and writes a new one, so the last fact is always the only one left in the data frame.
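To show what I mean by "accumulate instead of overwrite", here is a minimal sketch of the append pattern I'm expecting, using base R and no scraping (the `facts`/`nm` columns here are just placeholders mirroring my real data frame, and I'm assuming `result` starts out as an empty data frame before the loop):

```r
# Minimal sketch of the append pattern I expect (base R, no scraping).
# 'result' must exist before the loop so rbind() has something to grow.
result <- data.frame()  # start empty, outside the loop
for (i in 1:3) {
  row <- data.frame(facts = paste("fact", i), nm = i)
  result <- rbind(result, row)  # append this iteration's row
}
nrow(result)  # one row per iteration, so 3
```

With this pattern, `result` ends up with one row per iteration rather than only the last one.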
I found an example here: How can I use a loop to scrape website data for multiple webpages in R?, and the code posted above follows the code in that example. I then referenced a second example: here. I think the code above follows it as well.
The rbind.fill call at the bottom follows the two similar examples I found on SO, yet the results don't accumulate as expected.
Any suggestions?