
I know I can open a web page from R using `shell.exec`. But I want to write a loop over a collection of websites that opens each page only if it meets a condition.

It's a collection of websites that have "value=somenumber" in their address, and I want R to open only those where "somenumber" actually has some data. Numbers that don't have data don't return an error page, just a page with no data.

The thing is, the condition involves HTML elements of the page that I would need to test for...

Is such a thing possible?
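(To make the question concrete, here is a minimal sketch of what I have in mind. The URL pattern and the `"<table"` marker are hypothetical placeholders -- the real marker would be whatever HTML element distinguishes a page with data from an empty one.)

```r
# Hypothetical test: pages with data contain a marker string
# (here "<table") that empty pages lack.
has_data <- function(page_text) {
  any(grepl("<table", page_text, fixed = TRUE))
}

# Hypothetical URL pattern for illustration
urls <- paste0("http://example.com/page?value=", 1:10)

if (interactive()) {
  for (u in urls) {
    # Download the page source; skip the URL if the request fails
    page <- tryCatch(readLines(u, warn = FALSE),
                     error = function(e) character(0))
    if (has_data(page)) {
      browseURL(u)  # portable alternative to shell.exec
    }
  }
}
```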

David Mojo

2 Answers


I would use the stringr library's `str_detect`:

http://cran.r-project.org/web/packages/stringr/stringr.pdf#page.7

Assuming you have a list of the URLs, just pass that list to the function and, if it returns `TRUE`, do your thing.

str_detect(urlList, "value=[[:digit:]]")
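Put together, a sketch might look like this (the URLs here are hypothetical placeholders):

```r
library(stringr)

# Hypothetical URLs for illustration
urlList <- c("http://example.com/page?value=123",
             "http://example.com/about")

# TRUE for URLs whose address contains "value=" followed by a digit
keep <- str_detect(urlList, "value=[[:digit:]]")

if (interactive()) {
  sapply(urlList[keep], browseURL)  # open only the matching pages
}
```

Note that this tests the URL string itself, not the content of the page it points to.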
Justin
    you can do this in base with `grepl`: `grepl('value=[[:digit:]]+', urlList)`. (FWIW, `str_detect` is a thin wrapper over `grepl`) – Justin Jun 03 '14 at 21:02
  • Yeah, it is, but readability is always important, and when reading over the code `str_detect` makes a lot more sense than `grepl`. – Andrew Lichtenberg Jun 03 '14 at 21:07

Here's an approach that uses the more portable `browseURL` and `grep`:

x <- readLines(n=3)  # reads the next three lines typed or pasted at the console
http://stackoverflow.com/questions/23840523/check-if-os-is-solaris
http://stackoverflow.com/questions/23817341/faster-i-j-matrix-cell-fill
http://stackoverflow.com/questions/7863710/correlating-word-proximity

sapply(grep("/238", x, value=TRUE), browseURL)
Tyler Rinker