
Do such parsing tools exist for R? In my case, it does not have to be a lex/yacc-compatible tool.

(I'm an R newbie)

EDIT: I'm interested in implementing another language using R.

adamo

4 Answers


I developed a clone of Python's PLY named rly. You can find it on CRAN:

install.packages("rly")

An example of usage, a small interactive calculator, is below:

library(rly)
library(R6)  # the lexer and parser below are defined as R6 classes

TOKENS = c('NAME', 'NUMBER')
LITERALS = c('=','+','-','*','/', '(',')')

Lexer <- R6Class("Lexer",
  public = list(
    tokens = TOKENS,
    literals = LITERALS,
    t_NAME = '[a-zA-Z_][a-zA-Z0-9_]*',
    t_NUMBER = function(re='\\d+', t) {
      t$value <- strtoi(t$value)
      return(t)
    },
    t_ignore = " \t",
    t_newline = function(re='\\n+', t) {
      t$lexer$lineno <- t$lexer$lineno + nchar(t$value)
      return(NULL)
    },
    t_error = function(t) {
      # report the offending character, then skip it and continue lexing
      cat(sprintf("Illegal character '%s'\n", substr(t$value, 1, 1)))
      t$lexer$skip(1)
      return(t)
    }
  )
)

Parser <- R6Class("Parser",
  public = list(
    tokens = TOKENS,
    literals = LITERALS,
    # Parsing rules
    precedence = list(c('left','+','-'),
                      c('left','*','/'),
                      c('right','UMINUS')),
    # dictionary of names
    names = new.env(hash=TRUE),
    p_statement_assign = function(doc='statement : NAME "=" expression', p) {
      self$names[[as.character(p$get(2))]] <- p$get(4)
    },
    p_statement_expr = function(doc='statement : expression', p) {
      cat(p$get(2))
      cat('\n')
    },
    p_expression_binop = function(doc="expression : expression '+' expression
                                                  | expression '-' expression
                                                  | expression '*' expression
                                                  | expression '/' expression", p) {
           if(p$get(3) == '+') p$set(1, p$get(2) + p$get(4))
      else if(p$get(3) == '-') p$set(1, p$get(2) - p$get(4))
      else if(p$get(3) == '*') p$set(1, p$get(2) * p$get(4))
      else if(p$get(3) == '/') p$set(1, p$get(2) / p$get(4))
    },
    p_expression_uminus = function(doc="expression : '-' expression %prec UMINUS", p) {
      p$set(1, -p$get(3))
    },
    p_expression_group = function(doc="expression : '(' expression ')'", p) {
      p$set(1, p$get(3))
    },
    p_expression_number = function(doc='expression : NUMBER', p) {
      p$set(1, p$get(2))
    },
    p_expression_name = function(doc='expression : NAME', p) {
      p$set(1, self$names[[as.character(p$get(2))]])
    },
    p_error = function(p) {
      if(is.null(p)) cat("Syntax error at EOF")
      else           cat(sprintf("Syntax error at '%s'", p$value))
    }
  )
)

lexer  <- rly::lex(Lexer)
parser <- rly::yacc(Parser)

con <- file("stdin")
open(con)
while(TRUE) {
  cat('calc > ')
  s <- readLines(con, n=1)
  if(length(s) == 0 || s == 'exit') break  # stop on end-of-input or 'exit'
  parser$parse(s, lexer)
}
close(con)
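
Running the script starts a small calculator loop. With the grammar above, entering 2 + 3 * 4 should print 14, an assignment such as x = 5 is stored in the names environment for use in later expressions, and exit (or end of input) leaves the loop.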
Marek Jagielski

AFAIK, there is no parser generator for R.

However, user-created packages in R (a.k.a. "extensions") can be written in Java, C or Fortran (as well as in R, of course). So you could use Lex/Yacc or Flex/Bison (for C), or JavaCC or ANTLR (for Java), to generate a lexer and parser for your language and call them from your R code.
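
For example, assuming the generated C parser is wrapped in a .Call-compatible entry point and compiled into a shared library (e.g. with R CMD SHLIB), the R side could look like the sketch below; myparser.so and parse_expr are hypothetical names, not an existing package:

# Hypothetical sketch: calling a Flex/Bison-generated parser from R.
# "myparser.so" and "parse_expr" stand for a C wrapper that takes a
# character string and returns the parse result as an R object.
dyn.load("myparser.so")                     # load the compiled parser
result <- .Call("parse_expr", "1 + 2 * 3")  # hand the input string to C
print(result)
dyn.unload("myparser.so")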

Bart Kiers

See the qmrparser package on CRAN, which provides parser combinators for R.
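
For a flavour of the combinator style, here is a minimal sketch that parses a natural number from a string stream, assuming the numberNatural and streamParserFromString functions described in the package's reference manual:

library(qmrparser)

# Sketch only: numberNatural() returns a parser function, which is applied
# to a stream built from a string; str() shows the returned result.
res <- numberNatural()(streamParserFromString("123"))
str(res)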

G. Grothendieck

AFAIK, there ARE YACC grammar files in the R source code. Check out these files:

R-2.15.2\src\main\gram.y
R-2.15.2\src\main\gramLatex.y
R-2.15.2\src\main\gramRd.y

But I'm not sure whether these files are used when building the official releases.