
Both are technologies expressed through languages full of macros, but in more technical terms, what kind of grammar do they have, and how can their properties be described?

I'm not interested in a graphical representation; by properties I mean a descriptive phrase about the subject, so please don't just give a BNF/EBNF-oriented response full of arcs and graphs.

I assume that both are context-free grammars, but that is a big family of grammars; is there a way to describe these two more precisely?

Thanks.

user2485710
  • This question appears to be off-topic because it is about theories of natural language processing and might be better asked on [Linguistics](http://linguistics.stackexchange.com/). – Thomas Mar 03 '14 at 20:09
  • @Thomas It is more like a question about programming/technical languages; I don't know how many programmers would visit that branch of SO, and I also don't think an experienced linguist could help nearly as much as an experienced programmer. – user2485710 Mar 03 '14 at 20:32

2 Answers


TeX can change the meaning (the category code) of characters at run time, so it's not context-free.
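TeX does this through \catcode assignments, which a macro can execute in the middle of a document. Below is only a toy Python analogue (the `catcode` table and `scan` function are illustrative, not TeX's actual machinery) of why a scanner whose character categories can be reassigned by the input itself escapes any single fixed grammar over the character stream:

```python
# Toy analogue of TeX's \catcode (illustrative only, not real TeX):
# the category table is mutable, and the "document" being processed
# can change it, so later lines are scanned under different rules.

catcode = {"%": "comment-start"}          # mutable category table

def scan(line):
    out = []
    for ch in line:
        if catcode.get(ch, "ordinary") == "comment-start":
            break                          # rest of the line is ignored
        out.append(ch)
    return "".join(out)

print(scan("hello % a comment"))   # -> 'hello '
catcode["%"] = "ordinary"          # the input itself reassigned %'s category
print(scan("hello % a comment"))   # -> 'hello % a comment'
```

Because what a line means depends on assignments executed earlier in the same input, no single static context-free grammar describes all valid documents.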

Ivan Andrus
  • Wait, can't the meaning of "macro-based language" be translated into "you can simply perform a text substitution" to get the new meaning of the keyword? – user2485710 Mar 04 '14 at 23:35
  • TeX definitely makes extensive use of macros, but unfortunately it's more complicated than that. In other words, it's not entirely macro based according to your definition. – Ivan Andrus Mar 05 '14 at 20:24

Is my language Context-Free?

I believe that every useful language ends up being Turing-complete, reflexive, etc.

Fortunately that is not the end of the story.

Most parser-generation tools (yacc, ANTLR, etc.) handle at most context-free grammars (CFGs).

So we divide the language-processing problem into 3 steps:

  1. Build an over-generating CFG; this is the "syntactical" part, a solid base to which we add the other components,
  2. Add "semantic" constraints (extra syntactic and semantic predicates that prune the over-generation),
  3. Add the main semantics (static semantics, pragmatics, attribute semantics, etc.).

Writing a context-free grammar is a very standard way of speaking about languages! It is a clear and didactic notation (even if it sometimes does not tell the whole truth).

When we say "it is not context-free, it is Turing-complete, ...", you can translate that as "you can still count, it just takes a lot of extra semantic work" :)
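As a minimal sketch of that translation (not code from the answer): the language a^n b^n c^n is the textbook non-context-free language, yet it is easily handled as an over-generating CFG, here equivalent to a*b*c*, plus one counting predicate applied afterwards:

```python
import re

# Step 1: an over-generating CFG.  S -> A B C ; A -> 'a' A | ε ; ...
# This particular CFG is equivalent to the regular expression a*b*c*.
def cfg_accepts(s):
    return re.fullmatch(r"a*b*c*", s) is not None

# Step 2: a "semantic" constraint the CFG alone cannot express:
# the three runs must have equal length (a^n b^n c^n is not context-free).
def constraint(s):
    return s.count("a") == s.count("b") == s.count("c")

def accepts(s):
    return cfg_accepts(s) and constraint(s)

print(accepts("aabbcc"))    # True
print(accepts("aabbbcc"))   # False: syntactically fine, pruned by the constraint
```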

How can I speak about it?

Many choices are available. I like to do a subset of the following (a minimal sketch follows the list):

  1. Write a clear, semantics-oriented CFG,
  2. for each symbol (terminal or nonterminal), define a set of semantic attributes,
  3. for each production rule, add syntactic/semantic constraint predicates,
  4. for each production rule, add a set of equations defining the values of the attributes,
  5. for each production rule, add an English explanation, examples, etc.
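As a minimal sketch of items 1-4 (the grammar, attribute names, and helper function below are mine, chosen for illustration), here is Knuth's classic attribute grammar for binary numerals: every symbol carries a synthesized `val` attribute, each production has an equation defining it, and predicates reject ill-formed input:

```python
# CFG:         N -> B N | B        B -> '0' | '1'
# Attributes:  every N and B carries a synthesized attribute 'val'
# Equations:   B.val = int(bit);   N.val = B.val                    (N -> B)
#              N.val = B.val * 2**len(rest) + N2.val                (N -> B N2)
# Predicates:  the input must be non-empty and contain only bits

def parse_N(bits):
    if not bits:
        raise ValueError("predicate failed: a numeral needs at least one bit")
    b, rest = bits[0], bits[1:]
    if b not in "01":
        raise ValueError(f"predicate failed: {b!r} is not a bit")
    b_val = int(b)                                   # equation for B.val
    if not rest:
        return b_val                                 # N -> B
    return b_val * 2 ** len(rest) + parse_N(rest)    # N -> B N2

print(parse_N("1011"))   # 11
```

Item 5 would be the accompanying English explanation and examples written next to each production.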
JJoao