
In my Java project I use an external library (the Semantic Measures Library), which relies on an external XML parser. Whenever I try to run large datasets, this library throws:

The parser has encountered more than "100,000" entity expansions in this document; this is the limit imposed by the application.

I have tried

-DentityExpansionLimit=100000000

and

-DentityExpansionLimit=0

(and the same for `-Djdk.xml.totalEntitySizeLimit` and `-Djdk.xml.entityExpansionLimit=0`). I have also tried changing the type of the input files, as was suggested to me.
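In case it matters, here is a minimal sketch of setting the same limits programmatically before the library initializes its parser, assuming the flags are simply not reaching the JVM that actually runs the code (for example when launching through an IDE or a wrapper script); whether the library's parser honors these standard JAXP properties is exactly what I am unsure about:

```java
// Minimal sketch: set the JAXP limits programmatically before the library
// creates its parser, in case the -D flags never reach the running JVM.
public class Main {
    public static void main(String[] args) {
        // "0" means no limit for these JAXP properties (JDK 7u45+ / JDK 8+)
        System.setProperty("jdk.xml.entityExpansionLimit", "0");
        System.setProperty("jdk.xml.totalEntitySizeLimit", "0");
        System.setProperty("entityExpansionLimit", "0"); // legacy property name

        // ... then call into the Semantic Measures Library as usual
    }
}
```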

The Hungry Dictator
  • [This answer](http://stackoverflow.com/a/20482332/6730571) says you need to pass `-Djdk.xml.entityExpansionLimit=0`, which I don't see among the attempts you mentioned. – Hugues M. May 15 '17 at 10:31
  • I am sorry Hugues, I have tried that too. I am fairly new to coding, but I believe the problem is that these flags are not being picked up by the library. My project works with large inputs (without the library) and works just fine with smaller inputs (using the library). – Isabela Mott May 15 '17 at 12:12
  • Possible duplicate of [What's causing these ParseError exceptions when reading off an AWS SQS queue in my Storm cluster](http://stackoverflow.com/questions/20482331/whats-causing-these-parseerror-exceptions-when-reading-off-an-aws-sqs-queue-in) – Arnav Borborah May 15 '17 at 12:32
  • Do you know which external parser that is, and which class name? That might suggest possibilities. – Ignazio May 16 '17 at 05:53

0 Answers