I was at a programming competition last weekend (a qualifier round for the ICPC), and I was trying to do a problem that involved (among other things) factoring integers in the range 0 <= i <= 1,000,000 into primes.
My first thought was "Oh no, factoring integers is in NP, we need to find a way to avoid spending time on it." We were working in Java. Since I'm primarily a C programmer, my first instinct was to precompute all of the primes up to 1,000,000 with a script, format them as a Java array, and insert it into my code. My reasoning was that this would save us the time of running something like a Sieve of Eratosthenes during the timed section, cut the time complexity down by something like a factor of n, and let us breeze through the problem.
I was then hit with the "code too large" error, and my few-hundred-thousand-element int array was rejected by the compiler. Because of the rules of the competition, it was not possible to read the table in from a file or store it anywhere except the .java file containing the class that had the main method.
I tried breaking it up into chunks of something like 100,000 ints each (roughly as sketched below), but that didn't work because it was still too large (we had nothing to consult except the Java 7 documentation, and I wasn't able to find anything in it about the "code too large" error message). We eventually gave up, tried another method, and got it working, but it ended up costing us something like a quarter of our total time in the competition and hurt our score significantly.
So, my question is: was there any sane way to solve this problem with a lookup table? Is it just impossible to get a huge pre-compiled lookup table into a Java program?
I'm also curious about the reasoning behind this restriction of the language... is it a security thing, somehow? Why would a compiler limit the bytecode size of methods?