
I have an annotation processor that looks for files that contain a certain annotation. The output of the processor is a single file that references each such annotated file.

For example, if classes X, Y, and Z contain the annotation @Foo, then the @Foo processor will generate a file like:

class FooFiles {

  Class[] getFooClasses() {
    return new Class[]{X.class, Y.class, Z.class};
  
  }

}

This works fine if I do a mvn clean compile, since all classes are compiled and passed to the annotation processor.

But if the classes are up to date and I modify just one (say the X class), then mvn compile performs an incremental build. Since only the X class is compiled, only the X class gets passed to the annotation processor, and the generated file is:

class FooFiles {

  Class[] getFooClasses() {
    return new Class[]{X.class};
  
  }

}

This is bad.

I'm using Maven with maven-compiler-plugin version 2.5.1, which seems to do incremental compilation.

If I update the maven-compiler-plugin to version 3.1, then any change to one file results in compilation of all files, and this problem does not occur. Recompiling all files when only one file has changed may not be the right solution here, though: other developers will complain when a module with 10K+ files needs to be recompiled from scratch because of one file change. I did try setting the useIncrementalCompilation option to true in the plugin's configuration, but it seems to recompile all files regardless.

I modified my annotation processor so that it does not overwrite any existing generated file. This means that after a clean, the correct provider file is generated, containing references to X, Y, and Z. But if just X changes, for example, then a new provider file is not generated. This allows incremental compilation, but at the expense of remembering to run a clean where necessary.
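That "generate only once" guard can be sketched as a simple existence check before writing. This is a minimal illustration using plain file I/O (a real annotation processor would normally go through the Filer API rather than raw paths; the class and method names here are hypothetical):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: write the generated provider file only if it doesn't exist yet,
// so an incremental round can't clobber the complete file from a clean build.
public class GenerateOnce {

    // Returns true if the file was written, false if it already existed.
    public static boolean writeIfAbsent(Path target, String content)
            throws IOException {
        if (Files.exists(target)) {
            return false; // keep the provider file from the full build
        }
        Files.createDirectories(target.getParent());
        Files.write(target, content.getBytes(StandardCharsets.UTF_8));
        return true;
    }
}
```

The downside is exactly the one described above: nothing ever updates the file once it exists, so a stale provider survives until the next clean.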

I'm not sure if there's a general solution here, but I'm asking anyway. I guess what I really want is for the annotation processor to run over the target/classes directory after the compile phase. I might need to write a Maven plugin for that.

John Q Citizen

1 Answer


There is a bug in maven-compiler-plugin version 3.1 that causes incremental builds to fail. Currently the latest version is 3.2 anyway.

This is bad.

Only because it breaks your current processor. As you noted yourself, rebuilding everything all the time isn't the best solution. A better approach would be to support incremental builds. This will make builds faster and your processor compatible with more compilers, IDEs, and build tools. You will probably need a new way to handle your annotated classes, though.

Here is a thought on how you could go about supporting incremental builds.

Instead of collecting all classes in a single place like FooFiles, you generate a resource file that lists all your classes, and then add each annotated class you encounter during an incremental build. Whenever FooFiles needs to be used, you can read the classes from that resource file. You also need to remove classes from the list that have been deleted or are no longer annotated.
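The merge step could look something like this: on every (possibly incremental) processor round, union the previously recorded class names with the classes seen in this round and write the list back out. This is a sketch only; the file name, the one-fully-qualified-name-per-line format, and the class name are my assumptions, and deletion handling is left out:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collection;
import java.util.SortedSet;
import java.util.TreeSet;

// Sketch: merge the classes seen in this (possibly partial) compilation
// round into the list accumulated by earlier rounds.
public class FooClassListMerger {

    public static SortedSet<String> merge(Path listFile,
                                          Collection<String> seenThisRound)
            throws IOException {
        SortedSet<String> names = new TreeSet<>();
        if (Files.exists(listFile)) {
            names.addAll(Files.readAllLines(listFile)); // previous rounds
        }
        names.addAll(seenThisRound); // classes compiled in this round
        Files.write(listFile, names); // one fully qualified name per line
        return names;
    }
}
```

With this, a clean build that sees X, Y, and Z and an incremental build that sees only X both leave the same three names in the file.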

It won't be that easy if your processor is more complex, but the general approach should still work. You could also generate a class for each annotated class that dynamically registers itself somewhere, if properties files aren't enough.
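The runtime side of the resource-file approach could be sketched like this: FooFiles loads the class names from the generated resource instead of hard-coding the class literals. Taking a Reader keeps the lookup of the resource itself (e.g. via Class.getResourceAsStream) out of the sketch; the class and method names are hypothetical:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;

// Sketch: resolve the listed class names to Class objects at runtime,
// replacing the hard-coded Class[] in the generated FooFiles.
public class FooClassLoaderSketch {

    public static List<Class<?>> loadClasses(Reader source)
            throws IOException, ClassNotFoundException {
        List<Class<?>> classes = new ArrayList<>();
        BufferedReader reader = new BufferedReader(source);
        String line;
        while ((line = reader.readLine()) != null) {
            if (!line.trim().isEmpty()) {
                classes.add(Class.forName(line.trim())); // fails fast on stale entries
            }
        }
        return classes;
    }
}
```

Note that Class.forName throws ClassNotFoundException for a stale entry, which is one way to detect classes that were deleted without the list being updated.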

kapex
  • I like the idea of generating a resource file with a list of class names in it. When a class with the annotation is compiled, it makes sure its name is in that file. The provider class then loads the resource file and creates instances of the classes listed there. The only problem is that it won't handle removing class names from the resource file. If the annotation is removed from a file, it won't be seen by the annotation processor, and it won't remove itself. Unless there's a way to know which classes are compiled but not handled by the annotation processor ... – John Q Citizen Nov 27 '14 at 05:15
  • 1
    I think it can work if the processor supports all annotations, so that it will always be used even when annotations get removed. At each execution of the processor you should be able to use the reflection/mirror api and to see if the class in the list and their annotations still exist. tbh I never tried this though. Also I'm not sure if compilation/processing will even be invoked by all IDEs when you just delete a class. If not, I guess you would need to check at runtime if the listed classes actually exist. – kapex Nov 28 '14 at 19:28
  • 1
    That will probably work, so I will mark it as an answer. However I think that bypassing annotation processing may be the better solution. I wrote a quick-n-dirty maven plugin that processes the class files in the targets/classes directory, looking for classes with the annotation, and generates a java file into targets/generated-test-sources. This plugin runs during the generate-test-sources phase. Seems to work efficiently and correctly, but I had to change the retention policy of the annotation from COMPILE to RUNTIME. – John Q Citizen Nov 30 '14 at 23:28
  • 1
    I modified my maven class processing plugin to look for class files that have changed since the last run, instead of loading all class files in a directory and processing each class file in turn. So now there is a resource file storing the class file names and their modification time, and this is compared against the files in the directory to generate a change set. The change set (with ADD,DELETE,MODIFY info) is then used to update another resource file storing the list of classes with the @Foo annotation. This is then used to generate the required code. Runs much faster now. – John Q Citizen Dec 10 '14 at 01:22