The 'compile-all' target depends on 'compile-ooxml-lite'. This target does not appear to do any compilation; instead it runs lots of tests. This seems wrong; a compile target should only compile sources. The dependency also affects other targets, such as 'jar' and 'jar-src'.
The way we work out what counts as "lite" for ooxml-lite is to run the unit test suite and record all the OOXML schema classes used in the process. We then build the smaller poi-ooxml-schemas jar from just the schema classes actually used in a "typical" deployment, as automatically identified by the unit test suite. This step is certainly needed for the jar target, as we can't build the poi-ooxml-schemas jar without it.
The problem with this approach is the overhead when changing a single source file. AFAICT the test phase generates the same list of classes each time, changing only rarely when the tests themselves are updated. There has to be a better way. For example: create a fixed list of classes from a recent run and use that to build the jar; then, as part of the unit tests, regenerate the class list and check that it matches. Alternatively, run the unit tests against the lite jar itself. There seems to be no point in redoing the same work on every build.
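The proposed check is a simple set comparison between the static list and a freshly generated one. A minimal sketch of such a check, assuming both lists are plain text files with one class name per line (the class name, file layout, and method names here are my own illustration, not POI's build):

```java
// Hypothetical sketch: compare a freshly generated list of used OOXML schema
// classes against a static list kept in source control, and report both
// newly required classes and classes that are no longer used.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Set;
import java.util.TreeSet;

public class LiteClassListCheck {

    // Reads a class-list file into a sorted set, ignoring blank lines.
    static Set<String> readClassList(Path file) throws IOException {
        Set<String> classes = new TreeSet<>();
        for (String line : Files.readAllLines(file)) {
            line = line.trim();
            if (!line.isEmpty()) {
                classes.add(line);
            }
        }
        return classes;
    }

    public static void main(String[] args) throws IOException {
        Set<String> expected = readClassList(Paths.get(args[0])); // static list in SCM
        Set<String> actual   = readClassList(Paths.get(args[1])); // list from this test run

        Set<String> missing = new TreeSet<>(actual);
        missing.removeAll(expected);   // now needed, but absent from the static list
        Set<String> stale = new TreeSet<>(expected);
        stale.removeAll(actual);       // in the static list, but no longer used

        if (!missing.isEmpty() || !stale.isEmpty()) {
            System.err.println("Static lite class list is out of date.");
            System.err.println("Missing from static list: " + missing);
            System.err.println("No longer used: " + stale);
            System.exit(1);
        }
        System.out.println("Static lite class list is up to date.");
    }
}
```

Run as the last step of the test suite, a non-zero exit (or a failing assertion wrapping the same comparison) would flag that the committed list needs regenerating, without rerunning the whole recording pass on every build.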
What's the concrete problem here - the time taken to run compile-ooxml-lite, or something else? We could consider having a static list in source control and a separate Jenkins job to warn us when that list goes out of date, but obviously it's more overhead than the current system.
(In reply to David North from comment #3)
> What's the concrete problem here - the time taken to run compile-ooxml-lite,
> or something else?

The time taken to run the target, and the fact that it is rerun even if there has been no change to the sources.

> We could consider having a static list in source control and a separate
> Jenkins job to warn us when that list goes out of date, but obviously it's
> more overhead than the current system.

AFAICT there's no need for a separate Jenkins job to check whether the list is out of date; just ensure that the check is done as part of the normal unit tests. If the unit tests are run against the lite jar rather than the merged input, any problems should show up; this would check that the jar really does contain everything required. Alternatively, run the unit tests as normal, but add a new test at the end that compares the generated list with the static list. Or combine the two: run against the statically built lite jar, but also record the classes used, and at the end check whether any classes are no longer needed.
(In reply to Sebb from comment #4)
> If the unit tests are run against the lite jar rather than the merged input,
> this should show up any problems. This would check that the jar really did
> contain everything required.

Nope ... neither the current workflow nor your proposed one guarantees that all xmlbeans fragments are included - e.g. we had problems with empty XSLFTables for quite a while, and I actually don't know which (test) change solved it; back then the tests were successful and the tables still empty ...

> (In reply to David North from comment #3)
> > What's the concrete problem here - the time taken to run compile-ooxml-lite,
> > or something else?
>
> The time taken to run the target, and the fact that it is rerun even if
> there has been no change to the sources.

When developing unit tests, I don't have a problem with that - I just run the unit test in question. Before check-in, I wait the 10 minutes to run the tests; in relation to the time taken to develop and check a feature, this is minimal in my view.

Can you be a bit more precise about why you want to speed things up vs. making the build more complicated?
My problem has been that at times a failed unit test causes compile-all to fail. This, rather than speed, would be my reason to decouple unit tests from any of the compile targets.
(In reply to Andreas Beeker from comment #5)
> (In reply to Sebb from comment #4)
> > If the unit tests are run against the lite jar rather than the merged input,
> > this should show up any problems. This would check that the jar really did
> > contain everything required.
>
> Nope ... neither the current nor your proposed workflow guarantees that all
> xmlbeans fragments are included - e.g. we had problems with empty XSLFTables
> for quite a while and I actually don't know what (test) solved it, but back
> then the test were successful and the tables empty ...

The point is that it would at least show that the unit tests could run against the lite jar.

> > (In reply to David North from comment #3)
> > > What's the concrete problem here - the time taken to run compile-ooxml-lite,
> > > or something else?
> >
> > The time taken to run the target, and the fact that it is rerun even if
> > there has been no change to the sources.
>
> When developing unit tests, I don't have a problem with that - I just run
> the unit test in question. Before check-in, I wait the 10 minutes to run the
> tests - in relation to the time taken to check/develop a feature this is
> minimal in my view.
>
> Can you be a bit more precise why you want to speed up vs. making the build
> more complicate?

When I was trying to work out what was happening with Bug 59786, the obvious way to test the problem was to build the jars and then run them with the appropriate command line. Waiting 2-3 mins instead of 20 secs or so is a pain when making incremental changes. It's a waste of resources (human and computer).

Besides, I don't believe the build needs to be more complicated; it just needs some minor re-arrangement.
(In reply to Sebb from comment #7)
> When I was trying to work out what was happening with Bug 59786 the obvious
> way to test the problem was to build the jars and then run them with the
> appropriate command-line. Waiting 2-3 mins instead of 20 secs or so is a
> pain when making incremental changes. It's a waste of resources (human and
> computer).

ant compile-scratchpad
java -classpath build/classes:build/scratchpad-classes org.apache.(class)

That should've let you test it in seconds!