This Bugzilla instance is a read-only archive of historic NetBeans bug reports. To report a bug in NetBeans please follow the project's instructions for reporting issues.
In some cases it is desirable for a module to bundle its own copies of some API packages which are also present in the JRE. For example:

1. javax.lang.** and similar are present in JDK 6 but not JDK 5, so we need to bundle them. But we might prefer to use the bundled versions even on JDK 6, simply because we then know exactly which version we are using and do not need to test that the JRE's version is OK. (This was a problem for Jackpot during the JDK 6 development cycle, though presumably not as big a deal after its release.)

2. For some API packages we may wish to bundle a newer version than any that is available in the JREs we expect people to run, due to the slow JRE lifecycle.

3. We might bundle a library which is not normally present in the JRE, but a user might have installed some copy of it into the "endorsed" or "extension" areas, despite the warnings in NB release documentation never to do this. I think Radim knows something about this (issue #95741) - please update the list of blocked issues if appropriate. I also recall a problem with an ancient version of JavaHelp bundled with Solaris.

Currently, if the JRE (boot classpath, extension classpath, or even application classpath) contains a package which is also in a module, the JRE's version overrides it, which can result in:

- use of an older implementation than is desired, perhaps with serious bugs
- linkage errors due to version skew in public API signatures
- IAE from ProxyClassLoader.definePackage in some cases, I think when part of a package has been loaded from the JRE while part is loaded from a module (this exception message should at least be improved, since it is cryptic)

The current workaround is either to patch NbInstaller.CLASSPATH_PACKAGES or (?) to define org.netbeans.core.startup.specialResource, but these are nonmodular workarounds. Various suggestions have been made recently.
Yarda suggested that a module could declare "Sealed: true", which would indicate that all its public packages should be loaded from it, not delegating to the parent. I think this may be too general (we just need to restrict loading from the non-module ancestor class loaders such as the JRE), and the name is also misleading (it is used in the JAR spec to prevent split packages). But something similar to this would probably work.

Note that users of such a facility must be careful. If you accidentally have the following situation:

  rt.jar: package p
  M1: package p (public, overrides the JRE's rt.jar)
  M2: package q, refers to package p
  M3 (dep on M1, M2): package r, refers to packages p and q

then you will get linkage errors or ClassCastExceptions in M3, as it would see an inconsistent view of package p. In practice this would not be a threat for use cases #1 and #3 above, since you would normally be testing without the JRE package present, so M2 would never work without a dep on M1. In use case #2 it would be possible to run into this problem by forgetting a dep M2 -> M1 (which might appear to work even though M2 is in fact using the wrong version of package p).
Something like Yarda's suggestion would work, and I agree that using a different name would reduce confusion. I am facing a similar situation where it would be handy to use Java 6's javax.script interfaces for plug-ins, but cannot because of the Java 5 IDE requirement. Being able to bundle and use this small package would mean the feature could be delivered before Java 6 becomes the IDE's minimum Java platform.

The linkage error problem you describe is a very real one, and not easy to debug at first glance, because determining the correct closure of a class can be difficult (one mistake and you are in linkage hell!). To make matters worse, many of the symptoms are not reported as linkage problems, but as mystifying ones such as instanceof failures and ClassCastExceptions which are "obviously" right, due to two instances having common classes with different class loaders:

  import org.netbeans.Foo;
  ...
  if (obj instanceof Foo) {
      Foo foo = (Foo) obj; // throws ClassCastException("org.netbeans.Foo")
  }

If we provide an API bundling capability, I think we also need a simple tool that displays the closure of a class, to aid debugging linkage problems when bundling. The classfile module has a simple test file which dumps a closure but does not show how a certain class was included in that list. If there is interest, I can create a simple module that creates a tree or graph showing the closure of a selected Java source file, which could be included with this change.
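To make the two-loader symptom above concrete, here is a small self-contained sketch (not NetBeans code; CastDemo, CopyingLoader, and Foo are invented names for illustration). A loader defines its own copy of a class from the parent's resource bytes, simulating a module that bundles a class also visible through an ancestor loader; the two copies then fail instanceof despite having the same name:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class CastDemo {

    /** Marker type; imagine an API class bundled both in a module and in the JRE. */
    public static class Foo {}

    /**
     * Defines the named class itself from the parent's resource bytes instead
     * of delegating, simulating a module bundling its own copy of a class.
     */
    public static class CopyingLoader extends ClassLoader {
        private final String target;

        public CopyingLoader(String target) {
            this.target = target;
        }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.equals(target)) {
                try (InputStream is = getResourceAsStream(name.replace('.', '/') + ".class")) {
                    ByteArrayOutputStream bos = new ByteArrayOutputStream();
                    byte[] buf = new byte[4096];
                    for (int n; (n = is.read(buf)) != -1; ) {
                        bos.write(buf, 0, n);
                    }
                    byte[] bytes = bos.toByteArray();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (IOException e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Object copy = new CopyingLoader("CastDemo$Foo")
                .loadClass("CastDemo$Foo").getDeclaredConstructor().newInstance();
        // Same class name, but a different defining loader: instanceof is false,
        // and an unchecked cast to Foo here would throw ClassCastException.
        System.out.println(copy instanceof Foo);        // prints: false
        System.out.println(copy.getClass().getName());  // prints: CastDemo$Foo
    }
}
```

Note that the exception message only names the class, not the two loaders involved, which is exactly why these failures are so mystifying in practice.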
Regarding javax.script: this is already bundled in NB 6.0, so you can use it with a simple dep on org.netbeans.libs.jsr223/1. Without this RFE, the JDK 6 version will take precedence when available, but in the case of JSR 223 I don't foresee a problem. If a revision of the JSR is published (which is not simply a set of added disjoint packages) and we want to use it, then we would need this RFE to prevent clashes with JDK 6.

Regarding a closure testing tool: probably we could implement something which runs either in a live NB VM (when given a special startup option) or "offline" as an Ant task, and detects all possible violations of class loader hygiene. I think it may be possible to find problems mechanically - no need to wait for an error to occur and have people debug it manually. Certainly linkage errors should be detectable mechanically; CCEs may be harder, since you cannot statically predict which objects might be handed to which code. Anyway, while I think this would be quite valuable, it is probably a separate RFE. (Ideally the VM, which throws both LinkageError subtypes and CCE from a cast operator, would include appropriate diagnostics to begin with. There is a JDK RFE filed for this already. I am not on SWAN right at the moment so I can't look it up, but I am the reporter or on the CC list if you care to find it.)
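As a trivial starting point for such a diagnostic (purely illustrative; LoaderDump is an invented name, not an existing NetBeans tool), one can at least print which loader defined a class and each of its supertypes, which often exposes a class being pulled from an unexpected loader:

```java
public class LoaderDump {

    /**
     * Print the defining loader of a class and of each of its supertypes.
     * A null loader means the JRE bootstrap loader.
     */
    public static void dump(Class<?> c) {
        for (Class<?> t = c; t != null; t = t.getSuperclass()) {
            System.out.println(t.getName() + " <- " + t.getClassLoader());
            for (Class<?> itf : t.getInterfaces()) {
                System.out.println("  implements " + itf.getName() + " <- " + itf.getClassLoader());
            }
        }
    }

    public static void main(String[] args) {
        dump(javax.swing.JButton.class);
    }
}
```

A real hygiene checker would of course have to walk the full reference closure from the bytecode, not just the supertype chain, but even this much is often enough to spot the "wrong" loader in a CCE report.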
As long as we accept patching of NbInstaller for 5.5.1, this is not a blocker at this moment. Of course I agree that a better solution is welcome. I'd prefer a different declaration than Sealed:.

Re Jesse's example of how deps can get broken: what about a check that modules transitively depending on a module that masks packages also specify those transitive deps (can we do it during the build, or at runtime)?

JSR 223 can have an update too (it is one of the libs listed in http://java.sun.com/javase/6/docs/technotes/guides/standards/index.html).

One question: do we plan to do a similar thing for Ant builds? Retouche does not need this, but newer versions of JAX-WS/JAXB are used in some Ant builds too, so we need to solve a similar problem as for module execution: either fork Ant or make sure that it sees the expected versions of these libraries.
Anything involving Ant builds would be a separate issue. The Ant module creates a class loader for loading Ant which already masks out platform/lib/*.jar and it could easily mask out other things. It probably should not be tied directly to module classes since the Ant build does not (for the most part) interact much with loaded modules anyway.
*** Issue 118947 has been marked as a duplicate of this issue. ***
Not sure issue 118947 is really a duplicate of this; in that case, we simply want NbInstaller not to block jaxb and a few other packages from being loaded from the boot classpath (since, in the platform app in question, these classes are needed). For that case it would be simple enough, and satisfactory, to allow a command line switch to *exclude* some items from the list of verboten packages. This issue seems to be solving a broader, though related, problem for modules that *do* want to bundle their own versions of things that are already on the boot classpath; in our case, we do not want to bundle these things, we just don't want access to them blocked by hardcoded package names in NbInstaller. I realize that solving this issue would probably mean those hardcoded package names would be deleted; however, this issue will probably take some time to solve, and it would be nice to have a solution to our simpler problem sooner.
I ran into this issue and then was directed to this IZ issue. I was trying to use the javax.xml.soap package and could not, even though it was already available in the JRE. It seems access was blocked, though I could add saaj.jar to a module, link up the dependencies, and get it working.

To me - and I have not studied all the eccentricities of the behavior - it seems at first glance that the best solution for the module system is for the module class loaders to reverse the order of loading, instead of having a manifest line or declaration indicate where to load from. That way, classes at the top of the stack would be loaded from a dependent module instead of from the bootstrap/JRE classes when they are made available there, and otherwise loading could fall through to the JRE.

If class loading and dependency issues arise, the developer would need to know to also override any packages which may end up loading classes from a different version of a package lower in the stack (including the JRE level). The only time issues would seem to arise is when a package is overridden; everything would still work both for packages the developer has not overridden and for those they deliberately override.

I'm not sure what the plan is for when Java 7 comes out, but I assume it will be a while before the IDE and Platform could be expected to run on a scheme where JRE packages are themselves modules that can be ignored as needed (if the goal with Java 7 is to break out all packages into their own modules, that is). Maybe something could be done differently at that point, but for now there seems to be no easy way around the base class loader of the JRE to control how things are loaded at that level, short of completely ignoring packages as is done now. And how are the decisions to block packages made now? Are libraries blocked if they are not a dependency of the core Platform and IDE packages?
Anyway, that was just a thought. It seems the current proposal means any module one depends on which is "Sealed" would also need to block access to the lower-level class loaders, or it still leaves open the hole of dependent modules seeing two different versions of the same class.

Reversing the order is like a layered image: the pieces are placed from bottom to top, yet the class loader sees them from the top down, so classes are filtered to the class loader the way pixels are filtered to a viewer's eye - the pixels on top block the ones below. Again, at first glance, the only time class loading issues would arise is when a package has been overridden at one layer without also overriding the packages and classes it depends on. This could end in a downward spiral where dependency after dependency has to be overridden, to the point that an entire subset of the JRE is overridden and held in a layer - but at that point too many packages would be intertwined and a very bad design would be in place anyway.
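The "reverse the order" idea described above is essentially a child-first (parent-last) class loader. A minimal sketch, using a plain URLClassLoader rather than the NetBeans ProxyClassLoader (ChildFirstLoader is an invented name for illustration), might look like this:

```java
import java.net.URL;
import java.net.URLClassLoader;

/**
 * Child-first class loader sketch: try this loader's own URLs before
 * delegating to the parent, so bundled copies of a package win over the
 * JRE/application classpath. java.* must still come from the parent.
 */
public class ChildFirstLoader extends URLClassLoader {

    public ChildFirstLoader(URL[] urls, ClassLoader parent) {
        super(urls, parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null && !name.startsWith("java.")) {
                try {
                    c = findClass(name); // look in this loader's own URLs first
                } catch (ClassNotFoundException e) {
                    // not found locally; fall through to parent delegation
                }
            }
            if (c == null) {
                c = super.loadClass(name, false); // normal parent-first path
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }
}
```

Note that blanket child-first loading makes the inconsistent-view scenario from Jesse's M1/M2/M3 example easier to hit, since every module silently prefers its own copy; that is presumably why the thread leans toward an explicit per-module declaration instead.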
One of the solutions the Jini platform uses is its PreferredClassLoader. This class loader is aimed at making sure that implementation classes can be loaded from the correct JAR, and not confused with other/older versions on the classpath. The way it works is to look for META-INF/PREFERRED.LIST in the first JAR of the URL list of its super URLClassLoader. It then uses that list to decide on the "preferred" status of the classes it is loading. This really helps with mobile code and with modular systems where you really want the codebase of the service/module to be in charge of the source of each class that it needs.
*** Issue 125655 has been marked as a duplicate of this issue. ***
*** Issue 125493 has been marked as a duplicate of this issue. ***
FWIW, this is seen as a high-priority bug by many. Developers expect that when using the NetBeans Platform, they can use any classes they'd be able to use in any other Swing application. They're not happy when they learn this is not true, because they've usually spent several hours trying to figure out why it fails. We actually patch the platform for this reason, with a patch almost identical to what Jarda listed in 125655 (the only material difference is that our patch also allows loading annotation classes). I never could get any other workaround to work, at least without unwanted consequences (too many extra dependencies, for example).
Likely candidates for this API can be calculated as follows:

  find /space/jdk6/* -name \*.jar -exec zipinfo {} \; | perl -ni -e 'if (m!^-r.+ (\S+)/[^/]+$!) {$p = $1; $p =~ s!/!\\.!g; $ps{$p} = 1} END {print "(" . join("|", sort keys %ps) . ")"}' > /tmp/packages && egrep "<(package|subpackages)>`cat /tmp/packages`</" {,contrib/}*/nbproject/project.xml | perl -pi -e 's!/nbproject/project\.xml:\s*!: !g; s!</?[a-z]+>!!g'
Created attachment 56774 [details] Candidate modules which might want to override JRE packages
The packages (or package prefixes) currently suppressed from the JRE are only:

  com.sun.javadoc
  com.sun.source
  com.sun.tools.javac
  com.sun.tools.javadoc
  javax.annotation
  javax.jws
  javax.lang.model
  javax.tools
  javax.xml.bind
  javax.xml.soap
  javax.xml.stream
  javax.xml.ws

This means that some modules are bundling packages which are in fact sometimes ignored; e.g. libs.jsr223 bundles javax.script, but this is picked up from the JRE when running on JDK 6+.
Created attachment 56782 [details] Proposed patch
Please review the attached patch and proposed API change.
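For reviewers without the patch handy, a sketch of how such a declaration might read in a module manifest. The attribute name below is illustrative only (the actual spelling is defined by the attached patch); the ModuleFormat2 token is the one discussed in the follow-up comment:

```
OpenIDE-Module: org.example.scripting/1
OpenIDE-Module-Hide-Classpath-Packages: javax.script.**
OpenIDE-Module-Requires: org.openide.modules.ModuleFormat2
```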
Well done. Maybe the sample in the documentation could also mention that people should require ModuleFormat2 when adding this tag, though I can see this has pros and cons. Re: the jsr223 module - that one is very happy the classes are taken from JRE 6. It would be even better if there were a way to drop the dependency on the jsr223 module when running on JRE 6, keeping it only for JRE 5...
4e265f58c811
This issue had *4 votes* before move to platform component