You'll need an understanding of weak references to have a meaningful stab at this one.
import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.List;

public class Test {
    public static void main(String[] args) {
        List<byte[]> list = new ArrayList<byte[]>();
        Object thing = new Object();
        WeakReference<Object> ref = new WeakReference<Object>(thing);
        while (ref.get() != null) {
            list.add(new byte[10000]);
        }
        System.out.println("bam");
    }
}
Before reading any further, give it a stab and see what you come up with. Then run it and see. Oh, and yes, it compiles and runs.
...
Well, there are two possible outcomes that you could have sensibly come up with. The first is what most people assume will always happen - the list of byte arrays will fill the heap and cause an OutOfMemoryError, and the weakly referenced object will never be garbage collected, because a strong reference (thing) remains in scope throughout the program and thus doesn't "let go" of the object. And indeed, if you run it with earlier versions of Java, this is exactly what happens.
However, with the latest version of JDK7 (update 2 at the time of writing) the program doesn't do the above; it prints out "bam" and exits cleanly. This initially seems weird - there's still a strong reference in scope, so why on earth is the object garbage collected?
It boils down to improvements in the JIT. It realises that the strong reference (thing) isn't actually used again, so it effectively sets it to null, making the object eligible for garbage collection. Put something like "thing.getClass();" as the last line of the program and it always forces the OutOfMemoryError, because in that case the JIT can't make this optimisation. It's a behaviour that's actually explicitly covered in the JLS:
Optimizing transformations of a program can be designed that reduce the number of objects that are reachable to be less than those which would naively be considered reachable. For example, a compiler or code generator may choose to set a variable or parameter that will no longer be used to null to cause the storage for such an object to be potentially reclaimable sooner.
http://java.sun.com/docs/books/jls/third_edition/html/execution.html#12.6.1
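If you want to see the keep-alive behaviour for yourself, here's a sketch of the modified program (the class name TestKeepAlive is just my label for the variant). Because thing is used again after the loop, the JIT can no longer treat the reference as dead, so the loop should run until the heap is exhausted on any JVM. On Java 9 and later, Reference.reachabilityFence(thing) is the explicit way to express the same intent.

import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.List;

public class TestKeepAlive {
    public static void main(String[] args) {
        List<byte[]> list = new ArrayList<byte[]>();
        Object thing = new Object();
        WeakReference<Object> ref = new WeakReference<Object>(thing);
        while (ref.get() != null) {
            list.add(new byte[10000]);
        }
        System.out.println("bam");
        // Because 'thing' is read again here, the JIT has to keep it alive
        // across the loop, so ref.get() never returns null and the program
        // dies with an OutOfMemoryError instead of printing "bam".
        thing.getClass();
    }
}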
The lesson? Objects can be garbage collected as soon as the JIT decides they're no longer used, and that may be earlier than it seems - which in turn means that finalizers can be called while the object might trivially appear to be in scope and thus ineligible for collection.
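To make the finalizer half of that lesson concrete, here's a rough sketch (the class and method names are mine) that moves the same trick inside an instance method. As with the puzzle, what you actually see depends on the JVM: if it applies the optimisation, the receiver can be collected and finalized while work() is still running; if not, you'll hit the OutOfMemoryError instead.

import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.List;

public class FinalizeEarly {

    static class Resource {
        @Override
        protected void finalize() {
            System.out.println("finalize() ran while work() was still on the stack");
        }

        void work() {
            WeakReference<Resource> self = new WeakReference<Resource>(this);
            // 'this' is never used again below, so the JIT may treat the
            // receiver as unreachable even though work() hasn't returned.
            List<byte[]> list = new ArrayList<byte[]>();
            while (self.get() != null) {
                list.add(new byte[10000]);
            }
            System.out.println("weak reference to 'this' was cleared inside work()");
        }
    }

    public static void main(String[] args) throws InterruptedException {
        new Resource().work();
        // Give the finalizer thread a moment to print before the JVM exits.
        Thread.sleep(1000);
    }
}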