Java had a similar but inverse problem in early versions: a counter-intuitive behavior that bit people and caused leaks.
If you instantiated a Thread and start() was never called, that Thread object would leak, and with it potentially an entire graph of objects, via the reference chains beneath it.
Obviously a thread that is never started seems pointless by design. But it could happen easily if, for example, an error occurred or an exception was thrown somewhere between the instantiation line and the call to start().
The root cause was that Sun's engineers had made early implementations of Thread register with a ThreadGroup by default, under the hood. That ThreadGroup stayed alive and reachable, which kept your app's Thread object reachable too, so the GC would never clean it up. It was never eligible.
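A minimal sketch of the pattern, assuming an old JVM where the Thread constructor registered the thread with its ThreadGroup (class and method names like ExpensiveJob and prepareUpload are illustrative, not from the original story):

    public class UnstartedThreadLeak {

        static class ExpensiveJob implements Runnable {
            // Large object graph that the unstarted Thread would keep reachable.
            private final byte[] bigBuffer = new byte[16 * 1024 * 1024];

            @Override
            public void run() {
                System.out.println("working with " + bigBuffer.length + " bytes");
            }
        }

        static void prepareUpload() {
            // Imagine validation or I/O here that can fail.
            throw new IllegalStateException("preparation failed");
        }

        public static void main(String[] args) {
            try {
                Thread worker = new Thread(new ExpensiveJob()); // on affected JVMs: added to a ThreadGroup here
                prepareUpload();                                // throws before start()
                worker.start();                                 // never reached
            } catch (IllegalStateException e) {
                // On affected versions, 'worker' (and its 16 MB buffer) stayed
                // reachable via the ThreadGroup even after this handler exits,
                // so the GC never reclaimed it.
                System.err.println("setup failed: " + e.getMessage());
            }
        }
    }

On modern JVMs the constructor no longer keeps such a reference, so the unstarted Thread above becomes collectible once 'worker' goes out of scope.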
It ended up being the cause of a few weird leaks we saw in production.
IIRC in Java 1.4 or 1.5 Sun fixed it by ensuring the thread got cleaned up in those cases.