Interfacing null-safe code with legacy code
When you adopt null annotations like these, your ultimate hope is that the compiler will tell you about every possible NullPointerException (NPE) in your program (barring tricks like reflection, bytecode weaving, etc.). Hallelujah.
Unfortunately, most of us use libraries that don’t have the blessing of annotation-based null analysis, simply because they are not annotated appropriately (neither in source nor via external annotations). Let’s for now call such code “legacy”.
In this post I will walk through the options to warn you about the risks incurred by legacy code. The general theme will be:
Can we assert that no NPE will happen in null-checked code?
I.e., if your code consistently uses null annotations, and has passed the analysis without warnings, can we be sure that NPEs can only ever be thrown in the legacy part of the code? (NPEs inside legacy code are still to be expected; there’s nothing we can change about that.)
Using existing Eclipse versions, one category of problems would still go undetected, whereby null-checked code could still throw an NPE. This has recently been fixed.
Simple data flows
Let’s start with simple data flows, e.g., when your program obtains a value from legacy code, like this:

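The original listing is not preserved here; the following is a minimal sketch of such a flow, using the Properties.getProperty() API the surrounding text refers to (class and key names are my own):

```java
import java.util.Properties;

class PropertyDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Under @NonNullByDefault this local would have type @NonNull String,
        // but getProperty() is legacy code: it returns null for missing keys.
        String value = props.getProperty("missing.key");
        System.out.println(value == null); // prints: true
    }
}
```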
You shouldn’t be surprised, the javadoc even says: “The method returns null if the property is not found.” While the compiler doesn’t read javadoc, it can recognize that a value with unspecified nullness flows into a variable with a non-null type. Hence the warning:
Null type safety (type annotations): The expression of type ‘String’ needs unchecked conversion to conform to ‘@NonNull String’
As we can see, the compiler warned us, so we are urged to fix the problem in our code. Conversely, if we pass any value into a legacy API, anything bad that can happen would happen inside legacy code, so nothing needs to be done for our stated goal.
The underlying rule is: legacy values can be safely assigned to nullable variables, but not to non-null variables (example Properties.getProperty()). On the other hand, any value can be assigned to a legacy variable (or method argument).
Put differently: values flowing from null-checked to legacy pose no problems, whereas values flowing the opposite direction must be assumed to be nullable, to avoid problems in null-checked code.
Enter generics
Here be dragons.
As a minimum requirement we now need null annotations with target TYPE_USE (“type annotations”), but we have this since 2014. Good.

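The original listing is not preserved here; a hypothetical reconstruction of the scenario described below (class and method names are my guesses) could look like this:

```java
import java.util.ArrayList;
import java.util.List;

class Legacy { // not null-annotated
    static List<String> getNames() {
        List<String> names = new ArrayList<>();
        names.add("Alice");
        names.add(null); // perfectly legal in legacy code
        return names;
    }
}

class Checked { // imagine @NonNullByDefault here
    public static void main(String[] args) {
        List<String> names = Legacy.getNames(); // would be List<@NonNull String>
        System.out.println(names.size());       // prints 2: the list itself is non-null
        for (String name : names)
            System.out.println(name.toUpperCase()); // NPE on the null element
    }
}
```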
Here we obtain a List<String> value from a Legacy class, where the list names is indeed non-null (as witnessed by the successful output of names.size()). Still, things go south in our code, because the list contains an unexpected null element.
To protect us from this problem, I marked the entire class as @NonNullByDefault, which causes the type of the variable names to become List<@NonNull String>. Now the compiler can again warn us about an unsafe assignment:
Null type safety (type annotations): The expression of type ‘List<String>’ needs unchecked conversion to conform to ‘List<@NonNull String>’
This captures the situation where a null value, wrapped inside a non-null container value (the list), is passed from legacy to null-checked code.
Here’s a tricky question:
Is it safe to pass a null-checked value of a parameterized type into legacy code?
In the case of simple values, we saw no problem, but the following example tells us otherwise once generics are involved:

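Again the original listing is not preserved; a sketch of the situation described below (names are my reconstruction) might be:

```java
import java.util.ArrayList;
import java.util.List;

class Legacy { // not null-annotated
    static void printNames(List<String> names) {
        for (String name : names)
            System.out.println(name);
        names.add(null); // invisible breach of the caller's @NonNull contract
    }
}

class Checked { // imagine @NonNullByDefault here
    public static void main(String[] args) {
        List<String> names = new ArrayList<>(); // conceptually List<@NonNull String>
        names.add("Alice");
        Legacy.printNames(names); // hidden data flow back into our list
        for (String name : names)
            System.out.println(name.length()); // NPE in null-checked code
    }
}
```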
Again we have a list of type List<@NonNull String>, so dereferencing values obtained from that list should never throw an NPE. Unfortunately, the legacy method printNames() managed to break our contract by inserting null into the list, resulting in yet another NPE thrown in null-checked code.
To describe this situation it helps to draw boundaries not only between null-checked and legacy code, but also to draw a boundary around the null-checked value of parameterized type List<@NonNull String>. That boundary is breached when we pass this value into legacy code, because that code will only see List<String> and happily invoke add(null).
This is where I recently invented a new diagnostic message:
Unsafe null type conversion (type annotations): The value of type ‘List<@NonNull String>’ is made accessible using the less-annotated type ‘List<String>’
By passing names into legacy code, we enable a hidden data flow in the opposite direction. In the general case, this introduces the risk of NPE in otherwise null-checked code. Always?
Wildcards
Java would be a much simpler language without wildcards, but a closer look reveals that wildcards help not only with type safety but also with null safety. How so?
If the legacy method were written using a wildcard, it would not be (easily) possible to sneak in a null value. Here are two attempts:

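A sketch of the wildcard variant (again, the original listing is not preserved, so names are my reconstruction); both attempts are shown as comments so the class itself compiles:

```java
import java.util.List;

class Legacy {
    static void printNames(List<?> names) {
        // Attempt 1: an outright Java type error -- nothing except null
        // can be passed for the capture of '?':
        // names.add("x");     // does not compile
        // Attempt 2: plain javac accepts adding null to a List<?>,
        // but Eclipse's null analysis warns even without annotations:
        // names.add(null);
        for (Object name : names)
            System.out.println(name);
    }
}
```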
The first attempt is an outright Java type error. The second triggers a warning from Eclipse, despite the lack of null annotations:
Null type mismatch (type annotations): ‘null’ is not compatible to the free type variable ‘?’
Of course, compiling the legacy class without null-checking would still bypass our detection, but chances are already better.
If we add an upper bound to the wildcard, as in List<? extends CharSequence>, not much changes. A lower bound, however, is an invitation for legacy code to insert null at will: List<? super String> will cause names.add() to accept any String, including the null value. That’s why Eclipse also complains about lower-bounded wildcards:
Unsafe null type conversion (type annotations): The value of type ‘List<@NonNull String>’ is made accessible using the less-annotated type ‘List<? super String>’
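To illustrate (my own sketch, not the original listing): with a lower-bounded wildcard, plain Java happily lets legacy code insert null, with no error and no warning:

```java
import java.util.ArrayList;
import java.util.List;

class Legacy {
    static void printNames(List<? super String> names) {
        names.add(null); // compiles cleanly in plain Java
    }
}

class Demo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>(); // conceptually List<@NonNull String>
        names.add("Alice");
        Legacy.printNames(names);
        System.out.println(names.get(1) == null); // prints: true
    }
}
```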
Comparing to raw types
It has been suggested to treat legacy (not null-annotated) types like raw types. Both are types with a part of the contract ignored, thereby causing risks for parts of the program that still rely on the contract.
Interestingly, raw types are more permissive in the parameterized-to-raw conversion. We are generally not protected against legacy code inserting an Integer into a List<String> when passed as a raw List.
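A small sketch of that permissiveness (my own example): once a List<String> is passed as a raw List, nothing but an unchecked warning stands between legacy code and heap pollution:

```java
import java.util.ArrayList;
import java.util.List;

class Legacy {
    @SuppressWarnings({"rawtypes", "unchecked"})
    static void fill(List items) {          // raw type: the element contract is gone
        items.add(Integer.valueOf(42));     // only an unchecked warning, no error
    }
}

class RawDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        Legacy.fill(strings);               // heap pollution
        String s = strings.get(0);          // ClassCastException at runtime
    }
}
```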
More interestingly, using a raw type as a type argument produces an outright Java type error, so my final attempt at hacking the type system failed:

Summary
We have seen several kinds of data flow with different risks:
- Simple values flowing checked-to-legacy don’t cause any specific headache.
- Simple values flowing legacy-to-checked should be treated as nullable to avoid bad surprises. This is checked.
- Values of parameterized type flowing legacy-to-checked must be handled with care at the receiving side. This is checked.
- Values of parameterized type flowing checked-to-legacy add more risks, depending on:
  - nullness of the type argument (a @Nullable type argument poses no risk)
  - presence of wildcards, unbounded or lower-bounded.
Eclipse can detect all mentioned situations that would cause an NPE to be thrown from null-checked code – the capstone will be released with Eclipse 2020-03, i.e., coming soon …
