I keep running into the same problem on teams of all sizes: a project builds on one machine but fails on another because the Java version silently differs. The code is correct, the tests are solid, yet the build refuses to cooperate. That friction usually traces back to a missing or inconsistent Java version declaration in Maven. In this guide I show you how I set the Java version in Maven so the build is deterministic across laptops, CI agents, and containers. I’ll cover the compiler plugin, toolchains, profiles, and how I verify the effective configuration. You’ll also see which approach I pick in 2026 and why, how to handle multi-module builds, and the real-world edge cases that cause the most confusion.
I write this as someone who maintains large Maven builds and reviews build files for a living. You should walk away with a clear mental model and copy‑ready snippets that you can paste into pom.xml and run right away.
Why the Java version in Maven can drift without you noticing
Maven itself does not enforce a specific JDK unless you tell it to. When you run mvn package, the Maven process uses the JDK from JAVA_HOME (or whichever java is first on your PATH), and the Maven Compiler Plugin picks a default source and target (historically 1.5 if you didn't specify, though modern plugin versions set a higher baseline). That means two developers on the same codebase can compile to different bytecode levels without a single warning. In practice, the drift shows up as:
- Compiles on Java 21 but fails on Java 17 due to preview or removed APIs.
- Class files are compiled for a version that the runtime does not support (for example, building on Java 21 and running on Java 17 yields "Unsupported major.minor version").
- CI runs on a different JDK than local machines, so tests fail only in CI.
- Multi-module builds accidentally mix bytecode targets, breaking shading or test execution.
I treat the Java version in Maven as part of the API contract of the build. If you don’t lock it down, you’re leaving an unpredictable variable in the system.
The core concept: separate “runtime JDK” from “compile target”
A common misunderstanding is that the “Java version” means one thing. In reality, there are at least two distinct concerns:
1) The JDK that runs Maven and the compiler. This is the runtime of the build itself.
2) The language level and bytecode level you want the output to target.
You can, for example, run Maven with JDK 21 but target Java 17 bytecode. That can be a valid strategy when your CI and developer machines are on a newer JDK but production is still on 17. The right setup explicitly encodes this difference. If you only set source and target, Maven will compile for the requested version but still use whatever JDK runs the build. That can fail when the requested target is higher than the running JDK, or when you rely on APIs that are only present in a newer JDK.
In my experience, the safest model is:
- Define the desired language/bytecode level explicitly (source/target or release).
- Define the JDK used for compilation explicitly (toolchains or CI build JDK).
- Verify what Maven actually resolved by inspecting the effective POM.
The simplest and most common approach: Maven Compiler Plugin
This is the baseline I set in nearly every project. It’s simple, explicit, and easy to maintain. The modern default is to use the release flag instead of source and target because it correctly configures both the language level and the standard library API level.
Here is a complete, runnable example you can paste into a fresh Maven project.
```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.acme</groupId>
  <artifactId>java-version-demo</artifactId>
  <version>1.0.0</version>

  <properties>
    <maven.compiler.release>17</maven.compiler.release>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.13.0</version>
        <configuration>
          <release>${maven.compiler.release}</release>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
```
Why do I set it this way?
- The properties section keeps your target version centralized, so you can reuse it elsewhere (Surefire, Javadoc, Enforcer, etc.).
- release is more correct than setting both source and target, because it also selects the appropriate standard library API.
- An explicit plugin version removes uncertainty when the plugin updates on different machines.
You can use source and target if you need very old compatibility or have a plugin that doesn’t support release, but in 2026 I almost always use release.
When you should prefer release over source and target
I recommend release in these cases:
- You need to ensure the bytecode and the available JDK APIs match a specific version.
- You want builds to fail quickly if the running JDK is too old for the target.
- You are supporting a runtime lower than your build JDK.
Using source and target without release can lead to accidental usage of newer APIs that are not available at runtime. I have seen that issue in production systems more times than I can count.
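To make that failure mode concrete, here is a minimal sketch (ReleaseDemo is a hypothetical class, not from any project above). String.strip() was added in Java 11: compiling this with -source 8 -target 8 on a newer JDK succeeds, because source/target do not restrict the standard library API, and the class then throws NoSuchMethodError on a real Java 8 runtime. Compiling with --release 8 fails immediately at compile time instead, which is exactly the early failure you want.

```java
// Sketch: demonstrates why release is safer than source/target.
// String.strip() exists only since Java 11, so "--release 8" rejects
// this code at compile time, while "-source 8 -target 8" lets it
// through and defers the failure to a Java 8 runtime.
public class ReleaseDemo {
    public static void main(String[] args) {
        // strip() removes leading and trailing whitespace (Java 11+ API)
        System.out.println("[" + "  padded  ".strip() + "]"); // prints "[padded]"
    }
}
```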
Toolchains: the most reliable way to force the compiler JDK
Setting release is necessary but not sufficient if your environment has multiple JDKs. When a developer has both JDK 21 and JDK 17 installed, Maven runs on whatever their JAVA_HOME points to, so their compile JDK can be inconsistent with the rest of the team's.
Toolchains solve this by telling Maven which JDK to use for compilation. This is the approach I prefer for stable CI and for teams that use multiple JDKs locally.
Step 1: Create a toolchains file
On each developer machine (and in CI), create a toolchains.xml. The standard location is:
- macOS/Linux: ~/.m2/toolchains.xml
- Windows: %USERPROFILE%\.m2\toolchains.xml
Example toolchains.xml:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<toolchains>
  <toolchain>
    <type>jdk</type>
    <provides>
      <version>17</version>
      <vendor>temurin</vendor>
    </provides>
    <configuration>
      <jdkHome>/Library/Java/JavaVirtualMachines/temurin-17.jdk/Contents/Home</jdkHome>
    </configuration>
  </toolchain>
</toolchains>
```
Step 2: Enable toolchains in the POM
Add the toolchains plugin to your build. This tells Maven to use the JDK from the toolchain when compiling.
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-toolchains-plugin</artifactId>
  <version>3.2.0</version>
  <executions>
    <execution>
      <goals>
        <goal>toolchain</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <toolchains>
      <jdk>
        <version>17</version>
        <vendor>temurin</vendor>
      </jdk>
    </toolchains>
  </configuration>
</plugin>
```
This is not overkill. It’s the most consistent way I’ve found to ensure the correct compiler JDK is used everywhere. You can still run Maven with any JDK, but the compiler will be the one you specify.
Toolchains vs. just setting JAVA_HOME
I often compare them like this:
- JAVA_HOME is a per-shell decision that people forget to set.
- Toolchains are an explicit declaration that Maven reads every time.
If you are working on a team or in CI, toolchains are worth the extra setup. If you are in a solo project with one JDK installed, you might skip them, but I still use them in my own projects to prevent future surprises.
Multi-module builds: one place to define the version
Most real Maven projects have parent and child modules. You should define the Java version in the parent POM to avoid mismatches. The idea is to put the properties and plugin config once and let the children inherit them.
Here is a parent POM snippet that you can reuse:
```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.acme</groupId>
  <artifactId>parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>

  <properties>
    <maven.compiler.release>21</maven.compiler.release>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.13.0</version>
        <configuration>
          <release>${maven.compiler.release}</release>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
```
Then in each module, you only need to declare the plugin without extra configuration, or even skip it entirely if you just inherit it. I prefer pluginManagement so child modules can declare the plugin by name and get the consistent config without duplication.
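As a sketch of that pluginManagement variant (using the same property name as the parent snippet above), the parent's build section becomes:

```xml
<!-- In the parent POM: configure the plugin once under pluginManagement.
     Children that declare maven-compiler-plugin by groupId/artifactId
     inherit this version and release configuration without duplication. -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.13.0</version>
        <configuration>
          <release>${maven.compiler.release}</release>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
```

Each child module then lists only the plugin's groupId and artifactId; the version and the release value come from the parent.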
When to use profiles for different Java versions
Sometimes you truly need different Java versions for different build scenarios. I see this in libraries that support both Java 11 and Java 21, or in services that must run in multiple environments. Maven profiles can help, but you should use them sparingly because they complicate the build.
Here is a profile‑based setup that targets Java 11 by default and Java 21 when a profile is activated:
```xml
<properties>
  <maven.compiler.release>11</maven.compiler.release>
</properties>

<profiles>
  <profile>
    <id>java21</id>
    <properties>
      <maven.compiler.release>21</maven.compiler.release>
    </properties>
  </profile>
</profiles>
```
You can activate it with:
mvn -Pjava21 package
How I decide whether to use profiles
I only use profiles when I must maintain multiple binaries. If you are building a single application that runs in one environment, keep it simple and declare one version. Profiles are most useful for library authors who need to publish artifacts for multiple runtime baselines or for applications that have staging vs production discrepancies (which I usually try to remove).
Common mistakes and how I avoid them
These are the issues I see most often when people first try to lock down the Java version in Maven.
Mistake 1: Setting only source and forgetting target
If you set source but not target, your code might compile with the desired language level but generate bytecode that’s too new for your runtime. I avoid this by using release or setting both source and target when I must.
Mistake 2: Omitting plugin versions
If you don’t set a version for maven-compiler-plugin, you rely on whatever version Maven resolves by default, which may differ between machines or change when Maven itself updates. I pin the plugin version in every project.
Mistake 3: Assuming the build JDK equals the target JDK
Developers often believe that because they set release=17, Maven will automatically use JDK 17 to compile. It does not. It will use the running JDK, so if you run Maven with JDK 21 you may still compile with JDK 21. Toolchains fix this.
Mistake 4: Mixing modules with different Java targets
In multi-module builds, one module sets Java 11 and another sets Java 21. This can cause runtime linkage errors if modules interact. I enforce the version at the parent and ban overrides unless there’s a very strong reason.
Mistake 5: Not verifying the effective configuration
You should always confirm what Maven really used. I often run:
mvn help:effective-pom
Then I search the output for maven-compiler-plugin and confirm the release value. It takes 30 seconds and saves hours of debugging.
Practical examples with real-world scenarios
Below are scenarios I’ve handled and the approach I recommend, with concrete configuration.
Scenario A: Service runs on Java 17 but dev machines use Java 21
You want developers to enjoy new tooling but keep runtime compatibility with 17.
Recommended setup:
- Set maven.compiler.release to 17
- Use toolchains to compile with JDK 17
- Allow Maven itself to run on 21 if desired
```xml
<properties>
  <maven.compiler.release>17</maven.compiler.release>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.13.0</version>
      <configuration>
        <release>${maven.compiler.release}</release>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-toolchains-plugin</artifactId>
      <version>3.2.0</version>
      <executions>
        <execution>
          <goals>
            <goal>toolchain</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <toolchains>
          <jdk>
            <version>17</version>
          </jdk>
        </toolchains>
      </configuration>
    </plugin>
  </plugins>
</build>
```
Scenario B: Library supports Java 11 and Java 17
Here I typically produce two builds. One is the default Java 11 build, and the other is a profile for Java 17.
```xml
<properties>
  <maven.compiler.release>11</maven.compiler.release>
</properties>

<profiles>
  <profile>
    <id>java17</id>
    <properties>
      <maven.compiler.release>17</maven.compiler.release>
    </properties>
  </profile>
</profiles>
```
Then I publish two artifacts (often with classifiers or separate modules). You should keep the default as the lowest supported runtime, because that creates the most compatible artifact.
Scenario C: Mixed Java/Kotlin or Java/Scala build
If you use Kotlin or Scala alongside Java, you must ensure the target bytecode version is consistent across all compilers. For example, Kotlin has jvmTarget and Scala has target or release equivalents. I make sure all compilers target the same level and verify their outputs. This avoids classfile mismatch errors at runtime.
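For Kotlin, a minimal sketch of that alignment (assuming the standard kotlin-maven-plugin and a kotlin.version property, which you would define yourself):

```xml
<plugin>
  <groupId>org.jetbrains.kotlin</groupId>
  <artifactId>kotlin-maven-plugin</artifactId>
  <version>${kotlin.version}</version>
  <configuration>
    <!-- Keep this equal to maven.compiler.release so the Kotlin
         compiler emits the same bytecode level as javac -->
    <jvmTarget>17</jvmTarget>
  </configuration>
</plugin>
```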
Verification workflow I use on every project
I use a simple checklist to confirm Maven is compiling the way I expect.
1) Check the running JDK:
mvn -v
This tells me the JDK running Maven. If it’s not the JDK I expect, I update JAVA_HOME or rely on toolchains.
2) Inspect the effective POM:
mvn help:effective-pom
I search for maven-compiler-plugin and verify release.
3) Inspect the compiled bytecode:
javap -verbose target/classes/com/acme/App.class | grep "major version"
This should match the target you set (for example, 61 for Java 17 or 65 for Java 21). If it doesn't, something in your build is overriding the configuration.
I consider step 3 essential when there are multiple modules or plugins involved.
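The javap check can also be done programmatically. Here is a sketch (ClassVersionCheck is a hypothetical helper, not part of any tool above) that reads the class-file header directly: the magic number 0xCAFEBABE occupies bytes 0-3, the minor version bytes 4-5, and the major version bytes 6-7.

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassVersionCheck {
    // Extracts the major version from raw class-file bytes:
    // magic 0xCAFEBABE (bytes 0-3), minor (bytes 4-5), major (bytes 6-7).
    static int majorVersion(byte[] classFile) {
        long magic = ((classFile[0] & 0xFFL) << 24) | ((classFile[1] & 0xFF) << 16)
                   | ((classFile[2] & 0xFF) << 8) | (classFile[3] & 0xFF);
        if (magic != 0xCAFEBABEL) {
            throw new IllegalArgumentException("not a class file");
        }
        return ((classFile[6] & 0xFF) << 8) | (classFile[7] & 0xFF);
    }

    public static void main(String[] args) throws Exception {
        // Pass a path such as target/classes/com/acme/App.class
        byte[] bytes = Files.readAllBytes(Path.of(args[0]));
        System.out.println("major version: " + majorVersion(bytes));
    }
}
```

A check like this is easy to wire into a CI step that fails the build when the major version does not match the declared release.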
Traditional vs modern approach in 2026
Here’s how I frame it when mentoring teams. I prefer the modern approach because it is clearer and more reliable over time.
Traditional (legacy):
- source + target
- JAVA_HOME decides the compiler JDK
- Informal conventions
- Java version hardcoded per module
- "It builds on my machine"

Modern (what I recommend in 2026):
- release
- Toolchains declare the compiler JDK
- Enforcer rules
- One property in the parent POM
- effective-pom + bytecode check

The modern approach takes a little more setup, but it saves far more time than it costs.
Enforcing Java versions with the Maven Enforcer Plugin
The Enforcer Plugin is my guardrail. It prevents accidental builds on the wrong JDK and helps keep teams aligned.
Here is a configuration I use often:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.5.0</version>
  <executions>
    <execution>
      <id>enforce-java</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireJavaVersion>
            <version>[17,)</version>
          </requireJavaVersion>
        </rules>
        <fail>true</fail>
      </configuration>
    </execution>
  </executions>
</plugin>
```
This rule says the build requires Java 17 or higher to run Maven. That doesn’t necessarily enforce the compile target, but it prevents running the build with a JDK that cannot handle your chosen release value.
If you want a stricter rule, you can tie it to an exact version range, but I usually allow a minimum and let toolchains enforce the precise compiler version.
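For reference, a stricter rules block might pin the build to the 17.x line only (a sketch; the surrounding plugin declaration stays the same as the Enforcer example above):

```xml
<rules>
  <requireJavaVersion>
    <!-- [17,18) accepts any JDK 17.x and rejects both older
         and newer JDKs, e.g. 11 and 21 -->
    <version>[17,18)</version>
  </requireJavaVersion>
</rules>
```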
Handling preview features and new language levels
If you need preview features (for example, a Java 21 preview or a Java 23 preview), you must enable them in the compiler plugin and at runtime. Here’s how I do it in Maven:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.13.0</version>
  <configuration>
    <release>21</release>
    <compilerArgs>
      <arg>--enable-preview</arg>
    </compilerArgs>
  </configuration>
</plugin>
```
You also need to pass --enable-preview when running tests or executing the application. I wire this into the Surefire or Failsafe plugins.
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.2.5</version>
  <configuration>
    <argLine>--enable-preview</argLine>
  </configuration>
</plugin>
```
I only use preview features for experimental work. For production systems, I stick to stable releases because preview features can be removed or changed.
Edge cases: shading, annotation processors, and JPMS
Some plugins behave differently depending on the Java version. Here are the three places I check carefully.
Shading or bytecode manipulation
Tools like the Maven Shade Plugin or bytecode manipulators can break if their toolchain is not aligned with your target. Make sure they run with a compatible JDK and that they handle the target bytecode version. I keep the toolchain set to the target JDK to avoid surprises.
Annotation processors
Annotation processors compile against the JDK they run on. If your target is Java 17 but your processor uses a JDK 21 API, you will get confusing errors. I make sure processors are compatible with the target and that the processor path is stable.
Java Platform Module System (JPMS)
JPMS warnings or errors change across Java versions. If you build modular jars, keep the build JDK consistent and test on the runtime you ship. I also ensure my module-info.java is compiled with the correct release, because the module graph can differ across versions.
Performance considerations and build time impact
Setting a Java version in Maven does not directly slow down your build, but toolchains can add a small overhead when Maven resolves the JDK. In my experience, the difference is typically in the 10–30ms range for toolchain resolution, which is negligible compared to compilation time.
The real performance benefit is reduced build debugging time. Once the Java version is locked down, I see fewer CI failures caused by version drift. That’s a net time win for teams.
Modern tooling and AI-assisted workflows in 2026
My workflow in 2026 often includes AI-assisted checks that validate build configuration. I use tools that scan POM files and suggest consistent versions across plugins and modules. When AI tools propose changes, I still verify with effective-pom and javap because the actual compiled bytecode is the final truth.
If you use AI code assistants, ask them to propose a single source of truth property for the Java version and to align toolchains and Enforcer rules. It speeds up the routine work, but I keep the final verification manual because build correctness is non‑negotiable.
A concise “best default” setup I recommend
If you want one set of defaults you can drop into most projects, this is what I use. It assumes you target Java 17 and compile with toolchains.
```xml
<properties>
  <maven.compiler.release>17</maven.compiler.release>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.13.0</version>
      <configuration>
        <release>${maven.compiler.release}</release>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-toolchains-plugin</artifactId>
      <version>3.2.0</version>
      <executions>
        <execution>
          <goals>
            <goal>toolchain</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <toolchains>
          <jdk>
            <version>${maven.compiler.release}</version>
          </jdk>
        </toolchains>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>3.5.0</version>
      <executions>
        <execution>
          <id>enforce-java</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <requireJavaVersion>
                <version>[17,)</version>
              </requireJavaVersion>
            </rules>
            <fail>true</fail>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```
This gives you a stable baseline with minimal surprises. If you target Java 21, change the property and the toolchain will follow.
What not to do and why
There are a few approaches I recommend you avoid:
- Don’t rely on Maven defaults for the compiler plugin. Defaults change over time.
- Don’t set the Java version only in your IDE. Maven does not read IDE settings.
- Don’t hardcode the Java version in multiple places across modules. It will drift.
- Don’t silently override the compiler settings in one module; you’ll create compatibility problems.
Closing thoughts and your next steps
The most reliable builds I’ve seen all have one thing in common: a clearly defined Java version strategy that is enforced by Maven itself. If you only take one action after reading this, set the compiler release in your parent POM and pin the plugin version. That small change eliminates a huge class of build failures.
If you can go further, add toolchains and an Enforcer rule. This combination ensures the build JDK is correct, the bytecode target is correct, and Maven fails early when someone tries to build with the wrong setup. It’s the difference between “works on my machine” and “works everywhere.”
When you update Java versions, treat it as a versioned change to your build configuration. I usually handle it like a dependency upgrade: update the property, verify the effective POM, re-run tests, and inspect bytecode for a sanity check. That workflow sounds heavy, but in practice it takes only a few minutes and prevents days of confusion later.
If you want to extend this further, I recommend adding CI validation that checks the effective POM and the compiled class file version as part of your pipeline. That is the final guardrail that makes Java version drift almost impossible. Once you’ve done this a few times, it becomes routine, and your Maven builds become the most predictable part of your system.