When I review Java codebases in 2026, the stream forEach() call is still one of the most common terminal operations I see. It looks simple, but it hides a lot of behavior that can surprise you if you treat it like a plain for loop. I have watched teams accidentally reorder output in parallel streams, leak shared state, and even slow down fast pipelines because they used forEach() in the wrong place. If you want your stream pipelines to stay clear and correct, you need to understand what forEach() does, what it does not do, and how it behaves under ordering and parallelism. In this post I walk through the core semantics, show concrete examples, and point out mistakes I keep fixing in reviews. You will see how to make your Consumer actions safe, when to favor forEachOrdered(), and how to keep side effects under control. I will also compare the classic loop style with modern stream style using a clean table so you can choose the right tool for your case.
Why I reach for forEach, and when I do not
I use forEach() when I want to trigger a side effect at the end of a stream pipeline, such as writing to a log, pushing data to a cache, or printing to the console in a small utility. The key idea is that forEach() is a terminal operation: it consumes the stream and applies a Consumer action to each element. That is exactly what you want when the pipeline ends in a side effect.
But I do not use forEach() when I want to build a new value. If my goal is to produce a collection, sum, or grouped map, I go with collect(), reduce(), toList(), or other terminal operations that return data. The difference matters because forEach() encourages side effects and shared state, and those two things are the most common source of subtle bugs in stream code. In my experience, stream code stays healthy when the terminal operation returns a value, and side effects are kept small and explicit.
Here is the key mindset I use: forEach() is for finishing, not for transforming. The transformation should happen earlier in the pipeline with map, filter, flatMap, and friends. When I see a pipeline that uses forEach() to mutate external state, I start looking for a better alternative.
The actual contract of forEach() in Java
The method signature is:
void forEach(Consumer<? super T> action)
A few details in that signature are more important than they look:
- It takes a Consumer, which returns no value. That is a strong signal that forEach() is meant for side effects.
- It is terminal, which means it triggers evaluation of the pipeline. All intermediate operations before it are lazy until this call.
- It returns void, so you should not expect a new stream, collection, or summary out of it.
The stream API guarantees that the action is invoked once for each element, but the order depends on the stream. For sequential streams, the action follows encounter order when the stream has one. For parallel streams, the action can run concurrently in multiple threads, and the order is not guaranteed unless you use forEachOrdered().
I find it helpful to think of forEach() as a last-mile executor. It applies the action and then the stream is done. If you need to do more work, you need to build that into the pipeline or use a different terminal operation.
A simple, runnable example I still show new hires
Even with years of experience, I like to start with a tiny example. It demonstrates the basic flow: create a stream, apply terminal action.
import java.util.Arrays;
import java.util.List;
public class Main {
public static void main(String[] args) {
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
// Print each number in the list using forEach
numbers.stream().forEach(System.out::print);
}
}
Output:
12345
This code is short, but it already shows two facts: the stream is created from a list, and forEach() is the terminal operation that causes the elements to be processed.
When ordering matters: forEach() vs forEachOrdered()
Ordering is the point where many stream beginners get bitten. The key rule: forEach() does not guarantee order for parallel streams. If you need to preserve encounter order, use forEachOrdered() on a parallel stream or stay sequential.
Here is a compact example. Try switching between forEach() and forEachOrdered() and compare the output.
import java.util.stream.IntStream;
public class ParallelOrderDemo {
public static void main(String[] args) {
// Parallel stream prints in non-deterministic order
IntStream.rangeClosed(1, 10)
.parallel()
.forEach(i -> System.out.print(i + " "));
System.out.println();
// Parallel stream with ordering preserved
IntStream.rangeClosed(1, 10)
.parallel()
.forEachOrdered(i -> System.out.print(i + " "));
}
}
This is not a mere cosmetic detail. If you are writing to a file or sending messages downstream, ordering can change the behavior of your system. I have seen subtle data corruption appear only in production because tests ran sequentially.
My rule is simple: if the order is part of your requirement, make it explicit. Do not trust forEach() on a parallel stream.
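When the opposite is true and order genuinely does not matter, I like to state that in the pipeline too. Here is a small sketch (the class and helper names are mine) that pairs unordered() with a concurrent sink so a parallel forEach() stays safe:

```java
import java.util.List;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class ExplicitOrderDemo {
    // Visits every element in parallel; the order of visits is deliberately unspecified.
    static Set<Integer> visitAll(List<Integer> input) {
        Set<Integer> seen = ConcurrentHashMap.newKeySet(); // thread-safe sink
        input.parallelStream()
             .unordered()         // documents that encounter order is not a requirement
             .forEach(seen::add); // safe: the action only touches a concurrent collection
        return seen;
    }

    public static void main(String[] args) {
        Set<Integer> seen = visitAll(List.of(1, 2, 3, 4, 5));
        System.out.println("visited " + seen.size() + " elements"); // visited 5 elements
    }
}
```

The unordered() call changes nothing functionally here, but it records the decision in the code, so the next reader knows order was considered and rejected on purpose.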
Traditional loops vs modern streams: a clear comparison
I still use loops when they make the intent obvious. But I like streams when the pipeline tells a story: filter, map, sort, then finish. Here is a table I use in code reviews to help teams decide quickly.
Aspect | Traditional Loop | Modern Stream
Readability | Verbose, more state | Clear and fast pipeline
Side effects | Easy to slip in | forEach() at end makes them explicit
Parallelism | Manual thread handling | parallel() + forEachOrdered()
Building results | Loop with mutation | collect(toList()) is safer
I recommend streams when the transformations are the main story and you want a readable pipeline. I recommend loops when you need tight control over state or performance and the operation is simple.
Example: reverse sorting and printing integers
This is a classic example I use to show how sorted() and forEach() work together.
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
class Demo {
public static void main(String[] args) {
List<Integer> list = Arrays.asList(2, 4, 6, 8, 10);
// Sort in reverse order and print each element
list.stream()
.sorted(Comparator.reverseOrder())
.forEach(System.out::println);
}
}
Output:
10
8
6
4
2
The key lesson: the ordering change is part of the pipeline, while forEach() just applies the final action.
Example: printing a stream of strings
This is the simplest form of forEach() for a stream of text.
import java.util.Arrays;
import java.util.List;
class Demo {
public static void main(String[] args) {
List<String> list = Arrays.asList("Atlas", "River", "Luna", "Nimbus");
// Print each string in the stream
list.stream().forEach(System.out::println);
}
}
Output:
Atlas
River
Luna
Nimbus
I tend to keep examples like this when teaching because they keep the mental load low and show the basic flow without any distraction.
Example: extract a character after reverse sort
This example combines sorted(), flatMap(), and forEach() to show a more interesting flow. It is a good test for your understanding of how a stream pipeline composes.
import java.util.Comparator;
import java.util.stream.Stream;
class Demo {
public static void main(String[] args) {
Stream<String> stream = Stream.of("Orion", "Mosaic", "Fable", "Aria");
// Sort strings in reverse order, extract character at index 1, and print
stream.sorted(Comparator.reverseOrder())
.flatMap(str -> Stream.of(str.charAt(1)))
.forEach(System.out::println);
}
}
Output:
r
o
a
r
Notice the subtle behavior: the strings are sorted in reverse order first, and then each string yields a single character that is printed. flatMap() is used here because we are turning each string into a stream of one character. That is a bit heavy for a single character, but it demonstrates how a pipeline can change element shape.
Real-world patterns I see in production
In real services, I see forEach() used in a few recurring patterns. I will show you how I structure these so you can avoid common mistakes.
Logging events for a pipeline
When you have a stream of domain objects and you want to log something about each item at the end of a pipeline, forEach() is a clean option.
import java.util.List;
class Demo {
public static void main(String[] args) {
List<Order> orders = List.of(
new Order("A-100", 125.50),
new Order("A-101", 19.99),
new Order("A-102", 430.00)
);
orders.stream()
.filter(order -> order.total() >= 100.00)
.forEach(order -> {
// Keep side effects small and obvious
System.out.println("High-value order: " + order.id());
});
}
record Order(String id, double total) {}
}
I do not mutate the order, and I do not touch shared state. This is the safe style I recommend.
Pushing metrics to a counter
If you use a metrics system, you might want to count or tag items. This is safe as long as your metrics client is thread-safe.
import java.util.List;
class Demo {
public static void main(String[] args) {
List<String> tags = List.of("api", "db", "cache", "api");
tags.stream()
.filter(tag -> tag.equals("api"))
.forEach(tag -> Metrics.counter("requests_by_tag").increment());
}
// Minimal stub for example
static class Metrics {
static Counter counter(String name) { return new Counter(); }
}
static class Counter {
void increment() { /* side effect */ }
}
}
In a parallel stream, I would only do this if the metrics client is safe for concurrent calls. If it is not, I keep the stream sequential or use a different pattern.
Writing to an external system
If your forEach() writes to a database, message queue, or file, you must consider backpressure and error handling. I prefer to batch upstream and use explicit error handling rather than a bare forEach() because exceptions inside the action will short-circuit the entire stream.
Common mistakes I fix in reviews
Here are the mistakes I see most often, with a better way to handle each.
Mistake 1: Mutating shared state
This is the classic anti-pattern:
import java.util.ArrayList;
import java.util.List;
class Demo {
public static void main(String[] args) {
List<Integer> numbers = List.of(1, 2, 3, 4, 5);
List<Integer> doubled = new ArrayList<>();
numbers.stream().forEach(n -> doubled.add(n * 2));
System.out.println(doubled);
}
}
This works for a sequential stream, but it is risky and brittle. If anyone switches to parallel, you now have a race condition. The safer approach is to return a new list:
import java.util.List;
import java.util.stream.Collectors;
class Demo {
public static void main(String[] args) {
List<Integer> numbers = List.of(1, 2, 3, 4, 5);
List<Integer> doubled = numbers.stream()
.map(n -> n * 2)
.collect(Collectors.toList());
System.out.println(doubled);
}
}
Mistake 2: Using forEach() when you need a result
If you want to build a map, count elements, or create a summary, forEach() is not the right terminal operation. Use collect, count, max, min, or reduce.
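A short sketch of that rule, using the standard count() and groupingBy() collectors (the tag data is invented for illustration):

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ResultDemo {
    // count() returns the result directly; no forEach() with a mutable counter.
    static long countMatching(List<String> tags, String target) {
        return tags.stream().filter(target::equals).count();
    }

    // groupingBy() + counting() replaces a forEach() that mutates a shared map.
    static Map<String, Long> countByTag(List<String> tags) {
        return tags.stream()
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> tags = List.of("api", "db", "api", "cache", "api");
        System.out.println(countMatching(tags, "api")); // 3
        System.out.println(countByTag(tags).get("db")); // 1
    }
}
```

Both helpers return values, which makes them trivial to unit test, unlike a forEach() that updates external state.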
Mistake 3: Assuming order in parallel streams
I see code that relies on ordering even with parallel() in the pipeline. That is a recipe for flaky tests. Use forEachOrdered() or drop parallel().
Mistake 4: Hiding exceptions inside the Consumer
If the action in forEach() can throw a checked exception, people often wrap it in a runtime exception. That makes debugging harder. I prefer to handle exceptions explicitly in the pipeline or use a helper that keeps errors visible.
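One shape such a helper can take, sketched here — ThrowingConsumer and withContext are my names, not a standard API:

```java
import java.io.IOException;
import java.util.List;
import java.util.function.Consumer;

public class VisibleErrors {
    // A functional interface for actions that may throw checked exceptions.
    @FunctionalInterface
    interface ThrowingConsumer<T> {
        void accept(T item) throws Exception;
    }

    // Adapts a throwing action into a Consumer, attaching the failing element
    // to the exception so the stack trace points at the real culprit.
    static <T> Consumer<T> withContext(ThrowingConsumer<T> action) {
        return item -> {
            try {
                action.accept(item);
            } catch (Exception e) {
                throw new IllegalStateException("Failed on element: " + item, e);
            }
        };
    }

    public static void main(String[] args) {
        List<String> files = List.of("a.txt", "b.txt");
        files.stream().forEach(withContext(file -> {
            if (file.isEmpty()) {
                throw new IOException("empty file name"); // checked exception
            }
            System.out.println("Processed: " + file);
        }));
    }
}
```

The important part is the message: a bare wrapper loses the element that failed, while this one keeps it visible.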
When I recommend forEach(), and when I do not
Here is the guidance I give teams:
Use forEach() when:
- You need a terminal side effect that is simple and safe.
- Your stream is sequential, or the action is thread-safe.
- You want to log, print, emit metrics, or call a simple API.
Avoid forEach() when:
- You need to produce a collection or aggregated result.
- The action mutates shared state.
- Ordering matters but you are running in parallel.
- Error handling must be strict and recoverable.
I want you to see forEach() as a tool, not a default. It is safe and clean when used with intent, but it is not a general replacement for loops or collection builders.
Performance notes from real systems
Streams are not slow by default, but they are not free. forEach() is efficient for simple operations, and it can be very fast for in-memory lists where the pipeline is short. In typical service code, the fixed overhead of a short pipeline over a small collection is negligible, and the total cost scales with the work inside the Consumer. For large pipelines, the pipeline overhead matters far less than the cost of the work inside forEach().
Parallel streams can help for CPU-bound tasks with large data, but they can hurt if the Consumer is I/O bound or contends on shared resources. If your action writes to a database, parallel forEach() often makes things slower and more fragile.
I test performance with realistic data sizes, not microbenchmarks, and I measure total request latency rather than only stream speed. That keeps the focus on user impact.
A modern take: combining forEach() with 2026 tooling
In 2026, most teams I work with use AI-assisted IDEs and static analysis tools to catch risky patterns in forEach() use. These tools can highlight shared mutable state or a Consumer that is not thread-safe. I treat those warnings seriously because they often predict production issues.
I also see more teams using structured logging and telemetry libraries. When you use forEach() to emit metrics or logs, those clients are usually thread-safe, which makes parallel use safer. Still, I prefer to keep the stream sequential unless there is a clear performance reason to go parallel.
Virtual threads also changed how we write concurrent code, but they do not change stream ordering rules. forEach() still behaves exactly as defined by the stream’s encounter order and parallel execution policy.
Deep dive: encounter order and why it matters
I want to slow down here because encounter order is the hidden axis behind many stream bugs. A stream has encounter order if its source has order (like a List) or if an intermediate operation imposes order (like sorted()). If the stream is ordered and sequential, forEach() will follow that order. If the stream is unordered or parallel, you lose that guarantee.
This is why the same forEach() call can behave differently depending on the stream source:
- List → ordered by index
- Set (like HashSet) → no encounter order
- Map entry set → depends on map type (LinkedHashMap preserves insertion order, HashMap does not)
- Stream.generate() → no encounter order unless you explicitly order later
Here is a concrete example showing how source order changes what you see:
import java.util.*;
public class EncounterOrderDemo {
public static void main(String[] args) {
List<String> list = List.of("A", "B", "C");
Set<String> set = new HashSet<>(list);
list.stream().forEach(System.out::print);
System.out.println();
set.stream().forEach(System.out::print);
System.out.println();
}
}
The list always prints ABC, while the set may print CBA or another order depending on hashing. When order matters, choose the right source or impose order with sorted() before the terminal operation.
Deeper example: stream pipeline with validation and side effects
Here is a more realistic pipeline I see in business services. It filters invalid inputs, transforms the values, and then performs a side effect (persisting a record). I keep the transformation pure and move the side effect to the end.
import java.util.List;
import java.util.Locale;
public class UserSync {
public static void main(String[] args) {
List<String> rawNames = List.of(" anna ", "", "BoB", " ", "Clara");
rawNames.stream()
.map(String::trim)
.filter(name -> !name.isEmpty())
.map(name -> name.substring(0, 1).toUpperCase(Locale.ROOT) + name.substring(1).toLowerCase(Locale.ROOT))
.forEach(UserSync::persistName);
}
static void persistName(String name) {
// Side effect at the terminal stage
System.out.println("Persisting: " + name);
}
}
This pattern keeps the transformation stages clean, while still letting forEach() do what it does best: drive the side effect at the end.
Edge cases: nulls, exceptions, and short-circuiting
forEach() does not ignore null values. If your stream contains null elements, your Consumer must handle them or you must filter them out. That is another reason I like to keep a filter(Objects::nonNull) early in pipelines that might include nulls.
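A small sketch of the guard I mean. Note that List.of() rejects nulls, so the example uses Arrays.asList() to build a list that actually contains one:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class NullFilterDemo {
    // Drops nulls early so later stages and the terminal Consumer never see them.
    static List<String> upperNonNull(List<String> input) {
        return input.stream()
                .filter(Objects::nonNull)   // guard before any method calls on elements
                .map(String::toUpperCase)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList("alpha", null, "beta");
        upperNonNull(raw).forEach(System.out::println); // ALPHA, then BETA
    }
}
```

Without the filter, the map() stage would throw a NullPointerException on the second element.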
Exceptions inside forEach() are terminal in the most literal sense: they terminate the pipeline. If your Consumer throws, the stream stops and the exception propagates to the caller. That is not always wrong, but it surprises teams when they expect “best effort” processing.
If you want best effort behavior, you have to encode it:
import java.util.List;
public class SafeForEach {
public static void main(String[] args) {
List<String> files = List.of("a.txt", "bad.txt", "c.txt");
files.stream().forEach(file -> {
try {
process(file);
} catch (Exception e) {
// Log and continue
System.err.println("Failed: " + file + " -> " + e.getMessage());
}
});
}
static void process(String file) {
if (file.equals("bad.txt")) {
throw new IllegalStateException("Corrupt file");
}
System.out.println("Processed: " + file);
}
}
I only recommend this style when you truly want to continue after failures. Otherwise, let the exception bubble and fix the root cause upstream.
forEach() vs peek(): similar vibe, different intent
peek() is another stream method people misuse. It looks like forEach(), but it is intermediate, not terminal. I use peek() only for debugging or lightweight instrumentation, never for production side effects.
Here is the golden rule I enforce:
- peek() is for diagnostics and should not change program state.
- forEach() is for real side effects and should be used at the end.
If you use peek() to call a database or update a cache, you are building a time bomb. The stream could be optimized, short-circuited, or even not executed at all, and your side effects will silently vanish. forEach() is explicit and reliable; peek() is not.
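The Stream.count() javadoc makes this concrete: when the count can be computed from the source, the pipeline may not execute at all, and a peek() in the middle silently never runs. A sketch of the trap:

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class PeekTrapDemo {
    // Counts the elements; the probe records how often the peek() action ran.
    static long countWithPeek(List<String> items, AtomicInteger probe) {
        return items.stream()
                .peek(x -> probe.incrementAndGet()) // may be skipped entirely
                .count();                           // size is known from the source
    }

    public static void main(String[] args) {
        AtomicInteger probe = new AtomicInteger();
        long n = countWithPeek(List.of("a", "b", "c"), probe);
        System.out.println("count = " + n);
        // On modern JDKs this often prints 0: the side effect simply vanished.
        System.out.println("peek ran " + probe.get() + " times");
    }
}
```

Whether the probe fires at all is up to the implementation, which is exactly why peek() must never carry real side effects.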
forEach() in parallel: what stays safe and what breaks
Parallel streams are not inherently unsafe, but they make a few patterns dangerous. Here is my mental model:
Safe in parallel forEach():
- Stateless, idempotent actions (logging to a thread-safe logger)
- Metrics updates using thread-safe counters
- Pure transformations earlier in the pipeline
Risky in parallel forEach():
- Mutating shared collections or objects
- Calling non-thread-safe services or clients
- Relying on output order or timing
Here is a concrete example of safe parallel use with a thread-safe accumulator:
import java.util.List;
import java.util.concurrent.atomic.LongAdder;
public class ParallelCounter {
public static void main(String[] args) {
List<Integer> numbers = List.of(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
LongAdder sum = new LongAdder();
numbers.parallelStream().forEach(sum::add);
System.out.println("Sum: " + sum.sum());
}
}
Even here, I ask whether the sum should come from a terminal reduction instead. A cleaner approach is:
int sum = numbers.stream().mapToInt(Integer::intValue).sum();
The second version is clearer and avoids side effects entirely.
Alternative approaches that replace forEach()
I keep a short list of alternatives that reduce side effects and are usually safer.
Use collect() to build collections
List<String> names = rawNames.stream()
.map(String::trim)
.filter(s -> !s.isEmpty())
.toList();
Use reduce() or summary operations
int total = numbers.stream().mapToInt(Integer::intValue).sum();
Use joining() for string concatenation
String csv = values.stream().map(String::valueOf).collect(Collectors.joining(","));
Use groupingBy() for aggregation
Map<String, List<Order>> byCustomer = orders.stream()
.collect(Collectors.groupingBy(Order::customerId));
When a pipeline’s purpose is to compute a value, these terminal operations are almost always the better choice.
Practical scenario: report generation pipeline
Let me show you a scenario I see in real systems: generate a report of high-value orders and then write it to a file. The danger is mixing transformation and I/O.
Step 1: Build data safely
List<String> reportLines = orders.stream()
.filter(o -> o.total() > 1000)
.sorted(Comparator.comparing(Order::total).reversed())
.map(o -> o.id() + "," + o.total())
.toList();
Step 2: Perform side effect explicitly
reportLines.forEach(line -> fileWriter.write(line + "\n"));
I separate the transformation and the side effect. This makes it easier to test the transformation and easier to handle file errors at the very end.
Practical scenario: cache warm-up with safe batching
forEach() can be risky when it calls a cache or remote API. I prefer to batch so I can control backpressure and error handling.
import java.util.List;
import java.util.stream.Collectors;
public class CacheWarmup {
public static void main(String[] args) {
List<String> keys = List.of("k1", "k2", "k3", "k4", "k5", "k6");
List<List<String>> batches = partition(keys, 2);
batches.forEach(batch -> {
try {
warmCache(batch);
} catch (Exception e) {
System.err.println("Batch failed: " + batch);
}
});
}
static void warmCache(List<String> keys) {
System.out.println("Warming keys: " + keys);
}
static List<List<String>> partition(List<String> list, int size) {
return java.util.stream.IntStream.range(0, (list.size() + size - 1) / size)
.mapToObj(i -> list.subList(i * size, Math.min(list.size(), (i + 1) * size)))
.collect(Collectors.toList());
}
}
This uses a simple partition step to control how much work happens per forEach() action. The side effect is still explicit and isolated.
Common pitfall: using forEach() for indexing
Another habit I see is using forEach() just to track indexes. That creates mutable counters and complicates parallel usage. A stream-friendly approach is to use IntStream to generate indexes safely:
import java.util.List;
import java.util.stream.IntStream;
public class IndexingDemo {
public static void main(String[] args) {
List<String> list = List.of("alpha", "beta", "gamma");
IntStream.range(0, list.size())
.forEach(i -> System.out.println(i + ": " + list.get(i)));
}
}
The intent is clearer, and you avoid a mutable counter. If you must keep order, stay sequential.
Common pitfall: mixing parallel streams with non-thread-safe clients
This is the failure mode that appears most in production postmortems. Developers enable parallel() to speed up a pipeline, but the forEach() action uses a client that is not thread-safe.
If the client is not thread-safe, you have three options:
1) Keep the stream sequential.
2) Use a thread-safe wrapper or pool of clients.
3) Move the side effect outside of the stream entirely.
I rarely recommend option 2 unless the client library explicitly supports it. Option 1 is often the safest. Option 3 is the cleanest if you can design it.
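Here is a sketch of option 3, with LegacyClient standing in for a hypothetical non-thread-safe dependency: the parallel stage stays pure, and the side effect runs on one thread afterwards.

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

public class SequentialSideEffect {
    // Stand-in for a client that must not be called from multiple threads.
    static class LegacyClient {
        void send(String payload) {
            System.out.println("sent: " + payload);
        }
    }

    // Pure, parallel-safe transformation; collect() preserves encounter order.
    static List<String> preparePayloads(List<String> inputs) {
        return inputs.parallelStream()
                .map(s -> s.toUpperCase(Locale.ROOT))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        LegacyClient client = new LegacyClient();
        // The side effect happens outside the stream, sequentially and in order.
        for (String payload : preparePayloads(List.of("a", "b", "c"))) {
            client.send(payload);
        }
    }
}
```

This split gives you parallelism where it is safe and a plain loop where the external call needs control.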
forEach() with Optional and flatMap: avoiding nulls
Here is a simple pattern I use to avoid nulls and still keep a clean pipeline:
import java.util.List;
import java.util.Optional;
public class OptionalForEach {
public static void main(String[] args) {
List<String> raw = List.of("Alice", "", "Bob", " ");
raw.stream()
.map(String::trim)
.map(OptionalForEach::toOptional)
.flatMap(Optional::stream)
.forEach(System.out::println);
}
static Optional<String> toOptional(String s) {
return s.isEmpty() ? Optional.empty() : Optional.of(s);
}
}
flatMap(Optional::stream) is a clean way to drop empty values without introducing nulls, and forEach() stays focused on output.
forEach() and resource management
When a stream is backed by an I/O resource (like lines from a file), forEach() won’t automatically close it unless you use try-with-resources. Always wrap stream creation when the source is closeable.
import java.nio.file.Files;
import java.nio.file.Path;
import java.io.IOException;
public class FileStreamDemo {
public static void main(String[] args) throws IOException {
Path path = Path.of("data.txt");
try (var lines = Files.lines(path)) {
lines.filter(line -> !line.isBlank())
.forEach(System.out::println);
}
}
}
This is not about forEach() directly, but the terminal operation will trigger the stream consumption, so it is where you will see errors and resource usage appear.
Debugging forEach() pipelines
When a pipeline fails, developers often blame forEach() because the exception happens there. In reality, the error could originate in any earlier transformation. Here is how I debug quickly:
1) Add temporary peek() calls before forEach() to locate the last “good” stage.
2) Add small map() validations to assert assumptions.
3) Use a test input that is small but representative.
4) If parallel, switch to sequential for deterministic debugging.
A quick debug example:
items.stream()
.map(this::transform)
.peek(x -> System.out.println("After transform: " + x))
.map(this::validate)
.forEach(this::send);
Once the issue is found, I remove peek() and restore clean code.
A practical comparison: loop vs stream with forEach()
Here is a direct comparison for a real task: filter active users, normalize emails, and send notifications. I show both versions in reviews so teams can decide which reads better.
Loop version
List<String> emails = new ArrayList<>();
for (User u : users) {
if (!u.active()) continue;
emails.add(u.email().trim().toLowerCase());
}
for (String email : emails) {
notifier.send(email);
}
Stream version
users.stream()
.filter(User::active)
.map(u -> u.email().trim().toLowerCase())
.forEach(notifier::send);
I prefer the stream version when the pipeline is short and clear. If the logic becomes complex or the action requires complex error handling, I switch to loops.
forEach() with custom Consumer implementations
Sometimes you want to package the action into a reusable Consumer so it can be tested or reused in multiple pipelines.
import java.util.function.Consumer;
public class CustomConsumerDemo {
public static void main(String[] args) {
Consumer<String> emailSender = email -> {
System.out.println("Sending to: " + email);
};
List.of("[email protected]", "[email protected]")
.stream()
.forEach(emailSender);
}
}
This is a small pattern, but it keeps your forEach() lines clean and makes testing easier.
Subtle behavior: forEach() with short-circuiting operations
Stream pipelines can short-circuit with operations like findFirst(), anyMatch(), or limit(). forEach() itself is not short-circuiting, but earlier operations can change how many elements reach it.
Example:
IntStream.range(1, 100)
.filter(n -> n % 2 == 0)
.limit(5)
.forEach(System.out::println);
Only five even numbers reach forEach(), even though the source has many more. This is correct, but it surprises people who expect the forEach() action to touch everything.
forEachOrdered(): when order is a feature, not a cost
forEachOrdered() makes sense when order is essential, but it can reduce parallel speedups because it forces ordering in the terminal stage. I use it when:
- The output is a file that must preserve order.
- The downstream system expects ordered events.
- A human reads the output and ordering matters for comprehension.
If order is not part of your requirement, avoid forEachOrdered() and accept unordered parallel execution for better throughput.
Handling checked exceptions in forEach()
Java checked exceptions are a pain in forEach() because Consumer doesn’t allow them. I use one of two approaches:
Approach 1: Wrap and rethrow with context
list.stream().forEach(item -> {
try {
risky(item);
} catch (IOException e) {
throw new UncheckedIOException("Failed on: " + item, e);
}
});
Approach 2: Collect failures for reporting
List<String> failures = new ArrayList<>();
list.stream().forEach(item -> {
try {
risky(item);
} catch (IOException e) {
failures.add(item);
}
});
if (!failures.isEmpty()) {
System.err.println("Failures: " + failures);
}
I only use Approach 2 when best-effort is acceptable; otherwise I fail fast with an unchecked exception.
A note on side effects and testing
forEach() makes testing harder because it doesn’t return a value. When you can, separate the transformation from the side effect so the transformation can be tested as a pure function. I often write tests for the pipeline up to toList() and then keep forEach() trivial.
Example testable pipeline:
List<String> output = users.stream()
.filter(User::active)
.map(User::email)
.toList();
Then I test output directly without involving side effects.
Performance considerations: before/after comparisons
I avoid precise numbers because performance is environment-specific, but here are the patterns I see:
- For small collections (tens to hundreds), a loop is usually marginally faster and simpler.
- For medium collections (thousands), a stream can be comparable and more readable if the pipeline is clean.
- For large collections (hundreds of thousands and up), parallel streams can help only if the forEach() action is CPU-bound and thread-safe.
The biggest performance wins usually come from reducing work in the Consumer, not from switching loop vs stream. If your action does heavy I/O, optimize that first.
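What that usually looks like in practice: move the expensive part into a pure map() stage so the terminal action only hands off finished values. A minimal sketch (the formatting rule is invented):

```java
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

public class LightConsumerDemo {
    // Heavy formatting happens once, as a pure transformation in the pipeline.
    static List<String> renderLines(List<Double> totals) {
        return totals.stream()
                .map(t -> String.format(Locale.ROOT, "total=%.2f", t))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // The terminal action stays trivial: it only prints a finished string.
        renderLines(List.of(12.5, 3.75)).forEach(System.out::println);
    }
}
```

Keeping the Consumer trivial also means the interesting logic lives in a method you can test directly.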
Comparison table: forEach() vs alternatives
Here is a more targeted comparison I keep in my notes:
Goal | Preferred Terminal Operation
Run a side effect for each element | forEach()
Side effect with encounter order preserved | forEachOrdered()
Build a collection or map | collect()
Compute a count or total | count(), sum()
Find a single matching element | findFirst(), findAny()
If your terminal operation returns a value, it is usually easier to test and reason about than forEach().
A complete, practical example: order processing pipeline
Let me combine the ideas into a longer example. This version validates, transforms, and then emits an event per order. I keep side effects last and handle errors explicitly.
import java.util.List;
import java.util.Locale;
public class OrderPipeline {
public static void main(String[] args) {
List<Order> orders = List.of(
new Order("A-100", 125.50, "USD"),
new Order("A-101", -10.00, "USD"),
new Order("A-102", 99.99, "USD"),
new Order("A-103", 500.00, "EUR")
);
orders.stream()
.filter(o -> o.total() > 0)
.map(OrderPipeline::normalizeCurrency)
.filter(o -> o.total() >= 100)
.forEach(OrderPipeline::emitEvent);
}
static Order normalizeCurrency(Order order) {
if (order.currency().equals("EUR")) {
// pretend conversion
return new Order(order.id(), order.total() * 1.1, "USD");
}
return order;
}
static void emitEvent(Order order) {
System.out.println("Emit event for: " + order.id() + " total=" + order.total());
}
record Order(String id, double total, String currency) {}
}
This style reads like a story: validate, normalize, filter, then emit. forEach() stays in the role it was designed for.
Practical checklist I use before approving forEach() code
When I review a PR that uses forEach(), I ask these questions:
1) Is the action pure and safe? If not, can we return a value instead?
2) Is the stream parallel? If yes, is the action thread-safe and order-independent?
3) Does the source have encounter order? If order matters, is that explicit?
4) Are exceptions handled in a way that matches the product requirement?
5) Could a clearer terminal operation replace this?
If all five checks pass, I almost always approve forEach() usage.
Bringing it all together
forEach() is deceptively simple, and that is exactly why it deserves careful use. It is a terminal operation meant to run a side effect for each element, nothing more. When you keep transformations earlier in the pipeline and make side effects explicit at the end, your stream code stays readable, testable, and safe. When you use forEach() as a stand-in for collecting results or mutating shared state, you invite subtle bugs, especially under parallel execution.
My practical advice is consistent: use forEach() when you truly need a side effect, keep the action small and thread-safe, and switch to forEachOrdered() when order is part of the requirement. If the goal is to build a result, use collect() or other terminal operations that return values. Streams shine when their intent is clear, and forEach() is at its best when it finishes a pipeline rather than pretending to be a loop.
If you apply these patterns and keep the pitfalls in mind, forEach() becomes a reliable tool rather than a hidden risk in your codebase.


