1. Overview

In this tutorial, we’ll learn about aggregating exceptions in a stream pipeline.

The Stream API itself does not provide any declarative way to process exceptions. The pipeline has a single channel that processes the data; there is no separate channel for exceptions, and no way to invoke a handler function when one is thrown. Hence, we have to fall back on catching exceptions with a try-catch block.

As a result, aggregating exceptions in a stream pipeline and handling them can be challenging.
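To make the problem concrete, here's a minimal sketch (class and method names are illustrative). A single bad element aborts the whole pipeline, and the results already computed for the earlier elements are lost:

```java
import java.util.List;

class StreamFailureDemo {

    // The first NumberFormatException terminates the entire pipeline:
    // nothing is returned, not even the sum of the elements parsed before the bad one.
    static String parseAll(List<String> inputs) {
        try {
            int sum = inputs.stream().mapToInt(Integer::parseInt).sum();
            return "sum=" + sum;
        } catch (NumberFormatException e) {
            return "failed: " + e.getMessage();
        }
    }
}
```

This all-or-nothing behavior is what the aggregation techniques below work around.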

2. Aggregating Exceptions With a Try-Catch Block Within the Stream Pipeline

There are often cases where a method has to be called just for its side effect, for example, a simple database update that might throw an exception due to a connection failure. With this in mind, let's consider a simple example of calling processThrowsExAndNoOutput() in the pipeline:

@Test
public void givenTryCatchInPipeline_whenFoundEx_thenSuppressExIntoRuntimeEx() {
    String[] strings = {"1", "2", "3", "a", "b", "c"};
    RuntimeException runEx = Arrays.stream(strings)
      .map(str -> {
          try {
              processThrowsExAndNoOutput(str);
              return null;
          } catch (RuntimeException e) {
              return e;
          }
      })
      .filter(Objects::nonNull)
      .collect(Collectors.collectingAndThen(Collectors.toList(), list -> {
          RuntimeException runtimeException = new RuntimeException("Errors Occurred");
          list.forEach(runtimeException::addSuppressed);
          return runtimeException;
      }));
    processExceptions(runEx);
    assertEquals("Errors Occurred", runEx.getMessage());
    assertEquals(3, runEx.getSuppressed().length);
}

In the above program, we treat caught exceptions as data in the stream. The map() operation returns either null or the caught exception, and filter() lets only the exceptions pass downstream. Finally, we fold them into a single RuntimeException using addSuppressed(). We can then call processExceptions() to handle the aggregate exception.
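The two helpers used in the test aren't shown here; a minimal version consistent with it (the digit check below is an assumption, standing in for a real operation such as a database update) could look like:

```java
class StreamHelpers {

    // Assumed: a side-effect-only operation that throws for non-numeric input.
    static void processThrowsExAndNoOutput(String input) {
        if (input.isEmpty() || !input.chars().allMatch(Character::isDigit)) {
            throw new RuntimeException("Failed to process " + input);
        }
        // the actual side effect, e.g. a database update, would happen here
    }

    // Assumed: handles the aggregate by examining each suppressed exception.
    static void processExceptions(RuntimeException aggregate) {
        for (Throwable suppressed : aggregate.getSuppressed()) {
            System.err.println("Handling: " + suppressed.getMessage());
        }
    }
}
```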

That works! But could it be more declarative? Let’s work towards it in the upcoming sections.

3. Aggregating Exceptions by Extracting the Try-Catch Block Into a Method

Let’s make the implementation a little more readable and concise. To do that, we move the try-catch block into a method:

static Throwable callProcessThrowsExAndNoOutput(String input) {
    try {
        processThrowsExAndNoOutput(input);
        return null;
    } catch (RuntimeException e) {
        return e;
    }
}

Now, we can call the above method from inside the pipeline:

@Test
public void givenExtractedMethod_whenFoundEx_thenSuppressExIntoRuntimeEx() {
    String[] strings = {"1", "2", "3", "a", "b", "c"};
    RuntimeException runEx = Arrays.stream(strings)
      .map(str -> callProcessThrowsExAndNoOutput(str))
      .filter(Objects::nonNull)
      .reduce(new RuntimeException("Errors Occurred"), (o1, o2) -> {
          o1.addSuppressed(o2);
          return o1;
      });
    // handle the aggregate exception as before
}

The above approach looks cleaner. Note, however, that reduce() with a mutable identity like this is only safe for sequential streams: in a parallel stream, the shared identity instance would be mutated from multiple threads. And there is still room for improvement and more use cases to discuss.

4. Aggregating Exceptions and Output in the Stream Pipeline Using a Runtime Type Check

Most programs have to handle both exceptions and the expected output. Let's take an example of a method that can return either an exception or some output:

static Object processReturnsExAndOutput(String input) {
    try {
        return Integer.parseInt(input);
    } catch (Exception e) {
        return new RuntimeException("Exception in processReturnsExAndOutput for " + input, e);
    }
}

Now, let’s look at the pipeline processing:

@Test
public void givenProcessMethod_whenStreamResultHasExAndOutput_thenHandleExceptionListAndOutputList() {
    List<String> strings = List.of("1", "2", "3", "a", "b", "c");
    Map map = strings.stream()
      .map(s -> processReturnsExAndOutput(s))
      .collect(Collectors.partitioningBy(o -> o instanceof RuntimeException, Collectors.toList()));
    
    List<RuntimeException> exceptions = (List<RuntimeException>) map.getOrDefault(Boolean.TRUE, List.of());
    List<Integer> results = (List<Integer>) map.getOrDefault(Boolean.FALSE, List.of());
    handleExceptionsAndOutputs(exceptions, results);
}

The above stream pipeline uses partitioningBy() in the terminal collect(). It relies on an instanceof check to partition the results into a list of exceptions and a list of integers. Further down, the program calls handleExceptionsAndOutputs() to take care of the exceptions and the output for further processing.

This time, we didn’t reduce the exceptions into an aggregate RuntimeException. Instead, we passed on the list of exceptions for further processing. This is another way of aggregating the exceptions.

As we can see, this is definitely not the cleanest of approaches, with raw types and casting required. Hence, the upcoming sections will explore more generalized solutions to the problem at hand.

5. Aggregating Exceptions and Output Using a Custom Mapper

Going forward, we are going to focus more on functional programming.

We’ll create a custom mapper function that wraps another map() stream function. It returns a Result object which encapsulates both the result and the exception.

First, let’s look at the Result class:

public class Result<R, E extends Throwable> {
    private Optional<R> result;
    private Optional<E> exception;

    public Result(R result) {
        this.result = Optional.of(result);
        this.exception = Optional.empty();
    }

    public Result(E exception) {
        this.exception = Optional.of(exception);
        this.result = Optional.empty();
    }

    public Optional<R> getResult() {
        return result;
    }

    public Optional<E> getException() {
        return exception;
    }
}

The Result class uses generics and Optional. Since exactly one of result and exception holds a value at any time, we model both as Optional. Their usage will become clearer as we move on.

We introduced the custom mapper at the beginning of this section. Let's now look at its implementation:

public class CustomMapper {
    public static <T, R> Function<T, Result<R, Throwable>> mapper(Function<T, R> func) {
        return arg -> {
            try {
                return new Result<>(func.apply(arg));
            } catch (Exception e) {
                return new Result<>(e);
            }
        };
    }
}

Now it is time to see the mapper() in action:

@Test
public void givenCustomMapper_whenStreamResultHasExAndSuccess_thenHandleExceptionListAndOutputList() {
    List<String> strings = List.of("1", "2", "3", "a", "b", "c");
    strings.stream()
      .map(CustomMapper.mapper(Integer::parseInt))
      .collect(Collectors.collectingAndThen(Collectors.toList(),
        list -> handleErrorsAndOutputForResult(list)));
}

This time, we used Collectors.collectingAndThen() to invoke handleErrorsAndOutputForResult() at the end of the pipeline with the list of Result<Integer, Throwable> objects. Let's take a look at handleErrorsAndOutputForResult():

static String handleErrorsAndOutputForResult(List<Result<Integer, Throwable>> successAndErrors) {
    logger.info("handle errors and output");
    successAndErrors.forEach(result -> {
        if (result.getException().isPresent()) {
            logger.error("Process Exception " + result.getException().get());
        } else {
            logger.info("Process Result " + result.getResult().get());
        }
    });
    return "Errors and Output Handled";
}

As shown above, we simply iterate over the Result list and fork into a success or failure flow with the help of the method Optional.isPresent(). This can be a useful approach when the success and error cases have to be dealt with distinctly, e.g. sending notifications to separate users.

When the function to be used inside Stream.map() cannot be modified, for example, because it is from an external library, we can use our custom mapper() function to wrap it and handle the outcome in a more generalized manner.
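As a concrete illustration, here's a compact, self-contained variant of the same pattern (Result trimmed down to a record; class and method names are illustrative) wrapping java.time.LocalDate.parse(), an API we can't change:

```java
import java.time.LocalDate;
import java.util.List;
import java.util.function.Function;

class ExternalWrapDemo {

    // A slimmed-down Result: exactly one of value/error is non-null.
    record Result<R>(R value, Throwable error) { }

    // Same wrapping idea as CustomMapper.mapper(), for any function we can't modify.
    static <T, R> Function<T, Result<R>> mapper(Function<T, R> func) {
        return arg -> {
            try {
                return new Result<>(func.apply(arg), null);
            } catch (Exception e) {
                return new Result<>(null, e);
            }
        };
    }

    // Counts the inputs that LocalDate.parse() rejects,
    // without a try-catch inside the pipeline.
    static long countFailures(List<String> dates) {
        return dates.stream()
          .map(mapper(LocalDate::parse))
          .filter(r -> r.error() != null)
          .count();
    }
}
```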

6. Aggregating Exceptions and Output Using a Custom Collector

Aggregating the exceptions and output of a pipeline is a kind of collection activity. Hence, it makes sense to implement a custom Collector, which is designed exactly for this purpose.

Let’s see how to do that:

public class CustomCollector<T, R> {
    private final List<R> results = new ArrayList<>();
    private final List<Throwable> exceptions = new ArrayList<>();

    public static <T, R> Collector<T, ?, CustomCollector<T, R>> of(Function<T, R> mapper) {
        return Collector.of(
          CustomCollector::new,
          (collector, item) -> {
              try {
                  R result = mapper.apply(item);
                  collector.results.add(result);
              } catch (Exception e) {
                  collector.exceptions.add(e);
              }
          },
          (left, right) -> {
              left.results.addAll(right.results);
              left.exceptions.addAll(right.exceptions);
              return left;
          }
        );
    }
    // standard getters...
}

Finally, let’s take a look at how the collector exactly works:

@Test
public void givenCustomCollector_whenStreamResultHasExAndSuccess_thenHandleAggrExceptionAndResults() {
    String[] strings = {"1", "2", "3", "a", "b", "c"};
    Arrays.stream(strings)
      .collect(Collectors.collectingAndThen(CustomCollector.of(Integer::parseInt),
        col -> handleExAndResults(col.getExceptions(), col.getResults())));
}
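Since the collector supplies a combiner, it also works for parallel streams, something the mutable reduce() identity in Section 3 can't safely offer. Here's a self-contained copy with the elided getters filled in, which can be exercised directly:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collector;

class CustomCollector<T, R> {
    private final List<R> results = new ArrayList<>();
    private final List<Throwable> exceptions = new ArrayList<>();

    static <T, R> Collector<T, ?, CustomCollector<T, R>> of(Function<T, R> mapper) {
        return Collector.of(
          CustomCollector::new,            // one fresh container per thread
          (collector, item) -> {
              try {
                  collector.results.add(mapper.apply(item));
              } catch (Exception e) {
                  collector.exceptions.add(e);
              }
          },
          (left, right) -> {               // merges partial containers in parallel runs
              left.results.addAll(right.results);
              left.exceptions.addAll(right.exceptions);
              return left;
          });
    }

    List<R> getResults() { return results; }
    List<Throwable> getExceptions() { return exceptions; }
}
```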

7. Aggregating Exceptions and Output Using Try and Either From the Vavr Library

Vavr's Try is a container that holds either the exception thrown by a computation or, on success, its output. Just like the custom mapper discussed earlier, Try can also wrap functions.

Either, in turn, is a more general container that holds either an error type (by convention, the left side) or the expected output type (the right side).

Let's see how we can combine these two:

@Test
public void givenVavrEitherAndTry_whenStreamResultHasExAndSuccess_thenHandleExceptionListAndOutputList() {
    List<String> strings = List.of("1", "2", "3", "a", "b", "c");
    strings.stream()
      .map(str -> Try.of(() -> Integer.parseInt(str)).toEither())
      .collect(Collectors.collectingAndThen(Collectors.partitioningBy(Either::isLeft, Collectors.toList()),
        map -> handleErrorsAndOutputForEither(map)));
}

As we can see, the program converts the Try object into an Either and then collects it into a map to invoke handleErrorsAndOutputForEither():

static void handleErrorsAndOutputForEither(Map<Boolean, List<Either<Throwable, Integer>>> map) {
    logger.info("handle errors and output");
    map.getOrDefault(Boolean.TRUE, List.of())
      .forEach(either -> logger.error("Process Exception " + either.getLeft()));
    map.getOrDefault(Boolean.FALSE, List.of())
      .forEach(either -> logger.info("Process Result " + either.get()));
}

Further, as shown above, we process the exceptions and the output by projecting the Either to its left (exception) or right (output) side. As we can see, the Try and Either approach gives us the most concise solution we've seen so far.

8. Conclusion

In this tutorial, we explored a few ways to aggregate runtime exceptions while processing a stream. While many approaches are possible, it is important to maintain the essence of stream processing, including conciseness, immutability, and declarative syntax.

The code backing this article is available on GitHub.