
1. Introduction

A web crawler, or spider, is a program that automatically searches and indexes web content and other data on the web. Web crawlers scan webpages so that the information they contain can be retrieved, updated, and indexed when users perform search queries.

WebMagic is a simple, powerful, and scalable web crawler framework. It draws inspiration from Python’s popular framework, Scrapy. It handles HTTP requests, HTML parsing, task scheduling, and data pipeline processing with minimal boilerplate.

In this tutorial, we’ll explore WebMagic, its architecture, setup, and a basic Hello World example.

2. Architecture

WebMagic is built with a modular and extensible architecture. Let’s take a look at its core components:

webmagic-architecture

2.1. Spider

Spider is the main engine that orchestrates the entire crawling process. It takes the initial URL and invokes the downloader, processor, and pipeline.

2.2. Scheduler

The main job of the Scheduler is to manage the queue of URLs that need to be crawled. It also prevents duplicate crawling by keeping track of visited URLs. It sends one request at a time to the Downloader for further processing. We can choose between in-memory, file-based, Redis-backed, or custom schedulers.
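As a sketch of swapping the scheduler, assuming the webmagic-extension artifact is on the classpath, we could replace the default in-memory queue with a file-backed one so that crawl state survives restarts (the cache directory path here is an arbitrary choice):

```java
import us.codecraft.webmagic.Spider;
import us.codecraft.webmagic.scheduler.FileCacheQueueScheduler;

public class FileBackedCrawl {
    public static void main(String[] args) {
        // Persist the URL queue and de-duplication state under /tmp/webmagic,
        // so an interrupted crawl can resume where it left off
        Spider.create(new BookScraper())
          .setScheduler(new FileCacheQueueScheduler("/tmp/webmagic"))
          .addUrl("https://books.toscrape.com/")
          .run();
    }
}
```

Here, BookScraper is the PageProcessor implementation we'll build in the Hello World section below.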

2.3. Downloader

The Downloader handles the actual HTTP requests and downloads the HTML content from the web. The default implementation uses Apache HttpClient, but we can customize it to use OkHttp or any other library. Once a page is downloaded, the Downloader passes it to the PageProcessor.

2.4. PageProcessor

The PageProcessor is the heart of the crawler logic. As its name suggests, it defines how to extract the target data (such as product names and prices) and which new links to crawl from a page. We must implement the process() method to parse the response and extract the required information.

Once extracted, the data is sent to the Pipeline, and the new links to crawl are sent back to the Scheduler.

2.5. Pipeline

The Pipeline handles the post-processing of the extracted data. The most common operations are saving the data to a database or writing it to a file or the console.
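As a sketch, a custom pipeline implements WebMagic's Pipeline interface and receives the extracted data through a ResultItems object. The field names title and price are assumptions for illustration; they'd match whatever keys a PageProcessor stores via page.putField():

```java
import us.codecraft.webmagic.ResultItems;
import us.codecraft.webmagic.Task;
import us.codecraft.webmagic.pipeline.Pipeline;

public class ConsoleBookPipeline implements Pipeline {
    @Override
    public void process(ResultItems resultItems, Task task) {
        // Read fields that a PageProcessor stored via page.putField(...)
        String title = resultItems.get("title");
        String price = resultItems.get("price");
        System.out.println(title + " costs " + price);
    }
}
```

We'd then register it on the spider with addPipeline(new ConsoleBookPipeline()) when building the crawl.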

3. Setup With Maven

WebMagic is built with Maven, so it's easiest to manage our project with Maven as well. Let's add the following dependencies to our pom.xml file:

<dependency>
    <groupId>us.codecraft</groupId>
    <artifactId>webmagic-core</artifactId>
    <version>1.0.3</version>
</dependency>
<dependency>
    <groupId>us.codecraft</groupId>
    <artifactId>webmagic-extension</artifactId>
    <version>1.0.3</version>
</dependency>

Also, WebMagic ships with the slf4j-log4j12 SLF4J binding. If our project provides a different binding, we need to exclude slf4j-log4j12 to avoid conflicts. The exclusions block goes inside the dependency declaration:

<dependency>
    <groupId>us.codecraft</groupId>
    <artifactId>webmagic-core</artifactId>
    <version>1.0.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

4. Hello World Example

Let's look at an example where we crawl the books.toscrape.com site and print the first 10 book titles and prices to the console.

public class BookScraper implements PageProcessor {
    private Site site = Site.me().setRetryTimes(3).setSleepTime(1000);

    @Override
    public void process(Page page) {
        var books = page.getHtml().css("article.product_pod");

        for (int i = 0; i < Math.min(10, books.nodes().size()); i++) {
            var book = books.nodes().get(i);

            String title = book.css("h3 a", "title").get();
            String price = book.css(".price_color", "text").get();

            System.out.println("Title: " + title + " | Price: " + price);
        }
    }

    @Override
    public Site getSite() {
        return site;
    }

    public static void main(String[] args) {
        Spider.create(new BookScraper())
          .addUrl("https://books.toscrape.com/")
          .thread(1)
          .run();
    }
}

In the example above, we define a BookScraper class that implements PageProcessor. The process() and getSite() methods define how to scrape the page and the crawler settings, respectively. The line below configures the crawler to retry failed requests up to three times and wait one second between requests to help avoid being blocked:

private Site site = Site.me().setRetryTimes(3).setSleepTime(1000);

The process() method contains the actual scraping logic. It selects all the article elements on the page with the product_pod CSS class. We then loop over the first ten matches and use CSS selectors to extract and print each book's title and price.

In the main() method, we create a new WebMagic spider with our processor, start from the site's homepage, and run it with a single thread.

Let's take a look at the program's output, where we can see the titles and prices of the first 10 books:

17:02:26.460 [main] INFO us.codecraft.webmagic.Spider -- Spider books.toscrape.com started!
Title: A Light in the Attic | Price: £51.77
Title: Tipping the Velvet | Price: £53.74
Title: Soumission | Price: £50.10
Title: Sharp Objects | Price: £47.82
Title: Sapiens: A Brief History of Humankind | Price: £54.23
Title: The Requiem Red | Price: £22.65
Title: The Dirty Little Secrets of Getting Your Dream Job | Price: £33.34
Title: The Coming Woman: A Novel Based on the Life of the Infamous Feminist, Victoria Woodhull | Price: £17.93
Title: The Boys in the Boat: Nine Americans and Their Epic Quest for Gold at the 1936 Berlin Olympics | Price: £22.60
Title: The Black Maria | Price: £52.15
get page: https://books.toscrape.com/

5. Conclusion

In this tutorial, we looked into WebMagic, its architecture, and setup details. WebMagic offers a simple and powerful approach to building web crawlers in Java. Its design allows developers to focus on extracting data rather than writing boilerplate code for HTTP, parsing, and threading.

As seen in the example, with just a few lines of code, we created a working crawler and were able to extract book names and prices.

The code backing this article is available on GitHub.