Integration tests with Testcontainers and Spring Boot 3.1+
In this article we are going to cover the differences between unit and integration tests and describe the Testcontainers support in Spring Boot, focusing on features introduced in version 3.1.
Test distribution models
Most developers have probably seen "the testing pyramid" model showing the distribution of automated tests. The thing is that this model was published around 2009, when there was no Docker and integration tests were often slow, expensive and flaky. In 2018 Spotify proposed another strategy — the Testing Honeycomb:
But let’s start with a question — what’s the difference between unit and integration tests?
Unit tests
Unit tests are usually focused on a single "subject under test". This means that a test should only have a single logical assertion (a single reason to fail).
Usually (at least in the Java world) a unit means a method, but it can also be a class, a module, or… a whole feature! There's no single definition of what a unit is or should be.
Instead of using real dependencies, unit tests make use of “test doubles”, like mocks, spies, stubs, fakes…
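For illustration, here is a minimal sketch of such a unit test, assuming JUnit 5 and Mockito; ApprovalService, ApprovalRepository and their methods are hypothetical names used only for this example:

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class ApprovalServiceTest {

    // the repository is replaced with a test double - no database is involved
    private final ApprovalRepository repository = mock(ApprovalRepository.class);
    private final ApprovalService service = new ApprovalService(repository);

    @Test
    void returnsDescriptionOfExistingApproval() {
        // given: the stub answers with a fixed value
        when(repository.findDescriptionById(1L)).thenReturn("A test approval description");

        // when / then: a single logical assertion on the unit's behaviour
        assertEquals("A test approval description", service.describe(1L));
    }
}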
Why unit testing only is not enough
While it’s relatively easy to write a lot of unit tests with satisfying code coverage, unit tests by themselves might not be enough. These are the common issues with them:
- They often focus too much on the implementation, instead of on behaviour.
- They won’t verify if integration with databases, message brokers or external services really works.
- If not written correctly, they can slow down the development of new features, as every change or refactoring breaks tests.
Integration tests
Integration tests verify the integration between units/modules. Again, there is no single definition, as they can be more “narrow” and focus on testing only the integration part, or “broad” and look more like E2E tests. What’s more, in integration tests we usually don’t use test doubles, but replacements that act more (or exactly) like a real dependency.
In essence, an integration test relies on the behaviour of another system/dependency. Quoting the previously mentioned Spotify article:
What we should aim for instead is Integration Tests, which verify the correctness of our service in a more isolated fashion while focusing on the interaction points and making them very explicit.
Use case
Let’s assume that you have a simple Spring Boot application that exposes a REST API and uses a PostgreSQL database as storage. How do you test a scenario in which, after a successfully handled HTTP request, something should be saved in the database?
Possible options:
- Using a test double, e.g. mocking. It would work, but what does it really test? We mock a repository and assume that something will be saved in the database. The only thing we really test is that a specific, mocked method gets called (see the sketch after this list). Not only is the integration part not covered, but the test is now coupled to a specific method call. What if in the future a developer decides to replace this repository method call with a simple SQL query? From the application client’s perspective nothing changes, but this test is going to fail. Should an integration test fail after a change in an implementation detail?
- Using an in-memory database, like H2. This sounds like a better option, but it’s still not perfect. An application might use some database-specific features, or the database dialect might be different. We can maintain multiple database migrations and have a separate set of migrations (in Flyway or Liquibase) used only for testing, but… is it worth it? On production it’s going to be different anyway, so having such tests does not give the feeling that the code is safe to be deployed to production.
- Starting a database locally (ideally in Docker) for running integration tests on your laptop. That’s a good idea, but there’s some operational overhead: starting a container manually, ensuring a “clean state” before running tests again, etc.
- Setting up a database on CI, dedicated to integration tests. Good, but not perfect, because on CI there might be issues with the database’s connection pool while multiple jobs run in parallel. There might also be other issues if tests use the same test data, and some table constraints might be violated. It can also be challenging to maintain the database schema while database migrations run from different branches.
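To make the drawback of the first option concrete, here is a rough sketch of such a mock-based test (Mockito assumed; ApprovalsApplicationService, ApprovalRepository and the save method are illustrative names, not the actual implementation):

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.jupiter.api.Test;

class ApprovalsApplicationServiceMockTest {

    private final ApprovalRepository repository = mock(ApprovalRepository.class);
    private final ApprovalsApplicationService service = new ApprovalsApplicationService(repository);

    @Test
    void createApproval_savesApproval() {
        service.createApproval("A test approval description");

        // the only thing really verified is that this exact method was called;
        // replacing it with a hand-written SQL query would break the test,
        // even though the behaviour visible to API clients stays the same
        verify(repository).save(any(Approval.class));
    }
}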
Is there any better solution? Let’s finally take a look at the Testcontainers library!
Testcontainers
Testcontainers is an open source library for providing anything that can run in a Docker container, e.g. databases, message brokers, web browsers. How does it work?
(Test workflow diagram: https://testcontainers.com/getting-started/images/test-workflow.png)
The library is responsible for starting and destroying containers used by integration tests. This means that integration tests can use exactly the same versions of databases, message brokers, etc. that are used on production! There are many ready-to-use modules to pick from. The library supports multiple languages and frameworks; in the JVM world you can pick between JUnit 4, JUnit 5 or Spock.
It also has some more advanced features, like waiting strategies to decide whether a container is ready, network and volume bindings, and many more.
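As a rough, framework-agnostic sketch, a container with an explicit waiting strategy could be declared like this (the nginx image, port and timeout are arbitrary choices for illustration):

import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.wait.strategy.Wait;
import org.testcontainers.utility.DockerImageName;

import java.time.Duration;

class WaitStrategyExample {

    public static void main(String[] args) {
        try (GenericContainer<?> nginx = new GenericContainer<>(DockerImageName.parse("nginx:1.25"))
                .withExposedPorts(80)
                // the container is considered "ready" only after "/" responds over HTTP
                .waitingFor(Wait.forHttp("/").withStartupTimeout(Duration.ofSeconds(30)))) {
            nginx.start();
            System.out.println("nginx is reachable at http://" + nginx.getHost() + ":" + nginx.getMappedPort(80));
        }
    }
}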
Spring support
Spring Boot 3.1 comes with improved Testcontainers support (see the Spring.io blog post), which makes using it even easier! I’ve generated a Spring Boot project with the following dependencies (related to Testcontainers):
- org.springframework.boot:spring-boot-testcontainers
- org.testcontainers:junit-jupiter
- org.testcontainers:postgresql
and the project generator produced the following test configuration:
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.springframework.context.annotation.Bean;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.utility.DockerImageName;

@TestConfiguration(proxyBeanMethods = false)
public class TestcontainersConfiguration {

    @Bean
    @ServiceConnection
    PostgreSQLContainer<?> postgresContainer() {
        return new PostgreSQLContainer<>(DockerImageName.parse("postgres:17.0"));
    }
}
The @ServiceConnection annotation discovers the type of the annotated container and creates ConnectionDetails for it. This replaces the need for the @DynamicPropertySource annotation and manually overriding the URL, username, password, etc.
This configuration can either be used in a test (like this generated one):
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Import;

@Import(TestcontainersConfiguration.class)
@SpringBootTest
class SpringBootTestcontainersApplicationTests {

    @Test
    void contextLoads() {
    }
}
Or… you can start the application from a main class in the src/test directory:
import org.springframework.boot.SpringApplication;

public class TestSpringBootTestcontainersApplication {

    public static void main(String[] args) {
        SpringApplication.from(SpringBootTestcontainersApplication::main)
                .with(TestcontainersConfiguration.class)
                .run(args);
    }
}
If you want to quickly start the application, you no longer have to spin up its dependencies manually. Instead of providing dependencies as containers or writing a docker-compose file, you can start this “main” class from the src/test directory (take a look at the “.with(TestcontainersConfiguration.class)” method call), and all required dependencies will start automatically. It may not be the desired long-term solution for development, but it’s a great solution for a quick application setup.
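Depending on your build tool, there should also be a dedicated task for this: the Spring Boot Gradle plugin in 3.1+ provides a bootTestRun task, and the Maven plugin a spring-boot:test-run goal, both of which launch such a test main class directly.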
Example integration test
Based on the provided use case, let’s assume that the application’s domain is approvals, and that a successfully handled HTTP request saves an approval in the database. An example integration test could look as follows:
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Import;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.ResultActions;
import org.springframework.transaction.annotation.Transactional;
import org.testcontainers.junit.jupiter.Testcontainers;
import pl.akolata.testcontainers.TestcontainersConfiguration;
import pl.akolata.testcontainers.approval.api.schema.ApprovalDTO;
import pl.akolata.testcontainers.approval.domain.ApprovalsApplicationService;
import pl.akolata.testcontainers.approval.domain.model.Approval;

import java.util.Optional;

import static org.hamcrest.Matchers.*;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultHandlers.print;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.*;

@Transactional
@Testcontainers
@SpringBootTest
@Import(TestcontainersConfiguration.class)
@AutoConfigureMockMvc
public class ApprovalsControllerTestcontainersConfigImportTest {

    @Autowired
    private MockMvc mvc;

    @Autowired
    private ObjectMapper objectMapper;

    @Autowired
    private ApprovalsApplicationService approvalsApplicationService;

    @Test
    void createApproval_whenCreated_then201() throws Exception {
        ResultActions resultActions = mvc.perform(post("/api/v1/approvals")
                .contentType(MediaType.APPLICATION_JSON_VALUE)
                .content("""
                        {
                          "description": "A test approval description"
                        }
                        """)
        ).andDo(print());

        resultActions.andExpect(status().isCreated());
        resultActions.andExpect(header().string(HttpHeaders.LOCATION, containsString("/api/v1/approvals")));
        resultActions.andExpect(jsonPath("$.id", notNullValue()));
        resultActions.andExpect(jsonPath("$.description", is("A test approval description")));
        resultActions.andExpect(jsonPath("$.status", is("CREATED")));
        resultActions.andExpect(jsonPath("$.createdAt", notNullValue()));
        resultActions.andExpect(jsonPath("$.history[0].id", notNullValue()));
        resultActions.andExpect(jsonPath("$.history[0].status", is("CREATED")));
        resultActions.andExpect(jsonPath("$.history[0].statusAssignedAt", notNullValue()));

        String responseJson = resultActions.andReturn().getResponse().getContentAsString();
        ApprovalDTO approvalDTO = objectMapper.readValue(responseJson, ApprovalDTO.class);

        Optional<Approval> approvalOptional = approvalsApplicationService.getApprovalById(approvalDTO.getId());
        assertTrue(approvalOptional.isPresent());
        Approval approval = approvalOptional.get();
        // further assertions on the approval object read from the database
    }
}
- The @Testcontainers annotation is responsible for managing the containers’ lifecycle.
- The @Import(TestcontainersConfiguration.class) annotation imports the PostgreSQLContainer bean from the TestcontainersConfiguration class.
And that’s it! If you prefer to declare a container manually instead of making it a bean, you can always do that:
@Transactional
@Testcontainers
@SpringBootTest
@AutoConfigureMockMvc
public class ApprovalsControllerTestcontainersTest {

    @Container
    @ServiceConnection
    private static final PostgreSQLContainer<?> POSTGRESQL_CONTAINER = new PostgreSQLContainer<>(DockerImageName.parse("postgres:17.0"))
            .withDatabaseName("testcontainers")
            .withUsername("postgres")
            .withPassword("postgres");

    // tests
}
Prior to Spring Boot 3.1, without the @ServiceConnection annotation, you could use @DynamicPropertySource and override the connection details manually:
@Transactional
@Testcontainers
@SpringBootTest
@AutoConfigureMockMvc
public class ApprovalsControllerTestcontainersDynamicPropertyTest {

    @Container
    private static final PostgreSQLContainer<?> POSTGRESQL_CONTAINER = new PostgreSQLContainer<>(DockerImageName.parse("postgres:17.0"))
            .withDatabaseName("testcontainers")
            .withUsername("postgres")
            .withPassword("postgres");

    @DynamicPropertySource
    static void dynamicProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", POSTGRESQL_CONTAINER::getJdbcUrl);
        registry.add("spring.datasource.username", POSTGRESQL_CONTAINER::getUsername);
        registry.add("spring.datasource.password", POSTGRESQL_CONTAINER::getPassword);
    }

    // tests
}
“Advanced” usage
As it’s very easy to configure Testcontainers in a greenfield project, there are still some aspects worth analyzing in advance:
- Containers lifecycle — containers can be started per test method, per test class, or once for all integration tests in the whole project. It’s good to read about those strategies and pick the one that fits your needs. In Spring Boot applications prior to version 3.1 it was common to use the Singleton Container Pattern and have a base abstract class for all integration tests, or a custom application context initializer (a sketch of this pattern follows after this list). Newer Spring Boot versions make it easier, but it’s still good to understand how many containers are going to start for your tests. I can recommend this great article by Maciej Walkowiak: https://maciejwalkowiak.com/blog/testcontainers-spring-boot-setup .
- CI platform — on some platforms you might need Docker-in-Docker in order to use Testcontainers, so it might be worth consulting your infrastructure team.
- Upgrading dependencies on production — you can easily test if everything is going to work after upgrading a database/message broker/etc. version, by simply updating the container version used by your tests.
- Custom images — it’s possible to use your custom images from a private image registry, so you don’t have to rely on Docker Hub.
- Test data setup — if you need some data for testing to be prepared in advance, you can do it once and save it in your custom image, or build a container on-the-fly using the library’s API. Piotr Przybył (https://softwaregarden.dev/pl/) has many great talks on this topic available on YouTube.
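For reference, here is a minimal sketch of the Singleton Container Pattern mentioned in the first point (class and method names are illustrative): the container is started once in a static initializer and shared by every test class extending the base class.

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.utility.DockerImageName;

@SpringBootTest
public abstract class AbstractIntegrationTest {

    // one static container shared across all subclasses; it is started once
    // and not stopped explicitly - Testcontainers cleans it up when the JVM exits
    static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>(DockerImageName.parse("postgres:17.0"));

    static {
        POSTGRES.start();
    }

    @DynamicPropertySource
    static void databaseProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", POSTGRES::getJdbcUrl);
        registry.add("spring.datasource.username", POSTGRES::getUsername);
        registry.add("spring.datasource.password", POSTGRES::getPassword);
    }
}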
Words by Aleksander Kołata, Senior Engineer in Altimetrik Poland