7/20/2025

Building a Modern Java Backend Service (Demo) with Spring Boot: A Hands-On Guide: Part 2

Building Confidence: A Deep Dive into Testing Strategies in the MyFintech Payment Service


In the fast-paced and highly regulated world of financial technology, the reliability and correctness of software are paramount. The MyFintech Payment Service project exemplifies a robust approach to quality assurance through a multi-layered testing strategy. This article explores how the project leverages Unit Testing, Integration Testing (including Spring Boot’s powerful testing utilities and Testcontainers), and a Liquibase-driven CI/CD pipeline for database schema changes to ensure a high degree of confidence in its codebase.



1. Unit Testing: Precision at the Core

Unit testing forms the foundational layer of the MyFintech Payment Service’s testing pyramid. The goal here is to isolate the smallest testable parts of an application—individual methods or classes—and verify their behavior independently of other components. This isolation is achieved primarily through mocking dependencies.

Benefits:

  • Speed: Unit tests execute very quickly, allowing for rapid feedback during development.

  • Isolation: Pinpoints defects precisely within a specific unit of code.

  • Refactoring Confidence: Provides a safety net when refactoring code, ensuring existing functionality isn’t broken.

Implementation in MyFintech Payment Service:

The project extensively uses JUnit 5 and Mockito for unit testing service layer components. For instance, ClientServiceImplTest.java demonstrates how the ClientService is tested in isolation, with its dependencies like ClientRepository and ClientValidator being mocked.

// src/test/java/org/myfintech/payment/service/ClientServiceImplTest.java
package org.myfintech.payment.service;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.when;

import java.time.OffsetDateTime;
import java.util.List;
import java.util.Optional;

import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mapstruct.factory.Mappers;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Spy;
import org.mockito.junit.jupiter.MockitoExtension;
import org.myfintech.payment.domain.ClientCreateDTO;
import org.myfintech.payment.domain.ClientDTO;
import org.myfintech.payment.entity.Client;
import org.myfintech.payment.mapper.ClientMapper;
import org.myfintech.payment.repository.ClientRepository;
import org.myfintech.payment.service.impl.ClientServiceImpl;
import org.myfintech.payment.validator.ClientValidator;

@ExtendWith(MockitoExtension.class)
public class ClientServiceImplTest {

    @Mock
    private ClientRepository clientRepository;
   
    @Mock
    private ClientValidator restValidator;

    @Spy
    private ClientMapper clientMapper = Mappers.getMapper(ClientMapper.class);

    @InjectMocks
    private ClientServiceImpl clientService;

    private Client client;
    private ClientDTO clientDTO;
    private ClientCreateDTO clientCreateDTO;

    @BeforeEach
    void setUp() {
        OffsetDateTime now = OffsetDateTime.now();
        client = new Client(1L, now, now, "Acme");
        clientDTO = new ClientDTO(1L, "Acme");
        clientCreateDTO = new ClientCreateDTO("Acme");
    }

    @Test
    void shouldReturnAllClients() {
        when(clientRepository.findAll()).thenReturn(List.of(client));
        List<ClientDTO> result = clientService.findAll();
        assertEquals(1, result.size());
        assertEquals("Acme", result.get(0).clientName());
    }

    @Test
    void shouldReturnClientById() {
        when(clientRepository.findById(1L)).thenReturn(Optional.of(client));
        ClientDTO result = clientService.findById(1L);
        assertEquals("Acme", result.clientName());
    }

    @Test
    void shouldSaveClient() {
        when(clientRepository.save(any())).thenReturn(client);
        ClientDTO result = clientService.save(clientCreateDTO);
        assertEquals("Acme", result.clientName());
    }

    @Test
    void shouldUpdateClient() {
        when(clientRepository.findById(1L)).thenReturn(Optional.of(client));
        ClientDTO result = clientService.update(1L, clientDTO);
        assertEquals("Acme", result.clientName());
    }
}
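
To see what Mockito automates in the test above, the isolation idea can be reduced to a framework-free sketch. The types below are hypothetical simplifications, not the project's real classes: a hand-rolled stub plays the role of the repository so the service logic runs without any database.

```java
import java.util.Optional;

public class HandRolledMockDemo {

    // Hypothetical, simplified stand-in for ClientRepository
    interface ClientRepo {
        Optional<String> findNameById(long id);
    }

    // Hypothetical, simplified stand-in for ClientServiceImpl
    static class ClientLookupService {
        private final ClientRepo repo;

        ClientLookupService(ClientRepo repo) {
            this.repo = repo;
        }

        String clientNameOrDefault(long id) {
            return repo.findNameById(id).orElse("UNKNOWN");
        }
    }

    public static void main(String[] args) {
        // The "mock": canned answers, no I/O -- this is what
        // when(clientRepository.findById(...)).thenReturn(...) produces for you
        ClientRepo stub = id -> id == 1L ? Optional.of("Acme") : Optional.empty();
        ClientLookupService service = new ClientLookupService(stub);

        System.out.println(service.clientNameOrDefault(1L)); // Acme
        System.out.println(service.clientNameOrDefault(2L)); // UNKNOWN
    }
}
```

Mockito generates such stand-ins reflectively and layers stubbing (`when`/`thenReturn`) and interaction verification (`verify`) on top, which is what the real test class relies on.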




2. Integration Testing: Verifying Component Interactions

Integration tests verify that different modules or services work together as expected. In a Spring Boot application, this often means testing the interaction between controllers, services, and repositories.

Benefits:

  • Component Contract Verification: Ensures that components adhere to their defined interfaces and interact correctly.

  • Early Detection of Integration Bugs: Catches issues that might not be apparent in isolated unit tests.

  • Closer to Real-World Scenarios: Provides higher confidence as it tests a larger slice of the application.

Spring Boot’s @SpringBootTest and MockMvc:

@SpringBootTest loads the entire Spring application context, making it suitable for end-to-end integration tests of an API layer. MockMvc allows simulating HTTP requests without starting a full HTTP server, making these tests faster than true end-to-end tests.

//src/test/java/org/myfintech/payment/integration/ClientControllerIntegrationTest.java
package org.myfintech.payment.integration;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.myfintech.payment.PaymentApplication;
import org.myfintech.payment.domain.ClientCreateDTO;
import org.myfintech.payment.domain.ClientDTO;
import org.myfintech.payment.repository.ClientRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.transaction.annotation.Transactional;

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.*;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.*;
import static org.hamcrest.Matchers.*;

@SpringBootTest(classes = PaymentApplication.class, webEnvironment = SpringBootTest.WebEnvironment.MOCK)
@AutoConfigureMockMvc
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_EACH_TEST_METHOD) // Ensures clean state between tests
public class ClientControllerIntegrationTest {

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private ObjectMapper objectMapper; // For converting objects to JSON

    @Autowired
    private ClientRepository clientRepository; // To verify direct DB state if needed

    @BeforeEach
    @Transactional // Ensure setup runs in a transaction and rolls back
    void setUp() {
        // Clear data before each test to ensure test isolation
        clientRepository.deleteAll();
    }

    @Test
    @Transactional // Ensures the test runs in a transaction and rolls back
    void getAllClients_shouldReturnEmptyList_whenNoClientsExist() throws Exception {
        mockMvc.perform(get("/api/v1/clients")
                .contentType(MediaType.APPLICATION_JSON))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$").isEmpty());
    }

    @Test
    @Transactional
    void getAllClients_shouldReturnClients() throws Exception {
        // Arrange: create a client through the POST endpoint
        ClientCreateDTO newClient1 = new ClientCreateDTO("Client A");
        createClientViaPost(newClient1);

        mockMvc.perform(get("/api/v1/clients")
                .contentType(MediaType.APPLICATION_JSON))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$", hasSize(1)))
                .andExpect(jsonPath("$[0].clientName").value("Client A"));
    }
    // ... (other tests omitted for brevity) ...

    // Helper method to create a client via POST request
    private ClientDTO createClientViaPost(ClientCreateDTO clientCreateDTO) throws Exception {
        String responseContent = mockMvc.perform(post("/api/v1/clients")
                .contentType(MediaType.APPLICATION_JSON)
                .content(objectMapper.writeValueAsString(clientCreateDTO)))
                .andExpect(status().isCreated())
                .andReturn().getResponse().getContentAsString();
        return objectMapper.readValue(responseContent, ClientDTO.class);
    }
}



3. Testcontainers: Bridging the Gap to Reality

A common challenge with integration tests is ensuring that the test environment closely mirrors production. Using in-memory databases can sometimes mask subtle differences in SQL dialects or database-specific behaviors. This is where Testcontainers shine.

What are Testcontainers?

Testcontainers is a Java library that provides lightweight, disposable instances of databases, message brokers, web browsers, or anything else that can run in a Docker container. They are spun up programmatically for your tests and torn down afterward, ensuring a clean state for each test run.

Benefits:

  • High Fidelity: Tests against real database instances (e.g., PostgreSQL, MySQL), not in-memory approximations.

  • Environment Parity: Ensures that tests run in an environment that is as close to production as possible.

  • No Local Setup: Developers don’t need to install and configure databases locally; Docker handles it.

  • Test Isolation: Each test can get its own fresh container, preventing side effects between tests.

Implementation in MyFintech Payment Service:

The MyFintech Payment Service utilizes Testcontainers for its deeper integration tests. The AbstractTestcontainersIntegrationTest class serves as a base for tests that require a real PostgreSQL database.

// src/test/java/org/myfintech/payment/integration/testcontainers/AbstractTestcontainersIntegrationTest.java
package org.myfintech.payment.integration.testcontainers;

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

@SpringBootTest
@Testcontainers // Enables Testcontainers support for JUnit 5
public abstract class AbstractTestcontainersIntegrationTest {

    // Define a PostgreSQL container. With @Container on a static field, it is started
    // before the first test of a class and stopped after the last test of that class.
    @Container
    public static PostgreSQLContainer<?> postgresContainer = new PostgreSQLContainer<>("postgres:16.3")
            .withDatabaseName("testdb")
            .withUsername("test")
            .withPassword("test");

    // Dynamically set Spring Boot properties to connect to the Testcontainers database
    @DynamicPropertySource
    static void setDatasourceProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgresContainer::getJdbcUrl);
        registry.add("spring.datasource.username", postgresContainer::getUsername);
        registry.add("spring.datasource.password", postgresContainer::getPassword);
        registry.add("spring.jpa.hibernate.ddl-auto", () -> "create-drop"); // Ensure schema is created/dropped for tests
        registry.add("spring.test.database.replace", () -> "none"); // Important: tell Spring not to replace the datasource
    }
}

Tests like ClientServiceIntegrationTest.java then extend this base class to run their integration tests against a live PostgreSQL instance:

// src/test/java/org/myfintech/payment/integration/testcontainers/ClientServiceIntegrationTest.java
package org.myfintech.payment.integration.testcontainers;

import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.myfintech.payment.domain.ClientCreateDTO;
import org.myfintech.payment.domain.ClientDTO;
import org.myfintech.payment.repository.ClientRepository;
import org.myfintech.payment.service.ClientService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Transactional;

import java.util.List;

import static org.junit.jupiter.api.Assertions.*;

public class ClientServiceIntegrationTest extends AbstractTestcontainersIntegrationTest {

    @Autowired
    private ClientService clientService;

    @Autowired
    private ClientRepository clientRepository; // For direct DB assertions

    @BeforeEach
    @Transactional // Each test method runs in its own transaction, which is rolled back
    void setUp() {
        // Clean up data before each test for isolation
        clientRepository.deleteAll();
    }

    @Test
    @Transactional
    void findAll_shouldReturnAllClientsFromRealDb() {
        // Arrange
        clientService.save(new ClientCreateDTO("Client A"));
        clientService.save(new ClientCreateDTO("Client B"));

        // Act
        List<ClientDTO> clients = clientService.findAll();

        // Assert
        assertNotNull(clients);
        assertEquals(2, clients.size());
        assertTrue(clients.stream().anyMatch(c -> c.clientName().equals("Client A")));
        assertTrue(clients.stream().anyMatch(c -> c.clientName().equals("Client B")));
    }

    @Test
    @Transactional
    void save_shouldPersistClientToRealDb() {
        // Arrange
        ClientCreateDTO newClient = new ClientCreateDTO("New Real Client");

        // Act
        ClientDTO savedClient = clientService.save(newClient);

        // Assert
        assertNotNull(savedClient.clientId());
        assertEquals("New Real Client", savedClient.clientName());

        // Verify directly from the database
        assertTrue(clientRepository.findById(savedClient.clientId()).isPresent());
        assertEquals("New Real Client", clientRepository.findById(savedClient.clientId()).get().getClientName());
    }
}


As a side note, I have avoided this shared base-class pattern (AbstractTestcontainersIntegrationTest) for the time being, because it produced the error below in my runs. A likely cause: the static @Container is stopped after each test class, while the Spring context cached across test classes keeps a connection pool pointing at the old container.

You can play more with that and share your findings with us 🙂


2025-07-20T15:17:11.054+02:00  WARN 1713279 --- [ionShutdownHook] com.zaxxer.hikari.pool.PoolBase : HikariPool-1 - Failed to validate connection org.postgresql.jdbc.PgConnection@3d900b9c (This connection has been closed.). Possibly consider using a shorter maxLifetime value.
[ERROR] Surefire is going to kill self fork JVM. The exit has elapsed 30 seconds after System.exit(0).
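
One commonly documented workaround is the "singleton container" pattern: start the container once in a static initializer and leave it out of JUnit's management, so it is never stopped between test classes and a Spring context cached across classes keeps valid connections. The sketch below is an assumption on my part, not verified against this project:

```java
// Sketch of the singleton-container pattern (assumption: not the project's actual code).
// No @Testcontainers/@Container here: the container starts once per JVM when the class
// is first loaded, is reused by every test class, and is cleaned up by Testcontainers'
// Ryuk sidecar at JVM exit.
@SpringBootTest
public abstract class AbstractSingletonContainerTest {

    static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:16.3")
                    .withDatabaseName("testdb")
                    .withUsername("test")
                    .withPassword("test");

    static {
        POSTGRES.start(); // started exactly once for the whole test run
    }

    @DynamicPropertySource
    static void datasourceProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", POSTGRES::getJdbcUrl);
        registry.add("spring.datasource.username", POSTGRES::getUsername);
        registry.add("spring.datasource.password", POSTGRES::getPassword);
    }
}
```

Since the container outlives every cached Spring context, the Hikari pool never ends up pointing at a stopped database.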



4. CI/CD for Database Schema Changes with Liquibase and GitHub Actions


Managing database schema changes is a critical aspect of application development. Liquibase provides a version control system for your database, allowing you to track, manage, and apply changes in a controlled manner. Integrating this with GitHub Actions enables automated and reliable deployments of schema updates.

What is Liquibase?

Liquibase is an open-source, database-independent library for tracking, managing, and applying database schema changes. It uses changelog files (XML, YAML, JSON, or SQL) to define changesets, which are unique, versioned units of work.


Benefits:

  • Version Control for Database: Treat your database schema like source code.

  • Automated Deployments: Apply changes reliably across environments.

  • Rollback Capabilities: Easily revert to previous schema states.

  • Database Agnostic: Supports various databases.
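
The rollback capability relies on each changeset optionally declaring how to undo itself. A purely illustrative YAML changeset (not part of the MyFintech changelog) might look like this:

```yaml
# Illustrative changeset with an explicit rollback (not from this project)
databaseChangeLog:
  - changeSet:
      id: add-client-status
      author: dhanuka
      changes:
        - addColumn:
            tableName: client
            columns:
              - column:
                  name: status
                  type: VARCHAR(20)
      rollback:
        - dropColumn:
            tableName: client
            columnName: status
```

Running `mvn liquibase:rollback -Dliquibase.rollbackCount=1` would then undo the most recently applied changeset.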



Test locally with docker-compose


  1. First, boot up the PostgreSQL database in a Docker container:


$ docker-compose up -d db
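
For reference, the `db` service used by these commands could be defined along these lines in docker-compose.yml (a sketch reusing the credentials from the CI workflow; the project's actual file may differ):

```yaml
# docker-compose.yml (illustrative sketch -- the project's actual file may differ)
services:
  db:
    image: postgres:16.3
    environment:
      POSTGRES_DB: myfintechdb
      POSTGRES_USER: myfintech
      POSTGRES_PASSWORD: myfintechpass
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U myfintech"]
      interval: 10s
      timeout: 5s
      retries: 5
```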



  2. Then we can check the logs of the Docker container:


$ docker-compose logs -f db


  3. Now we can test our database changes by running the shell script below, which starts a Liquibase container locally:


$ sh liquibase-test.sh






  4. When we run the same script a second time, it reports that the changes have already been applied:


$ sh liquibase-test.sh



  5. Finally, shut down the Docker containers:


$ docker-compose down




Implementation in MyFintech Payment Service:


The project uses Liquibase with SQL-based changelogs and automates its execution via a GitHub Actions workflow.

Liquibase Changelog Structure:

The db.changelog-master.yaml acts as the main entry point, including individual SQL files for each schema change.

# src/main/resources/db/changelog/db.changelog-master.yaml
databaseChangeLog:
  - include:
      file: db/changelog/changes/001-create-tables.sql
      relativeToChangelogFile: true
  - include:
      file: db/changelog/changes/002-insert-test-data.sql # Example for data insertion
      relativeToChangelogFile: true


The initial schema is defined in 001-create-tables.sql:

-- src/main/resources/db/changelog/changes/001-create-tables.sql
--liquibase formatted sql

--changeset dhanuka:1
-- Client table
CREATE TABLE client (
    id BIGINT PRIMARY KEY,
    client_name VARCHAR(255) NOT NULL,
    created_datetime TIMESTAMPTZ NOT NULL,
    updated_datetime TIMESTAMPTZ NOT NULL
);

--changeset dhanuka:2
-- Contract table
CREATE TABLE contract (
    id BIGINT PRIMARY KEY,
    client_id BIGINT NOT NULL,
    contract_number VARCHAR(100) UNIQUE NOT NULL,
    created_datetime TIMESTAMPTZ NOT NULL,
    updated_datetime TIMESTAMPTZ NOT NULL,
    CONSTRAINT fk_contract_client FOREIGN KEY (client_id) REFERENCES client(id) ON DELETE CASCADE
);

--changeset dhanuka:3
-- Payment Tracking table
CREATE TABLE payment_tracking (
    id BIGINT PRIMARY KEY,
    tracking_number VARCHAR(100) UNIQUE NOT NULL,
    created_datetime TIMESTAMPTZ NOT NULL,
    updated_datetime TIMESTAMPTZ NOT NULL
);

--changeset dhanuka:4
-- Payment table
CREATE TABLE payment (
    id BIGINT PRIMARY KEY,
    payment_date DATE NOT NULL,
    amount DECIMAL(10, 2) NOT NULL,
    type VARCHAR(20) CHECK (type IN ('CREDIT', 'DEBIT', 'TRANSFER','REFUND')) NOT NULL,
    contract_id BIGINT NOT NULL,
    version INTEGER NOT NULL DEFAULT 0,
    created_datetime TIMESTAMPTZ NOT NULL,
    updated_datetime TIMESTAMPTZ NOT NULL,
    tracking_id BIGINT NOT NULL DEFAULT -1,
    CONSTRAINT fk_payment_contract FOREIGN KEY (contract_id) REFERENCES contract(id) ON DELETE CASCADE,
    CONSTRAINT fk_batch_payment_track FOREIGN KEY (tracking_id) REFERENCES payment_tracking(id) ON DELETE CASCADE
);




5. GitHub Actions Workflow for Liquibase

The liquibase.yaml workflow automates the application of these schema changes. It spins up a temporary PostgreSQL database using GitHub Actions services and then executes the liquibase:update Maven goal.




# .github/workflows/liquibase.yaml

name: Deploy SQL Changes

on:
  push:
    branches:
      - main # Trigger on pushes to the main branch

jobs:
  deploy-schema:
    runs-on: ubuntu-latest
   
    # Define a PostgreSQL service for the workflow to deploy to
    services:
      postgres:
        image: postgres:16.3 # Use a specific version for consistency
        env:
          POSTGRES_DB: ${{ secrets.DB_NAME }}
          POSTGRES_USER: ${{ secrets.DB_USER }}
          POSTGRES_PASSWORD: ${{ secrets.DB_PASSWORD }}
        ports:
          - 5432:5432 # Map container port 5432 to host port 5432 (accessible as localhost:5432 from the runner)
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up JDK 21
        uses: actions/setup-java@v4
        with:
          java-version: '21'
          distribution: 'temurin'
          cache: 'maven' # Cache Maven dependencies to speed up builds

      - name: Wait for PostgreSQL to be ready
        run: |
          echo "Waiting for PostgreSQL to start..."
          for i in $(seq 1 10); do
            nc -z localhost 5432 && echo "PostgreSQL is ready!" && exit 0
            echo "Attempt $i/10: PostgreSQL not yet ready. Waiting 5 seconds..."
            sleep 5
          done
          echo "PostgreSQL did not start in time. Aborting."
          exit 1

      - name: Deploy SQL changes with Liquibase
        run: >-
          mvn liquibase:update
          -Dliquibase.url="jdbc:postgresql://localhost:5432/${{ secrets.DB_NAME }}"
          -Dliquibase.username="${{ secrets.DB_USER }}"
          -Dliquibase.password="${{ secrets.DB_PASSWORD }}"



Note on GitHub Secrets: For this workflow to function, you must configure DB_NAME, DB_USER, and DB_PASSWORD as secrets in your GitHub repository settings.






6. Code Coverage with JaCoCo in CI



Measuring code coverage is an essential practice to understand the effectiveness of your test suite. It helps identify untested parts of your codebase, guiding further test development efforts. The MyFintech Payment Service integrates JaCoCo (Java Code Coverage) into its CI pipeline to automatically generate coverage reports.

What is JaCoCo? JaCoCo is a free Java code coverage library distributed under the Eclipse Public License. It provides code coverage metrics for Java applications, including line, branch, method, and class coverage.

Benefits:

  • Identifies Untested Code: Highlights areas of the codebase that are not exercised by tests.

  • Quality Gate: Can be configured to enforce minimum coverage thresholds, preventing code with insufficient tests from being merged.

  • Trend Analysis: Tracking coverage over time helps monitor the health of the test suite.

Implementation in MyFintech Payment Service's CI Pipeline: The integration of JaCoCo is handled through the jacoco-maven-plugin configured in the project's pom.xml. The ci.yml GitHub Actions workflow triggers this process.

In the pom.xml, the jacoco-maven-plugin is set up with two key execution goals:

  • prepare-agent: This goal runs early in the Maven build lifecycle (during the initialize phase). It attaches the JaCoCo agent, which instruments the compiled Java classes by injecting probes that track execution paths. This instrumentation is what makes it possible to collect coverage data during test execution.

  • report: This goal is bound to the verify phase of the Maven lifecycle. After all unit and integration tests have completed (which typically run during the test phase), the report goal processes the data collected by the JaCoCo agent and generates the comprehensive code coverage report.
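
The two executions described above typically look like this in pom.xml (a representative configuration; the project's exact plugin version and any threshold rules are assumptions):

```xml
<!-- Representative jacoco-maven-plugin setup; version is an assumption -->
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.12</version>
  <executions>
    <execution>
      <id>prepare-agent</id>
      <goals>
        <goal>prepare-agent</goal> <!-- attaches the agent before tests run -->
      </goals>
    </execution>
    <execution>
      <id>report</id>
      <phase>verify</phase>
      <goals>
        <goal>report</goal> <!-- writes target/site/jacoco/ after tests -->
      </goals>
    </execution>
  </executions>
</plugin>
```

A third `check` execution with `<rules>` can additionally enforce the minimum-coverage quality gate mentioned earlier.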

The ci.yml GitHub Actions workflow includes a step that executes the mvn clean verify command:




# .github/workflows/ci.yml
name: CI Pipeline

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_DB: myfintechdb
          POSTGRES_USER: myfintech
          POSTGRES_PASSWORD: myfintechpass
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    env:
      SPRING_DATASOURCE_URL: jdbc:postgresql://localhost:5432/myfintechdb
      SPRING_DATASOURCE_USERNAME: myfintech
      SPRING_DATASOURCE_PASSWORD: myfintechpass

    steps:
      - uses: actions/checkout@v4

      - name: Set up JDK 21
        uses: actions/setup-java@v4
        with:
          java-version: '21'
          distribution: 'temurin'

      - name: Cache Maven packages
        uses: actions/cache@v4
        with:
          path: ~/.m2
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven

      - name: Build and test
        run: mvn clean verify # This command triggers JaCoCo's prepare-agent and report goals


When mvn clean verify is executed in the CI pipeline:

  1. JaCoCo's agent prepares the code for coverage data collection.

  2. All unit tests and integration tests (including those using Testcontainers) are run.

  3. The JaCoCo agent records execution data.

  4. Finally, the JaCoCo report is generated, typically as an HTML report at target/site/jacoco/index.html. This report can then be published as a build artifact in GitHub Actions, allowing developers to review coverage results directly from the CI run.
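
Publishing the report as a build artifact can be sketched as one extra workflow step (a hypothetical addition, not present in the ci.yml shown above):

```yaml
      # Hypothetical extra step for ci.yml: publish the JaCoCo HTML report
      - name: Upload JaCoCo coverage report
        uses: actions/upload-artifact@v4
        if: always()   # upload even when tests fail, to aid debugging
        with:
          name: jacoco-report
          path: target/site/jacoco/
```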



By integrating JaCoCo, the MyFintech Payment Service ensures continuous monitoring of test coverage, fostering a culture of thorough testing and contributing to the overall quality and maintainability of the codebase.


   

 
