Spring Data MongoDB provides a variety of ways to work with a MongoDB database: the low-level MongoReader and MongoWriter APIs, and the higher-level MongoTemplate and MongoOperations APIs that make use of the Query, Criteria and Update DSLs. It also provides a repository-style programming model through the MongoRepository interface, which adds convenient abstractions for working with MongoDB.
In this post, we'll explore how to persist documents with MongoRepository, create custom converters for specific data types, and cascade document operations.
Lombok is used to generate boilerplate code (e.g., getters, setters, builders, etc.) by using annotations. You can learn more about it here.
You'd need a MongoDB instance to persist the data. You can install MongoDB Community Server from here or get a free trial instance at MongoDB Atlas. You can also launch MongoDB as a Docker container. Create a docker-compose.yml file somewhere and add the following details to it.
version: '3'

services:
  mongo:
    image: mongo:latest
    container_name: mongodb_latest
    ports:
      - 27017:27017
    environment:
      MONGO_INITDB_ROOT_USERNAME: gwen
      MONGO_INITDB_ROOT_PASSWORD: stacy
Open the terminal at the location of docker-compose.yml
and execute the following command to launch the container.
docker-compose up -d
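With the container running, you'll also want your Spring Boot application to connect to it. As a sketch, assuming the credentials above and the default port, a minimal connection configuration in src/main/resources/application.yml could look like this (the database name test is just an example; authSource=admin is needed because the root user is created in the admin database):

```yaml
spring:
  data:
    mongodb:
      uri: mongodb://gwen:stacy@localhost:27017/test?authSource=admin
```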
Define a domain
Let's start by defining the domain. The relationship between the Account, User and Session collections can be represented by the following diagram.
A many-to-one relationship in MongoDB can be modeled with either embedded documents or document references. You can add the latter behavior through the @DBRef annotation.
Define an Account class as follows.
// src/main/java/dev/mflash/guides/mongo/domain/Account.java
@Data @Builder
public class Account {
private final @Id @Default String key = UUID.randomUUID().toString();
private @DBRef User user;
private @DBRef @Singular Set<Session> sessions;
private ZonedDateTime created;
}
Similarly, define the User class
// src/main/java/dev/mflash/guides/mongo/domain/User.java
@Data @Builder
public class User {
private final @Id @Default String key = UUID.randomUUID().toString();
private String name;
private String email;
private Locale locale;
private LocalDate dateOfBirth;
}
and the Session class.
// src/main/java/dev/mflash/guides/mongo/domain/Session.java
@Data @Builder
public class Session {
private final @Id @Default String key = UUID.randomUUID().toString();
private String city;
private Locale locale;
private LocalDateTime accessed;
}
Note that we're initializing the key with a random UUID. We'll discuss why this is needed in the cascading section.
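As a quick plain-Java illustration (KeyDemo and the stripped-down SessionSketch class here are hypothetical, assuming Lombok's @Builder.Default simply keeps the field initializer), the key is assigned eagerly, so two objects already have distinct identifiers before any save happens:

```java
import java.util.UUID;

public class KeyDemo {

  // mirrors `private final @Id @Default String key = UUID.randomUUID().toString();`
  static class SessionSketch {
    final String key = UUID.randomUUID().toString();
  }

  public static void main(String[] args) {
    SessionSketch a = new SessionSketch();
    SessionSketch b = new SessionSketch();
    // each instance gets its own key without touching the database
    System.out.println(a.key != null && !a.key.equals(b.key));
  }
}
```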
Create a Repository
Define a repository for the Account by extending the MongoRepository interface.
// src/main/java/dev/mflash/guides/mongo/repository/AccountRepository.java
public interface AccountRepository extends MongoRepository<Account, String> {
Account findDistinctFirstByUser(User user);
List<Account> findBySessions(Session session);
}
MongoRepository extends the CrudRepository interface and thereby provides several CRUD methods (e.g., findAll(), save(), etc.) out of the box. For specific queries, you can declare query methods (using the naming conventions described in the docs); Spring will generate their implementations at runtime.
Testing the AccountRepository
Let's write some tests to check the functionality of the AccountRepository. We'll use assertion methods provided by AssertJ, a popular assertion library that comes bundled with Spring.
// src/test/java/dev/mflash/guides/mongo/repository/AccountRepositoryTest.java
@ExtendWith(SpringExtension.class)
@SpringBootTest class AccountRepositoryTest {
private static final List<User> SAMPLE_USERS = List.of(
User.builder().name("Tina Lawrence").email("tina@example.com").locale(Locale.CANADA).dateOfBirth(
LocalDate.of(1989, Month.JANUARY, 11)).build(),
User.builder().name("Adrian Chase").email("adrian@example.com").locale(Locale.UK).dateOfBirth(
LocalDate.of(1994, Month.APRIL, 23)).build(),
User.builder().name("Mohd Ali").email("mohdali@example.com").locale(Locale.JAPAN).dateOfBirth(
LocalDate.of(1999, Month.OCTOBER, 9)).build()
);
private static final List<Session> SAMPLE_SESSIONS = List.of(
Session.builder().city("Toronto").locale(Locale.CANADA).build(),
Session.builder().city("Los Angeles").locale(Locale.US).build(),
Session.builder().city("London").locale(Locale.UK).build(),
Session.builder().city("Paris").locale(Locale.FRANCE).build(),
Session.builder().city("Tokyo").locale(Locale.JAPAN).build()
);
  private static final List<Account> SAMPLE_ACCOUNTS = List.of(
      Account.builder().user(SAMPLE_USERS.get(0)).session(SAMPLE_SESSIONS.get(0)).session(SAMPLE_SESSIONS.get(1))
          .created(ZonedDateTime.now()).build(),
      Account.builder().user(SAMPLE_USERS.get(1)).session(SAMPLE_SESSIONS.get(1)).session(SAMPLE_SESSIONS.get(2))
          .created(ZonedDateTime.now()).build(),
      Account.builder().user(SAMPLE_USERS.get(2)).session(SAMPLE_SESSIONS.get(4)).session(SAMPLE_SESSIONS.get(3))
          .created(ZonedDateTime.now()).build()
  );

  private @Autowired AccountRepository repository;

  @BeforeEach
  void setUp() {
    repository.deleteAll();
    repository.saveAll(SAMPLE_ACCOUNTS);
  }

  @Test
  @DisplayName("Should find some accounts")
  void shouldFindSomeAccounts() {
    assertThat(repository.count()).isEqualTo(SAMPLE_ACCOUNTS.size());
  }

  @Test
  @DisplayName("Should assign a key on save")
  void shouldAssignAKeyOnSave() {
    assertThat(repository.findAll()).extracting("key").isNotNull();
  }

  @Test
  @DisplayName("Should get a distinct user by first name")
  void shouldGetADistinctUserByFirstName() {
    assertThat(repository.findDistinctFirstByUser(SAMPLE_USERS.get(0)).getUser())
        .isEqualToIgnoringGivenFields(SAMPLE_USERS.get(0), "key");
  }

  @Test
  @DisplayName("Should find some users with a given session")
  void shouldFindSomeUsersWithAGivenSession() {
    assertThat(repository.findBySessions(SAMPLE_SESSIONS.get(1))).isNotEmpty();
  }
}
In the above test, we begin by creating some test data (SAMPLE_USERS, SAMPLE_SESSIONS and SAMPLE_ACCOUNTS), which we then use to exercise several features of the repository.
When you run these tests, the following exception may be thrown:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.time.ZonedDateTime
This happens because Account has a field created of type ZonedDateTime, which can't be converted to a valid MongoDB representation by the available Spring converters. You'll have to tell Spring how to do this conversion by defining a custom converter.
Converters for ZonedDateTime
Spring provides a Converter interface that you can implement for this purpose. We need two converters here: one to convert ZonedDateTime to Date, and the other to convert Date back to ZonedDateTime.
// src/main/java/dev/mflash/guides/mongo/configuration/ZonedDateTimeConverters.java
public class ZonedDateTimeConverters {
public static List<Converter<?, ?>> getConvertersToRegister() {
return List.of(ZonedDateTimeToDateConverter.INSTANCE, DateToZonedDateTimeConverter.INSTANCE);
}
private enum ZonedDateTimeToDateConverter implements Converter<ZonedDateTime, Date> {
INSTANCE;
public @Override Date convert(ZonedDateTime source) {
return Date.from(source.toInstant());
}
}
private enum DateToZonedDateTimeConverter implements Converter<Date, ZonedDateTime> {
INSTANCE;
public @Override ZonedDateTime convert(Date source) {
return source.toInstant().atZone(ZoneOffset.UTC);
}
}
}
In the above ZonedDateTimeConverters implementation, we first define the ZonedDateTimeToDateConverter and DateToZonedDateTimeConverter converters by implementing the Converter interface. Finally, we return a list of these converters through the getConvertersToRegister method.
Also note that we've used UTC as the ZoneOffset here, since MongoDB stores times in UTC by default. You'll have to adjust the offset if you need to store times in a custom timezone.
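To see what these converters do to the zone information, here's a small self-contained sketch (plain Java, no Spring; the ZoneRoundTrip class and the sample Paris timestamp are illustrative) of the round trip a created timestamp takes through the database:

```java
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.util.Date;

public class ZoneRoundTrip {

  public static void main(String[] args) {
    ZonedDateTime paris = ZonedDateTime.of(2020, 1, 1, 12, 0, 0, 0, ZoneId.of("Europe/Paris"));

    // what ZonedDateTimeToDateConverter persists
    Date stored = Date.from(paris.toInstant());
    // what DateToZonedDateTimeConverter reads back
    ZonedDateTime loaded = stored.toInstant().atZone(ZoneOffset.UTC);

    // the instant survives the round trip...
    System.out.println(paris.toInstant().equals(loaded.toInstant()));
    // ...but the original zone is normalized to UTC
    System.out.println(loaded.getOffset());
  }
}
```

In other words, the point in time is preserved exactly, but the original timezone is not stored; anything read back is expressed in UTC.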
Inject these converters through a MongoCustomConversions bean as follows:
// src/main/java/dev/mflash/guides/mongo/configuration/MongoConfiguration.java
@EnableMongoRepositories(MongoConfiguration.REPOSITORY_PACKAGE)
public @Configuration class MongoConfiguration {
static final String REPOSITORY_PACKAGE = "dev.mflash.guides.mongo.repository";
public @Bean MongoCustomConversions customConversions() {
return new MongoCustomConversions(ZonedDateTimeConverters.getConvertersToRegister());
}
}
You’ll be able to run the tests successfully now.
Cascade the document operations
Spring Data MongoDB provides support for lifecycle events through the MongoMappingEvent class. You can use this to write an event listener that performs cascading operations for you.
Define a @Cascade annotation
Let’s start by defining an annotation to indicate that a field should be cascaded.
// src/main/java/dev/mflash/guides/mongo/event/Cascade.java
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface Cascade {
CascadeType value() default CascadeType.ALL;
}
The CascadeType is an enum that denotes the different types of cascading supported by our implementation.
// src/main/java/dev/mflash/guides/mongo/event/CascadeType.java
public enum CascadeType {
ALL, SAVE, DELETE
}
With this, we can pass a CascadeType value to the @Cascade annotation and control the type of cascading we want. By default, both save and delete operations will be cascaded.
Annotate the desired fields with this annotation.
// src/main/java/dev/mflash/guides/mongo/domain/Account.java
@Data @Builder
public class Account {
// Other properties
private @DBRef @Cascade User user;
private @DBRef @Cascade @Singular Set<Session> sessions;
// Other properties
}
Detect the fields to be cascaded
A cascaded object should be associated with an existing document whenever possible. You can check whether such a document exists by looking for the @Id of the document through a FieldCallback.
// src/main/java/dev/mflash/guides/mongo/event/IdentifierCallback.java
public class IdentifierCallback implements FieldCallback {

  private boolean idFound;

  public @Override void doWith(final Field field) throws IllegalArgumentException {
    ReflectionUtils.makeAccessible(field);
    if (field.isAnnotationPresent(Id.class)) {
      idFound = true;
    }
  }

  public boolean isIdFound() {
    return idFound;
  }
}
Since a valid non-null @Id must be present for this to work properly, we need to initialize the key as early as possible. That's why we are initializing the key field of every document with a random UUID.
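Stripped of Spring's ReflectionUtils, the idea behind IdentifierCallback boils down to plain java.lang.reflect. This standalone sketch (with a hypothetical Id annotation and Doc class standing in for Spring's @Id and a domain document) shows the same field scan:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

public class IdScan {

  // hypothetical stand-in for org.springframework.data.annotation.Id
  @Retention(RetentionPolicy.RUNTIME)
  @Target(ElementType.FIELD)
  @interface Id {}

  static class Doc {
    @Id String key = "a-random-uuid";
    String name;
  }

  public static void main(String[] args) {
    boolean idFound = false;
    // what IdentifierCallback.doWith does for each declared field
    for (Field field : Doc.class.getDeclaredFields()) {
      if (field.isAnnotationPresent(Id.class)) {
        idFound = true;
      }
    }
    System.out.println(idFound);
  }
}
```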
To detect the fields to be cascaded, we need to check which of them have been annotated with the @Cascade annotation. For a save cascade, define an implementation of the FieldCallback which performs this check and applies a save operation using a MongoOperations bean.
// src/main/java/dev/mflash/guides/mongo/event/CascadeSaveCallback.java
@RequiredArgsConstructor
public class CascadeSaveCallback implements FieldCallback {
private final Object source;
private final MongoOperations mongoOperations;
  public @Override void doWith(final Field field) throws IllegalArgumentException, IllegalAccessException {
    ReflectionUtils.makeAccessible(field);
    if (field.isAnnotationPresent(DBRef.class) && field.isAnnotationPresent(Cascade.class)) {
      final Object fieldValue = field.get(source);
      if (Objects.nonNull(fieldValue)) {
        final var callback = new IdentifierCallback();
        final CascadeType cascadeType = field.getAnnotation(Cascade.class).value();
        if (cascadeType.equals(CascadeType.SAVE) || cascadeType.equals(CascadeType.ALL)) {
          if (fieldValue instanceof Collection<?>) {
            ((Collection<?>) fieldValue).forEach(mongoOperations::save);
          } else {
            ReflectionUtils.doWithFields(fieldValue.getClass(), callback);
            mongoOperations.save(fieldValue);
          }
        }
      }
    }
  }
}
Similarly, implement a CascadeDeleteCallback
that checks the presence of the @Id
and @Cascade
annotations and applies the remove
operation.
// src/main/java/dev/mflash/guides/mongo/event/CascadeDeleteCallback.java
@RequiredArgsConstructor
public class CascadeDeleteCallback implements FieldCallback {
private final Object source;
private final MongoOperations mongoOperations;
  public @Override void doWith(final Field field) throws IllegalArgumentException, IllegalAccessException {
    ReflectionUtils.makeAccessible(field);
    if (field.isAnnotationPresent(DBRef.class) && field.isAnnotationPresent(Cascade.class)) {
      final Object fieldValue = field.get(source);
      if (Objects.nonNull(fieldValue)) {
        final var callback = new IdentifierCallback();
        final CascadeType cascadeType = field.getAnnotation(Cascade.class).value();
        if (cascadeType.equals(CascadeType.DELETE) || cascadeType.equals(CascadeType.ALL)) {
          if (fieldValue instanceof Collection<?>) {
            ((Collection<?>) fieldValue).forEach(mongoOperations::remove);
          } else {
            ReflectionUtils.doWithFields(fieldValue.getClass(), callback);
            mongoOperations.remove(fieldValue);
          }
        }
      }
    }
  }
}
These callbacks won’t be invoked automatically; you’d need a listener to invoke them.
Invoking the cascade automatically
The AbstractMongoEventListener class provides various callback methods that get invoked during the persistence operations. As mentioned in the docs,
- the onBeforeSave callback method is called before inserting or saving a document in the database; it captures the BeforeSaveEvent containing the document being saved.
- the onBeforeDelete callback method is called before a document is deleted; it captures the BeforeDeleteEvent containing the document about to be deleted.
- the onAfterDelete callback method is called after a document or a set of documents has been deleted; it captures the AfterDeleteEvent containing the document(s) that have been deleted. The references of the documents in the AfterDeleteEvent merely contain the values of id, not the other fields, since they've already been deleted.
Also note that the lifecycle events are emitted only for the parent types. These events won't be emitted for any children unless they're annotated with the @DBRef annotation.
Let's use these callback methods to execute the cascade callbacks implemented earlier. Create an AccountCascadeMongoEventListener class that extends AbstractMongoEventListener for the Account class.
// src/main/java/dev/mflash/guides/mongo/event/AccountCascadeMongoEventListener.java
public class AccountCascadeMongoEventListener extends AbstractMongoEventListener<Account> {
private @Autowired MongoOperations mongoOperations;
private Account deletedAccount;
  public @Override void onBeforeSave(BeforeSaveEvent<Account> event) {
    final Object source = event.getSource();
    ReflectionUtils.doWithFields(source.getClass(), new CascadeSaveCallback(source, mongoOperations));
  }

  public @Override void onBeforeDelete(BeforeDeleteEvent<Account> event) {
    final Object id = Objects.requireNonNull(event.getDocument()).get("_id");
    deletedAccount = mongoOperations.findById(id, Account.class);
  }

  public @Override void onAfterDelete(AfterDeleteEvent<Account> event) {
    ReflectionUtils.doWithFields(Account.class, new CascadeDeleteCallback(deletedAccount, mongoOperations));
  }
}
and inject it as a bean using MongoConfiguration.
// src/main/java/dev/mflash/guides/mongo/configuration/MongoConfiguration.java
@EnableMongoRepositories(MongoConfiguration.REPOSITORY_PACKAGE)
public @Configuration class MongoConfiguration {
static final String REPOSITORY_PACKAGE = "dev.mflash.guides.mongo.repository";
public @Bean AccountCascadeMongoEventListener cascadeMongoEventListener() {
return new AccountCascadeMongoEventListener();
}
public @Bean MongoCustomConversions customConversions() {
return new MongoCustomConversions(ZonedDateTimeConverters.getConvertersToRegister());
}
}
Testing the cascading
To verify that the cascading works, let's write some tests.
// src/test/java/dev/mflash/guides/mongo/repository/AccountCascadeTest.java
@ExtendWith(SpringExtension.class)
@SpringBootTest class AccountCascadeTest {
private static final User SAMPLE_USER = User.builder().name("Jasmine Beck").email("jasmine@example.com").locale(
Locale.FRANCE).dateOfBirth(LocalDate.of(1995, Month.DECEMBER, 12)).build();
private static final Session SAMPLE_SESSION = Session.builder().city("Paris").locale(Locale.FRANCE).build();
private static final Account SAMPLE_ACCOUNT = Account.builder().user(SAMPLE_USER).session(SAMPLE_SESSION).created(
ZonedDateTime.now()).build();
private @Autowired AccountRepository accountRepository;
private @Autowired SessionRepository sessionRepository;
private @Autowired UserRepository userRepository;
private Account savedAccount;
@BeforeEach
void setUp() {
    accountRepository.deleteAll();
    sessionRepository.deleteAll();
    userRepository.deleteAll();
    savedAccount = accountRepository.save(SAMPLE_ACCOUNT);
  }

  @Test
  @DisplayName("Should cascade on save")
  void shouldCascadeOnSave() {
    final User savedUser = savedAccount.getUser();
    final Optional<Session> savedSession = savedAccount.getSessions().stream().findAny();
    final String userId = savedUser.getKey();

    assertThat(userRepository.findById(userId))
        .hasValueSatisfying(user -> assertThat(user).isEqualToIgnoringGivenFields(SAMPLE_USER, "key"));

    if (savedSession.isPresent()) {
      final String sessionId = savedSession.get().getKey();
      assertThat(sessionRepository.findById(sessionId)).isNotEmpty()
          .hasValueSatisfying(session -> assertThat(session).isEqualToIgnoringGivenFields(SAMPLE_SESSION, "key"));
    }

    savedUser.setLocale(Locale.CANADA);
    savedAccount.setUser(savedUser);
    accountRepository.save(savedAccount);

    assertThat(userRepository.findById(userId))
        .hasValueSatisfying(user -> assertThat(user.getLocale()).isEqualTo(Locale.CANADA));

    if (savedSession.isPresent()) {
      final Session modifiedSession = savedSession.get();
      modifiedSession.setCity("Nice");
      savedAccount.setSessions(Set.of(modifiedSession, Session.builder().city("Lyon").locale(Locale.FRANCE).build()));
      Account modifiedAccount = accountRepository.save(savedAccount);

      assertThat(sessionRepository.findById(modifiedSession.getKey())).isNotEmpty()
          .hasValueSatisfying(session -> assertThat(session.getCity()).isEqualTo("Nice"));
      assertThat(modifiedAccount.getSessions().stream().filter(s -> s.getCity().equals("Lyon")).findAny())
          .hasValueSatisfying(session -> assertThat(sessionRepository.findById(session.getKey())).isNotEmpty()
              .hasValueSatisfying(matchedSession -> assertThat(matchedSession.getCity()).isEqualTo("Lyon")));
    }
  }

  @Test
  @DisplayName("Should not cascade on fetch")
  void shouldNotCascadeOnFetch() {
    final String userId = savedAccount.getUser().getKey();
    final Set<Session> sessions = savedAccount.getSessions();
    accountRepository.findById(savedAccount.getKey());

    assertThat(userRepository.findById(userId)).isNotEmpty();
    assertThat(sessions).allSatisfy(session ->
        assertThat(sessionRepository.findById(session.getKey())).isNotEmpty());
  }

  @Test
  @DisplayName("Should cascade on delete")
  void shouldCascadeOnDelete() {
    final Optional<Account> fetchedAccount = accountRepository.findById(savedAccount.getKey());
    accountRepository.deleteById(savedAccount.getKey());

    assertThat(fetchedAccount).hasValueSatisfying(account -> {
      assertThat(userRepository.findById(account.getUser().getKey())).isEmpty();
      assertThat(account.getSessions())
          .allSatisfy(session -> assertThat(sessionRepository.findById(session.getKey())).isEmpty());
    });
  }
}
In this test class,
- we define some test data: SAMPLE_USER, SAMPLE_SESSION and SAMPLE_ACCOUNT.
- we implement a setUp method that removes all the documents from the repositories and saves the SAMPLE_ACCOUNT before each test is run.
- the test shouldCascadeOnSave verifies that the @Cascade annotation correctly persists the SAMPLE_USER and SAMPLE_SESSION documents when the SAMPLE_ACCOUNT is saved. It then updates the User document in the SAMPLE_ACCOUNT and checks that the same update appears in the corresponding document of the User collection for the given id. The same check is done for the Session document.
- the test shouldNotCascadeOnFetch verifies that no cascade happens when a document is fetched from the database.
- the test shouldCascadeOnDelete verifies that once the SAMPLE_ACCOUNT has been deleted, the corresponding User and Session documents have also been deleted.
Source code
Corrections
- Thanks @CyberpunkPerson for pointing out that onAfterConvert can delete objects not only when the parent is deleted but also when the parent is fetched 🤦‍♀️! I've patched the code and updated the article with a fix.