
Spring Cache Abstraction as a Hibernate Cache Provider

Many of the projects I’ve worked on over my career have used a Spring/Hibernate stack. Over that time, the Spring/Hibernate integration has greatly improved, making the once tedious and repetitive chore of setting up a new project (and maintaining an existing one through upgrades, expansion, and refactoring) far simpler. Now it’s as easy as going to the Spring Initializr and selecting the right options. In seconds, you get a nicely configured Spring Boot boilerplate and you’re ready for the real work to begin.

Well, most of the setup is simple, at least… setting up the Hibernate cache still isn’t. To get decent performance out of Hibernate, the query cache and/or second-level cache are key. But they’re a pain to configure, and that configuration is almost entirely redundant with the Spring cache configuration.

Spring has a cache abstraction that can be backed by Ehcache, Redis, Memcached, Caffeine, Guava, and a variety of other implementations – including a simple ConcurrentHashMap (which is great for test harnesses). Spring Boot makes it really easy to set up. Instead of configuring the Spring cache and the Hibernate cache entirely separately, why not use the same configuration and the same implementation for both?
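As a sketch of how little Spring Boot needs, here is a minimal `application.properties` for the Spring cache abstraction. The `spring.cache.type` and `spring.cache.caffeine.spec` properties are standard Spring Boot properties; the cache names and sizing values shown are just illustrative:

```properties
# Pick the cache implementation; Spring Boot auto-configures the CacheManager.
# Other valid values include "redis", "jcache", and "simple"
# ("simple" is the ConcurrentHashMap-backed option, handy for tests).
spring.cache.type=caffeine

# Implementation-specific tuning (values here are examples, not recommendations)
spring.cache.caffeine.spec=maximumSize=500,expireAfterWrite=10m
```

With that in place, adding `@EnableCaching` to a configuration class is enough to start using `@Cacheable` on your own service methods.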

That’s what Hibernate SpringCache does. Set up Spring Cache, add the Hibernate SpringCache dependency to your Spring Boot project, tell Hibernate to enable the query cache and/or second level cache, and you’re done.
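Under the assumption that the Hibernate SpringCache dependency registers itself as Hibernate’s cache region factory (the post says only to add the dependency, so the exact artifact coordinates aren’t shown here), the remaining Hibernate side is a couple of standard JPA properties:

```properties
# Standard Hibernate switches, passed through Spring Boot's JPA support.
# Enable the second-level cache and/or the query cache as needed.
spring.jpa.properties.hibernate.cache.use_second_level_cache=true
spring.jpa.properties.hibernate.cache.use_query_cache=true
```

Both `hibernate.cache.use_second_level_cache` and `hibernate.cache.use_query_cache` are standard Hibernate settings; the point of the project is that no separate cache-provider configuration (Ehcache XML, region definitions, and so on) is needed beyond the Spring cache setup you already have.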

Hopefully, Hibernate itself will one day include this functionality. If you’d like to see Hibernate/Spring integration become that much simpler, let your thoughts be known by commenting on the pull request.

CC BY-SA 4.0 Spring Cache Abstraction as a Hibernate Cache Provider by Craig Andrews is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
