Building In-Memory Cache in Go
In software development, performance is a crucial factor. Whether you’re building a web application, a microservice, or any other system, reducing API latency (response times) improves user experience and lowers operational costs (CPU/memory). One effective technique for achieving such gains is in-memory caching. In this blog, we’ll dive into implementing an in-memory cache in Go, both with and without generics.
Introduction
Before going into the implementation, let’s discuss in-memory caching and its advantages.
In-memory caching involves storing frequently accessed data in memory for quick retrieval, rather than fetching it from a slower data store, such as a database, every time it’s needed. This can significantly reduce latency and improve overall system performance.
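As a quick illustration of the idea, here is a minimal cache-aside lookup: check an in-memory map first and fall back to the slow store only on a miss. This is just a sketch; the loadUserFromDB function is a hypothetical stand-in for a real database query, not part of the cache we build below.

```go
package main

import "fmt"

// userCache keeps already-fetched values in memory.
var userCache = map[int]string{}

// loadUserFromDB stands in for a slow data-store query (hypothetical).
func loadUserFromDB(id int) string {
	return fmt.Sprintf("user-%d", id)
}

// getUser serves from the cache when possible, otherwise fetches
// from the "database" and remembers the result.
func getUser(id int) string {
	if name, ok := userCache[id]; ok {
		return name // cache hit: no database round-trip
	}
	name := loadUserFromDB(id) // cache miss: fetch and store
	userCache[id] = name
	return name
}

func main() {
	fmt.Println(getUser(42)) // miss: loads from the "database"
	fmt.Println(getUser(42)) // hit: served from memory
}
```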
Go using it’s lightweight goroutines and channels make it straightforward to build highly concurrent systems, which is ideal for handling cache operations efficiently, especially in scenarios with high read and write throughput.
Note: There are popular packages like go-cache that can be used directly. The purpose of this article is to learn how to implement a working in-memory cache from scratch.
Implementation
Now, let’s explore how to implement a basic in-memory cache in Go. Generics were introduced in Go 1.18, and since they have been around for some time now, I highly recommend exploring and learning more about them.
Using map[string]interface{}
For a cache, we need 4 main functions:
- Get() - to access the value for a key
- Set() - to set the value for a key
- Clear() - to clear/reset the cache
- Delete() - to remove a specific key
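These four operations can be pinned down as a small interface. This is only illustrative (the article binds the methods directly to a concrete struct rather than defining an interface), and mapStore here is a deliberately minimal, non-thread-safe example implementation:

```go
package main

import "fmt"

// Store captures the four cache operations listed above.
type Store interface {
	Set(key string, value interface{})
	Get(key string) (interface{}, bool)
	Delete(key string)
	Clear()
}

// mapStore is a minimal implementation backed by a plain map.
// It is NOT safe for concurrent use; locking comes later.
type mapStore map[string]interface{}

func (m mapStore) Set(key string, value interface{}) { m[key] = value }

func (m mapStore) Get(key string) (interface{}, bool) {
	v, ok := m[key]
	return v, ok
}

func (m mapStore) Delete(key string) { delete(m, key) }

func (m mapStore) Clear() {
	for k := range m {
		delete(m, k)
	}
}

func main() {
	var s Store = mapStore{}
	s.Set("a", 1)
	if v, ok := s.Get("a"); ok {
		fmt.Println("a =", v)
	}
}
```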
We create a Cache struct and bind the four methods defined above. We use map[string]interface{} here since it allows storing a value of any type against a string key. Here’s the code with comments.
package main

import (
	"fmt"
	"sync"
)

// Cache represents an in-memory key-value store.
type Cache struct {
	data map[string]interface{}
	mu   sync.RWMutex
}

// NewCache creates and initializes a new Cache instance.
func NewCache() *Cache {
	return &Cache{
		data: make(map[string]interface{}),
	}
}

// Set adds or updates a key-value pair in the cache.
func (c *Cache) Set(key string, value interface{}) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = value
}

// Get retrieves the value associated with the given key from the cache.
func (c *Cache) Get(key string) (interface{}, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	value, ok := c.data[key]
	return value, ok
}

// Delete removes a key-value pair from the cache.
func (c *Cache) Delete(key string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	delete(c.data, key)
}

// Clear removes all key-value pairs from the cache.
func (c *Cache) Clear() {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data = make(map[string]interface{})
}

func main() {
	cache := NewCache()

	// Adding data to the cache
	cache.Set("key1", "value1")
	cache.Set("key2", 123)

	// Retrieving data from the cache
	if val, ok := cache.Get("key1"); ok {
		fmt.Println("Value for key1:", val)
	}

	// Deleting data from the cache
	cache.Delete("key2")

	// Clearing the cache
	cache.Clear()
}
Nice! We have a working cache. We used a sync.RWMutex to make concurrent access safe: writes take the exclusive lock, while reads can proceed in parallel under the read lock. Feel free to play around with this. Notice that our cache doesn’t support expiry of keys (TTL). Let’s add expiry support, so that after a defined TTL (time to live) an item expires and is removed from the cache.
Cache with Expiry (TTL)
Since each cache entry can have its own TTL, we create a CacheItem struct that holds an interface{} value along with an expiry timestamp.
package main

import (
	"fmt"
	"sync"
	"time"
)

// CacheItem represents an item stored in the cache with its associated TTL.
type CacheItem struct {
	value  interface{}
	expiry time.Time // expiry time for a key
}

// Cache represents an in-memory key-value store with expiry support.
type Cache struct {
	data map[string]CacheItem
	mu   sync.RWMutex
}

// NewCache creates and initializes a new Cache instance.
func NewCache() *Cache {
	return &Cache{
		data: make(map[string]CacheItem),
	}
}

// Set adds or updates a key-value pair in the cache with the given TTL.
func (c *Cache) Set(key string, value interface{}, ttl time.Duration) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = CacheItem{
		value:  value,
		expiry: time.Now().Add(ttl),
	}
}

// Get retrieves the value associated with the given key from the cache.
// It also checks for expiry and removes expired items.
func (c *Cache) Get(key string) (interface{}, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	item, ok := c.data[key]
	if !ok {
		return nil, false
	}
	// item found - check for expiry
	if item.expiry.Before(time.Now()) {
		// remove entry from cache if time is beyond the expiry
		delete(c.data, key)
		return nil, false
	}
	return item.value, true
}

// Delete removes a key-value pair from the cache.
func (c *Cache) Delete(key string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	delete(c.data, key)
}

// Clear removes all key-value pairs from the cache.
func (c *Cache) Clear() {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data = make(map[string]CacheItem)
}

func main() {
	cache := NewCache()

	// Adding data to the cache with TTLs of 2 and 5 seconds
	cache.Set("name", "mohit", 2*time.Second)
	cache.Set("weight", 75, 5*time.Second)

	// Retrieving data from the cache
	if val, ok := cache.Get("name"); ok {
		fmt.Println("Value for name:", val)
	}

	// Wait for some time to see expiry in action
	time.Sleep(3 * time.Second)

	// Retrieving expired data from the cache
	if _, ok := cache.Get("name"); !ok {
		fmt.Println("Name key has expired")
	}

	// Retrieving data before expiry
	if val, ok := cache.Get("weight"); ok {
		fmt.Println("Value for weight before expiry:", val)
	}

	// Wait for some time to see expiry in action
	time.Sleep(3 * time.Second)

	// Retrieving expired data from the cache
	if _, ok := cache.Get("weight"); !ok {
		fmt.Println("Weight key has expired")
	}

	// Deleting data from the cache
	cache.Set("key", "val", 2*time.Second)
	cache.Delete("key")

	// Clearing the cache
	cache.Clear()
}
With this expiry support, our cache implementation becomes more versatile and suitable for a wider range of caching scenarios.
In-memory cache using Generics
Before proceeding with the implementation, if you are new to generics, I recommend starting with the basics of generics and then coming back to this implementation.
Logically it’s similar; instead of interface{}, we leverage generics with a comparable key type and an any value type. Here’s the code for the generics implementation.
package main

import (
	"fmt"
	"sync"
	"time"
)

// Requires Go >= 1.18
// This implementation uses Go generics.

// CacheItem represents an item stored in the cache with its associated TTL.
type CacheItem[T any] struct {
	value  T
	expiry time.Time
}

// Cache represents an in-memory key-value store with expiry support.
type Cache[K comparable, T any] struct {
	data map[K]CacheItem[T] // stores cache items
	mu   sync.RWMutex       // manages concurrent access
}

// NewCache creates and initializes a new Cache instance.
func NewCache[K comparable, T any]() *Cache[K, T] {
	return &Cache[K, T]{
		data: make(map[K]CacheItem[T]),
	}
}

// Set adds or updates a key-value pair in the cache with the given TTL.
func (c *Cache[K, T]) Set(key K, value T, ttl time.Duration) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = CacheItem[T]{
		value:  value,
		expiry: time.Now().Add(ttl),
	}
}

// zeroVal returns the zero value for type T.
func zeroVal[T any]() T {
	var zero T
	return zero
}

// Get retrieves the value associated with the given key from the cache.
// It also checks for expiry and removes expired items.
func (c *Cache[K, T]) Get(key K) (T, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	item, ok := c.data[key]
	if !ok {
		return zeroVal[T](), false
	}
	// item found - check for expiry
	if item.expiry.Before(time.Now()) {
		// remove entry from cache if time is beyond the expiry
		delete(c.data, key)
		return zeroVal[T](), false
	}
	return item.value, true
}

// Delete removes a key-value pair from the cache.
func (c *Cache[K, T]) Delete(key K) {
	c.mu.Lock()
	defer c.mu.Unlock()
	delete(c.data, key)
}

// Clear removes all key-value pairs from the cache.
func (c *Cache[K, T]) Clear() {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data = make(map[K]CacheItem[T])
}

func main() {
	cache := NewCache[string, int]()

	// Adding data to the cache with TTLs of 2 and 5 seconds
	cache.Set("height", 175, 2*time.Second)
	cache.Set("weight", 75, 5*time.Second)

	// Retrieving data from the cache
	if val, ok := cache.Get("height"); ok {
		fmt.Println("Value for height:", val)
	}

	// Wait for some time to see expiry in action
	time.Sleep(3 * time.Second)

	// Retrieving expired data from the cache
	if _, ok := cache.Get("height"); !ok {
		fmt.Println("Height key has expired")
	}

	// Retrieving data before expiry
	if val, ok := cache.Get("weight"); ok {
		fmt.Println("Value for weight before expiry:", val)
	}

	// Wait for some time to see expiry in action
	time.Sleep(3 * time.Second)

	// Retrieving expired data from the cache
	if _, ok := cache.Get("weight"); !ok {
		fmt.Println("Weight key has expired")
	}

	// Deleting data from the cache
	cache.Set("key", 1, 2*time.Second)
	cache.Delete("key")

	// Clearing the cache
	cache.Clear()
}
Awesome! Congrats on building a generic in-memory cache. You can start using this in your Go applications. One thing to note here: we only remove expired keys during Get() calls, so a key can be expired yet still consume memory in the map until it is read again. To solve this, you can run a goroutine at a regular interval that scans for expired keys and removes them from the cache. I’ll leave a full implementation as an exercise; feel free to reach out if you’re stuck or want to share your implementation.
Resources
- Popular Package for in-memory cache in go
- Introduction to Generics
- Using Redis as cache in Go
- Open source cache implementations