This program demonstrates the design and implementation of an LRU cache in Go. An LRU cache discards the least recently used items first and is commonly used to keep memory usage bounded while retaining the entries most likely to be accessed again.

Program Code

package main

import (
    "container/list"
    "fmt"
)

type LRUCache struct {
    capacity int
    cache    map[int]*list.Element
    list     *list.List
}

type entry struct {
    key   int
    value int
}

// NewLRUCache creates a new LRU Cache
func NewLRUCache(capacity int) *LRUCache {
    return &LRUCache{
        capacity: capacity,
        cache:    make(map[int]*list.Element),
        list:     list.New(),
    }
}

// Get returns the value for key if it exists in the cache, and -1 otherwise. Values are assumed to be positive, so -1 unambiguously signals a miss.
func (c *LRUCache) Get(key int) int {
    if ele, found := c.cache[key]; found {
        c.list.MoveToFront(ele)
        return ele.Value.(*entry).value
    }
    return -1
}

// Put inserts or updates the value for key. When the cache is at capacity, it evicts the least recently used item before inserting a new one.
func (c *LRUCache) Put(key int, value int) {
    if ele, found := c.cache[key]; found {
        c.list.MoveToFront(ele)
        ele.Value.(*entry).value = value
        return
    }

    // Evict the least recently used entry (the back of the list) when at capacity.
    if c.list.Len() >= c.capacity {
        if back := c.list.Back(); back != nil {
            c.list.Remove(back)
            delete(c.cache, back.Value.(*entry).key)
        }
    }

    ele := c.list.PushFront(&entry{key, value})
    c.cache[key] = ele
}

func main() {
    cache := NewLRUCache(2)
    cache.Put(1, 1)
    cache.Put(2, 2)
    fmt.Println(cache.Get(1)) // returns 1
    cache.Put(3, 3)           // evicts key 2
    fmt.Println(cache.Get(2)) // returns -1 (not found)
    cache.Put(4, 4)           // evicts key 1
    fmt.Println(cache.Get(1)) // returns -1 (not found)
    fmt.Println(cache.Get(3)) // returns 3
    fmt.Println(cache.Get(4)) // returns 4
}

Explanation

  • Structure: The LRUCache struct has an integer for capacity, a hash map for O(1) access, and a doubly linked list to maintain the order of usage.
  • Cache Operations:
    • Get operation accesses an element, and if it exists, moves it to the front of the list to mark it as the most recently used.
    • Put operation checks if the element exists; if it does, it updates the value and moves it to the front. If not, it adds a new element. If the cache is at capacity, it removes the least recently used item from the back of the list.
  • Doubly Linked List: The list supports efficient removals from both the front and the back, which is what keeps Get and Put at O(1); a small test sketch of this behavior follows the list.
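
The sketch below is a minimal test of the behavior described above; it is not part of the original program. It assumes the code above is saved as lrucache.go and that a hypothetical lrucache_test.go sits next to it in the same package, with the expected values simply mirroring the comments in the main function.

package main

import "testing"

// TestLRUCacheEviction exercises a capacity-2 cache: inserting a third key
// evicts the least recently used one, and Get refreshes recency.
func TestLRUCacheEviction(t *testing.T) {
    cache := NewLRUCache(2)
    cache.Put(1, 1)
    cache.Put(2, 2)

    if got := cache.Get(1); got != 1 { // key 1 becomes the most recently used
        t.Fatalf("Get(1) = %d, want 1", got)
    }

    cache.Put(3, 3) // key 2 is now the least recently used and is evicted
    if got := cache.Get(2); got != -1 {
        t.Fatalf("Get(2) = %d, want -1 after eviction", got)
    }

    cache.Put(4, 4) // evicts key 1
    if got := cache.Get(1); got != -1 {
        t.Fatalf("Get(1) = %d, want -1 after eviction", got)
    }
    if got := cache.Get(3); got != 3 {
        t.Fatalf("Get(3) = %d, want 3", got)
    }
    if got := cache.Get(4); got != 4 {
        t.Fatalf("Get(4) = %d, want 4", got)
    }
}

Run it with go test from the same directory (initializing a module with go mod init first, if one does not already exist).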


Running the Program

To run this program:

  1. Save the code in a file with a .go extension, like lrucache.go.
  2. Open a terminal or command prompt.
  3. Navigate to the directory where the file is saved.
  4. Type go run lrucache.go to execute the program.

The output will demonstrate how the LRU cache operates, showing the results of various Get and Put operations, including cases where items are evicted due to the cache reaching its capacity. Adjust the main function to test different scenarios and cache capacities.
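
For reference, running go run lrucache.go with the main function as written should print:

1
-1
-1
3
4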

By Aditya Bhuyan

I work as a cloud specialist. In addition to being an architect and SRE specialist, I work as a cloud engineer and developer. I have assisted my clients in converting their antiquated programmes into contemporary microservices that operate on various cloud computing platforms such as AWS, GCP, Azure, or VMware Tanzu, as well as orchestration systems such as Docker Swarm or Kubernetes. For over twenty years, I have been employed in the IT sector as a Java developer, J2EE architect, scrum master, and instructor. I write about Cloud Native and Cloud often. Bangalore, India is where my family and I call home. I maintain my physical and mental fitness by doing a lot of yoga and meditation.
