
Cache Strategies Benchmark

One of the first things we think of when optimizing APIs is caching, but we need to be aware of which cache strategy suits each case best.

About

This repository consists of a load test, built with Artillery, an open-source load-testing toolkit, run against a simple microservice using in-process and distributed cache strategies, with the aim of comparing and measuring the latency impact of each.

You should not decide on a cache strategy by measuring latency alone. Each strategy has its own drawbacks; for example, despite its latency advantage, an in-process cache does not suit environments where cache data must be shared across instances. For a deeper understanding of this subject, I recommend the following article.
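To make the comparison concrete, here is a minimal sketch of the two read paths. This is not the repository's actual service code: it assumes node-cache for the in-process strategy, node-redis v4 for the distributed one, and a hypothetical fetchMessageFromDb fallback.

const NodeCache = require('node-cache');
const { createClient } = require('redis');

const localCache = new NodeCache({ stdTTL: 60 }); // lives in the service's own memory
const redisClient = createClient({ url: 'redis://localhost:6379' }); // assumes node-redis v4; call connect() before use

// Hypothetical stand-in for the real data source.
async function fetchMessageFromDb(id) {
  return { id, text: 'hello' };
}

// In-process read path: a cache hit is a plain in-memory lookup, no I/O at all.
async function getMessageInProcess(id) {
  let message = localCache.get(id);
  if (message === undefined) {
    message = await fetchMessageFromDb(id);
    localCache.set(id, message);
  }
  return message;
}

// Distributed read path: even a cache hit costs a network round trip to Redis,
// but every instance of the service sees the same cached data.
async function getMessageDistributed(id) {
  const cached = await redisClient.get(id);
  if (cached !== null) return JSON.parse(cached);
  const message = await fetchMessageFromDb(id);
  await redisClient.set(id, JSON.stringify(message), { EX: 60 });
  return message;
}

The latency gap the benchmark measures falls out of exactly this difference: a hash lookup in local memory versus a serialized value crossing the network.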

Motivation

A few days ago, while talking with @desk467, he pointed out how an in-process cache strategy can outperform a distributed one in some cases. Based on that conversation, I decided to measure the latency impact on read operations, using node-cache for the in-process strategy and redis for the distributed one, in a simplified environment.

How to run

First, you need npm, Docker, and docker-compose installed

Next, install Artillery

npm install -g artillery
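The helper scripts used below presumably wrap Artillery's CLI; invoked directly, Artillery executes a scenario file with a command of the form (the file name here is a placeholder)

artillery run scenario.yml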

Next, launch the services

docker-compose up -d
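To confirm the containers came up before running the tests, you can check their status

docker-compose ps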

Next, make the script that runs the load test in Docker executable

chmod +x ./scripts/search-message-docker.sh

Next, run the load test for each service

./scripts/search-message-docker.sh <service-name>

For example:

./scripts/search-message-docker.sh in-process-server
./scripts/search-message-docker.sh redis-server 
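When you are done, the services can be stopped and removed with

docker-compose down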

In-Process Cache Strategy Benchmark

(benchmark chart: in-process-cache-benchmark.png)

Redis (Cloud Hosted) Cache Strategy Benchmark

(benchmark chart: redis-cloud-benchmark.png)

Redis (Local) Cache Strategy Benchmark

(benchmark chart: redis-local-benchmark)

License

The MIT License (MIT)

Copyright (c) 2021 Lucas Machado