Analyzing Latency Performance of Different Cache Methods for Microservice Architecture

Authors

  • Nur Ayuni Nor Sobri, Mohamad Aqib Haqmi Abas, Ihsan Mohd Yassin, Megat Syahirul Amin Megat Ali, Nooritawati Md. Tahir, Azlee Zabidi, Zairi Ismael Rizman

Abstract

Nowadays, due to the popularization of the internet, the demand for robust and scalable application systems is increasing rapidly. The backend underlying an application system is required to be scalable so that thousands of users can be served concurrently. With the rise of microservice architecture, load balancing strategies can be used to distribute load evenly between service instances for services that run under high load. Moreover, cache technology is also used heavily to reduce the load on databases. This paper investigates the use and performance of a cache layer, as opposed to a database-only approach, for storing data in a microservice setting where multiple instances of our services are run for load balancing in production. Our results show that, for a conventional architecture running a single service instance, an in-process memory cache is the most beneficial, while for a microservice architecture with a service running on multiple instances, Redis gives the best performance and data integrity.

Published

2022-08-01