There are quite a few things we can do to improve our application's performance. We can sometimes make our code faster and optimize our database queries. To make our API even more performant, we might want to avoid running some of the code altogether. Accessing data stored in the database is often time-consuming, and it adds up if we also perform some data manipulation on top of it before returning it to the user. Fortunately, we can improve our approach with caching. By storing a copy of the data in a way that lets it be served faster, we can speed up responses significantly.

## Implementing in-memory cache

The most straightforward way to implement a cache is to store the data in the memory of our application. Under the hood, NestJS uses the cache-manager library, so we need to start by installing it:

```
npm install cache-manager
```

To enable the cache, we need to import the CacheModule in our app.

posts.module.ts

```typescript
import { CacheModule, Module } from '@nestjs/common';
import PostsController from './posts.controller';
import PostsService from './posts.service';
import Post from './post.entity';
import { TypeOrmModule } from '@nestjs/typeorm';
import { SearchModule } from '../search/search.module';
import PostsSearchService from './postsSearch.service';

@Module({
  imports: [
    CacheModule.register(),
    TypeOrmModule.forFeature([Post]),
    SearchModule,
  ],
  controllers: [PostsController],
  providers: [PostsService, PostsSearchService],
})
export class PostsModule {}
```

By default, a cached response expires after 5 seconds, and the cache holds a maximum of 100 elements. We can change those values by passing additional options to the CacheModule.register() method:

```typescript
CacheModule.register({
  ttl: 5,
  max: 100,
});
```

## Automatically caching responses

NestJS comes equipped with the CacheInterceptor. With it, NestJS handles the cache automatically.
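Before we look at the interceptor, it may help to build some intuition for what the ttl and max options mean in practice. The sketch below is a deliberately simplified model of a bounded, expiring key-value store; it is purely illustrative and is not how cache-manager is actually implemented (the real store is asynchronous and configurable).

```typescript
// Illustrative model of an in-memory cache with a ttl (in seconds)
// and a max entry count. NOT the cache-manager implementation.
type Entry<T> = { value: T; expiresAt: number };

class SimpleCache<T> {
  private store = new Map<string, Entry<T>>();

  constructor(private ttlSeconds = 5, private max = 100) {}

  set(key: string, value: T): void {
    // Once the size limit is reached, evict the oldest entry
    if (!this.store.has(key) && this.store.size >= this.max) {
      const oldestKey = this.store.keys().next().value;
      if (oldestKey !== undefined) {
        this.store.delete(oldestKey);
      }
    }
    this.store.set(key, {
      value,
      expiresAt: Date.now() + this.ttlSeconds * 1000,
    });
  }

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) {
      return undefined;
    }
    // An expired entry counts as a cache miss and is removed
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}
```

The key takeaway is that both limits cause entries to silently disappear: ttl by age, max by count. Everything that follows in this article works within those two constraints.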
posts.controller.ts

```typescript
import {
  Controller,
  Get,
  UseInterceptors,
  ClassSerializerInterceptor,
  Query,
  CacheInterceptor,
} from '@nestjs/common';
import PostsService from './posts.service';
import { PaginationParams } from '../utils/types/paginationParams';

@Controller('posts')
@UseInterceptors(ClassSerializerInterceptor)
export default class PostsController {
  constructor(
    private readonly postsService: PostsService
  ) {}

  @UseInterceptors(CacheInterceptor)
  @Get()
  async getPosts(
    @Query('search') search: string,
    @Query() { offset, limit, startId }: PaginationParams
  ) {
    if (search) {
      return this.postsService.searchForPosts(search, offset, limit, startId);
    }
    return this.postsService.getAllPosts(offset, limit, startId);
  }

  // ...
}
```

If we call this endpoint twice, NestJS does not invoke the getPosts method a second time. Instead, it returns the cached data.

In the twelfth part of this series, we integrated Elasticsearch into our application, and in the seventeenth part, we added pagination. Therefore, our /posts endpoint accepts quite a few query params. A very important thing that the official documentation does not mention is that NestJS stores the response of the getPosts method separately for every combination of query params. Thanks to that, calling /posts?search=Hello and /posts?search=World can yield different responses.

Although above we use the CacheInterceptor for a particular endpoint, we can also apply it to a whole controller, or even a whole module. Using a cache might sometimes cause us to return stale data, though, so we need to be careful about which endpoints we cache.

## Using the cache store manually

Aside from relying on the automatic cache, we can also interact with the cache manually. Let's inject it into our service.
posts.service.ts

```typescript
import { CACHE_MANAGER, Inject, Injectable } from '@nestjs/common';
import { Cache } from 'cache-manager';
import Post from './post.entity';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository } from 'typeorm';
import PostsSearchService from './postsSearch.service';

@Injectable()
export default class PostsService {
  constructor(
    @InjectRepository(Post)
    private postsRepository: Repository<Post>,
    private postsSearchService: PostsSearchService,
    @Inject(CACHE_MANAGER) private cacheManager: Cache
  ) {}

  // ...
}
```

An important concept to grasp is that the cache manager provides a key-value store. We can:

- retrieve values using the cacheManager.get('key') method,
- add items using cacheManager.set('key', 'value'),
- remove elements with cacheManager.del('key'),
- clear the whole cache using cacheManager.reset().

It can come in handy for more sophisticated cases, and we can even use it together with the automatic cache.

## Invalidating cache

If we would like to increase the time for which our cache lives, we need to figure out a way to invalidate it. If we cache the list of our posts, we need to refresh it every time a post is added, modified, or removed.

To use the cacheManager.del function to remove the cache, we need to know the key. Under the hood, the CacheInterceptor creates a key for every route we cache. This means that it creates separate cache keys for /posts and /posts?search=Hello.

Instead of relying on the CacheInterceptor to generate a key for every route, we can define it ourselves with the @CacheKey decorator. We can also use @CacheTTL to increase the time during which the cache lives.
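The get/set/del methods described above enable the classic cache-aside pattern: check the cache first, fall back to the data source on a miss, then store the result for the next caller. The sketch below models that pattern with a hypothetical getOrSet helper and a Map-backed stand-in for the real store; neither is part of NestJS or cache-manager.

```typescript
// A minimal cache-aside helper over a get/set interface similar in shape
// to cache-manager's. KeyValueCache and getOrSet are illustrative names.
interface KeyValueCache {
  get<T>(key: string): Promise<T | undefined>;
  set<T>(key: string, value: T): Promise<void>;
}

async function getOrSet<T>(
  cache: KeyValueCache,
  key: string,
  produce: () => Promise<T>,
): Promise<T> {
  const cached = await cache.get<T>(key);
  if (cached !== undefined) {
    return cached; // cache hit: skip the expensive call entirely
  }
  const value = await produce(); // cache miss: e.g. query the database
  await cache.set(key, value);
  return value;
}

// Map-backed stand-in for the real store, for demonstration only
const mapCache: KeyValueCache = (() => {
  const store = new Map<string, unknown>();
  return {
    async get<T>(key: string) {
      return store.get(key) as T | undefined;
    },
    async set<T>(key: string, value: T) {
      store.set(key, value);
    },
  };
})();
```

Wrapping an expensive call in such a helper means the producer runs only on a miss; every subsequent call within the TTL is served straight from the store.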
postsCacheKey.constant.ts

```typescript
export const GET_POSTS_CACHE_KEY = 'GET_POSTS_CACHE';
```

posts.controller.ts

```typescript
@UseInterceptors(CacheInterceptor)
@CacheKey(GET_POSTS_CACHE_KEY)
@CacheTTL(120)
@Get()
async getPosts(
  @Query('search') search: string,
  @Query() { offset, limit, startId }: PaginationParams
) {
  if (search) {
    return this.postsService.searchForPosts(search, offset, limit, startId);
  }
  return this.postsService.getAllPosts(offset, limit, startId);
}
```

The above creates a big issue, though. Because our custom key is now always used for the getPosts method, different query parameters yield the same result. Both /posts and /posts?search=Hello now use the same cache.

To fix this, we need to extend the CacheInterceptor class and change its behavior slightly. The trackBy method of the CacheInterceptor returns the key that is used within the store. Instead of returning just the cache key, let's append the query params to it. To view the original trackBy method, check out this file in the repository.

httpCache.interceptor.ts

```typescript
import {
  CACHE_KEY_METADATA,
  CacheInterceptor,
  ExecutionContext,
  Injectable,
} from '@nestjs/common';

@Injectable()
export class HttpCacheInterceptor extends CacheInterceptor {
  trackBy(context: ExecutionContext): string | undefined {
    const cacheKey = this.reflector.get(
      CACHE_KEY_METADATA,
      context.getHandler(),
    );

    if (cacheKey) {
      const request = context.switchToHttp().getRequest();
      return `${cacheKey}-${request._parsedUrl.query}`;
    }

    return super.trackBy(context);
  }
}
```

The request._parsedUrl property is created by the parseurl library.

If we don't provide the @CacheKey decorator with a key, NestJS uses the original trackBy method through super.trackBy(context). Otherwise, the HttpCacheInterceptor creates keys such as GET_POSTS_CACHE-null and GET_POSTS_CACHE-search=Hello.
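The key format our interceptor produces can be modeled as a pure function of the custom cache key and the raw query string: when there are no query params, parseurl reports the query as null, which the template literal renders as the text "null". The buildCacheKey helper below is just that logic extracted for illustration; it is not a NestJS API.

```typescript
// Models the key format produced by the custom trackBy above:
// the cache key, a dash, and the raw query string (or "null").
// buildCacheKey is an illustrative helper, not part of NestJS.
function buildCacheKey(cacheKey: string, queryString: string | null): string {
  return `${cacheKey}-${queryString}`;
}
```

Because the whole query string is part of the key, every distinct combination of search, offset, limit, and startId gets its own cache entry again, while all entries still share the GET_POSTS_CACHE prefix we can match on when invalidating.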
Now we can create a clearCache method and use it when we create, update, and delete posts.

posts.service.ts

```typescript
import { CACHE_MANAGER, Inject, Injectable } from '@nestjs/common';
import CreatePostDto from './dto/createPost.dto';
import Post from './post.entity';
import UpdatePostDto from './dto/updatePost.dto';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository } from 'typeorm';
import PostNotFoundException from './exceptions/postNotFound.exception';
import User from '../users/user.entity';
import PostsSearchService from './postsSearch.service';
import { Cache } from 'cache-manager';
import { GET_POSTS_CACHE_KEY } from './postsCacheKey.constant';

@Injectable()
export default class PostsService {
  constructor(
    @InjectRepository(Post)
    private postsRepository: Repository<Post>,
    private postsSearchService: PostsSearchService,
    @Inject(CACHE_MANAGER) private cacheManager: Cache
  ) {}

  async clearCache() {
    const keys: string[] = await this.cacheManager.store.keys();
    keys.forEach((key) => {
      if (key.startsWith(GET_POSTS_CACHE_KEY)) {
        this.cacheManager.del(key);
      }
    });
  }

  async createPost(post: CreatePostDto, user: User) {
    const newPost = await this.postsRepository.create({
      ...post,
      author: user,
    });
    await this.postsRepository.save(newPost);
    this.postsSearchService.indexPost(newPost);
    await this.clearCache();
    return newPost;
  }

  async updatePost(id: number, post: UpdatePostDto) {
    await this.postsRepository.update(id, post);
    const updatedPost = await this.postsRepository.findOne(id, {
      relations: ['author'],
    });
    if (updatedPost) {
      await this.postsSearchService.update(updatedPost);
      await this.clearCache();
      return updatedPost;
    }
    throw new PostNotFoundException(id);
  }

  async deletePost(id: number) {
    const deleteResponse = await this.postsRepository.delete(id);
    if (!deleteResponse.affected) {
      throw new PostNotFoundException(id);
    }
    await this.postsSearchService.remove(id);
    await this.clearCache();
  }

  // ...
}
```

By doing the above, we invalidate our cache whenever the list of posts changes.
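The core of the clearCache logic boils down to a prefix match: given every key in the store, delete those that begin with our constant. Isolated as a pure function (selectKeysToInvalidate is an illustrative name, not a cache-manager API), the selection step looks like this:

```typescript
// Given all keys currently in the store, pick the ones that belong to
// our posts cache by prefix. Illustrative helper, not a cache-manager API.
function selectKeysToInvalidate(keys: string[], prefix: string): string[] {
  return keys.filter((key) => key.startsWith(prefix));
}
```

Matching on the prefix is what makes the custom key format from the previous section pay off: one clearCache call removes the cached result for every query-param combination of /posts, while entries belonging to other endpoints are left untouched.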
With that, we can increase the Time To Live (TTL) and improve our application's performance.

## Summary

In this article, we've implemented an in-memory cache, both by using auto-caching and by interacting with the cache store manually. By adjusting the way NestJS tracks cached responses, we've also been able to invalidate the cache appropriately whenever the underlying data changes.