Description
Redis is known for its speed, and popular applications like Instagram rely on it for performance. However, when I integrated Redis into my API, it only marginally improved response time: without Redis, my API (backed by MongoDB) averaged 302 ms per request; after adding Redis, that dropped to an average of 298 ms.
I'm now looking for guidance on implementing Redis correctly in my Node.js API. What should I change so that caching produces a meaningful speedup?
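For context, the pattern I'm trying to follow is plain cache-aside. A minimal sketch of my understanding of it, using an in-memory Map as a stand-in for Redis and a made-up `fetchFromDb` function in place of my MongoDB aggregation (both are illustrative only, not my real code):

```javascript
// Cache-aside sketch: the Map stands in for Redis, fetchFromDb for MongoDB.
const cache = new Map();

async function fetchFromDb(key) {
  // pretend this is the slow MongoDB query
  return { key, rows: [1, 2, 3] };
}

async function getPosts(key, ttlMs = 60_000) {
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return { source: 'cache', data: hit.data }; // cache hit: no DB round trip
  }
  const data = await fetchFromDb(key); // cache miss: go to the DB
  cache.set(key, { data, expiresAt: Date.now() + ttlMs }); // then warm the cache
  return { source: 'db', data };
}

(async () => {
  console.log((await getPosts('posts:1')).source); // first call misses -> 'db'
  console.log((await getPosts('posts:1')).source); // second call hits -> 'cache'
})();
```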
// Register listeners before connecting so connection errors are not missed
redisClient.on('connect', () => console.log('::> Redis Client Connected'));
redisClient.on('error', (err) => console.log('<:: Redis Client Error', err));

(async () => {
  await redisClient.connect();
})();

router.get('/posts', async (req, res) => {
  try {
    const numBuckets = 10;
    const cacheTTL = 60;
    const bucketIndex = Math.floor(Math.random() * numBuckets);
    const cacheKey = `posts:${bucketIndex}:${req.query.page || 1}`;

    // node-redis v4 takes the JSON path via an options object
    const cachedData = await redisClient.json.get(cacheKey, { path: '$' });
    if (cachedData) {
      console.log('Data fetched from Redis cache');
      return res.json(cachedData);
    }

    const limit = 30;
    const page = Number(req.query.page) || 1;
    const skip = (page - 1) * limit;

    const result = await User.aggregate([
      { $project: { _id: 0, username: 1, profileImg: 1, posts: 1 } },
      { $unwind: '$posts' },
      { $project: { postImage: '$posts.post', date: '$posts.date', username: 1, profileImg: 1 } },
      { $sort: { [bucketIndex % 2 === 0 ? 'date' : 'postImage']: bucketIndex % 2 === 0 ? -1 : 1 } },
      { $skip: skip },
      { $limit: limit },
    ]);

    await Promise.all([
      redisClient.json.set(cacheKey, '$', result),
      redisClient.expire(cacheKey, cacheTTL),
    ]);
    console.log('Data fetched from MongoDB and cached in Redis');
    res.json(result);
  } catch (err) {
    console.error(err);
    res.status(500).json({ message: 'Internal server error' });
  }
});
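One thing I noticed while writing this up: because `bucketIndex` is chosen at random per request, the same page is spread across `numBuckets` different keys, so a request only hits the cache if its randomly chosen bucket was already warmed by an earlier request. A small sketch of the expected hit rate under that assumption (ignoring TTL expiry; this is my own back-of-the-envelope model, not measured data):

```javascript
// Request n (0-indexed) hits only if one of the n earlier requests already
// warmed the same bucket: P(hit) = 1 - (1 - 1/numBuckets) ** n.
function expectedHitRate(numBuckets, numRequests) {
  let hits = 0;
  for (let n = 0; n < numRequests; n++) {
    hits += 1 - (1 - 1 / numBuckets) ** n;
  }
  return hits / numRequests; // average hit probability over the first requests
}

console.log(expectedHitRate(10, 10).toFixed(2)); // -> "0.35": mostly misses early on
```

So with 10 buckets, roughly two-thirds of the first 10 requests for a page still go to MongoDB, which may explain why the average barely moved.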
Node.js Version
v20
Redis Server Version
Redis Labs
Node Redis Version
4.6.7
Platform
macOS
Logs
No response