Setting Up Redis Caching for High-Performance Node.js Applications
Complete guide to installing, configuring, and implementing Redis for caching, sessions, and real-time data in Node.js applications on VPS.
Redis is an in-memory data store that dramatically improves application performance through caching. This guide covers installation, configuration, and implementation patterns for Node.js applications.
Why Redis?
Redis serves data from RAM with sub-millisecond response times, compared to the milliseconds or more a disk-backed database query typically takes. That makes it a strong fit for caching, session management, real-time leaderboards, job queues, and rate limiting.
Prerequisites
- Linux VPS (Ubuntu 20.04+ or Debian 11+)
- Node.js 18+ installed
- Access to port 6379
- Basic understanding of caching patterns
Step 1: Install Redis
Update packages:
sudo apt-get update
sudo apt-get upgrade -y
Install Redis:
sudo apt-get install redis-server -y
Verify installation:
redis-cli --version
Step 2: Start Redis Service
Start the Redis server:
sudo systemctl start redis-server
Enable auto-start:
sudo systemctl enable redis-server
Check service status:
sudo systemctl status redis-server
Step 3: Test Redis Connection
Connect to Redis CLI:
redis-cli
Test basic commands:
ping
# PONG
set mykey "Hello"
get mykey
# "Hello"
del mykey
exit
Step 4: Configure Redis for Production
Edit Redis configuration:
sudo nano /etc/redis/redis.conf
Key configurations:
# Set maximum memory
maxmemory 512mb
# Set eviction policy when max memory is reached
maxmemory-policy allkeys-lru
# Enable persistent storage
save 900 1
save 300 10
save 60 10000
# AOF persistence (more durable)
appendonly yes
appendfilename "appendonly.aof"
# Require password for security
requirepass your_secure_password_here
# Only listen on localhost (if not using dedicated Redis server)
bind 127.0.0.1
# Set port
port 6379
Apply configuration:
sudo systemctl restart redis-server
Verify. With requirepass set, an unauthenticated ping is rejected, which confirms the password is active:
redis-cli ping
# NOAUTH Authentication required.
redis-cli -a your_secure_password_here ping
# PONG
Step 5: Node.js Redis Client Setup
Install Redis client:
npm install redis
Create Redis connection module (src/lib/redis.ts):
import { createClient, RedisClientType } from 'redis';

let redisClient: RedisClientType;

export async function initializeRedis() {
  redisClient = createClient({
    socket: {
      host: process.env.REDIS_HOST || 'localhost',
      port: parseInt(process.env.REDIS_PORT || '6379', 10),
      reconnectStrategy: (retries) => Math.min(retries * 50, 500),
    },
    password: process.env.REDIS_PASSWORD,
  });

  redisClient.on('error', (err) => {
    console.error('Redis Client Error', err);
  });

  redisClient.on('connect', () => {
    console.log('Connected to Redis');
  });

  await redisClient.connect();
  return redisClient;
}

export function getRedis() {
  if (!redisClient) {
    throw new Error('Redis not initialized. Call initializeRedis() first.');
  }
  return redisClient;
}

export async function closeRedis() {
  if (redisClient) {
    await redisClient.quit();
  }
}
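The reconnectStrategy passed to createClient above returns the delay (in milliseconds) before each reconnection attempt. A quick standalone sketch of the schedule that formula produces, to make the backoff behavior concrete:

```typescript
// Mirrors the reconnectStrategy in the module above:
// linear backoff of 50 ms per retry, capped at 500 ms.
function reconnectDelay(retries: number): number {
  return Math.min(retries * 50, 500);
}

// Delays for retries 1..12: 50, 100, ..., 450, then held at 500 ms.
const schedule = Array.from({ length: 12 }, (_, i) => reconnectDelay(i + 1));
console.log(schedule.join(', '));
```

The cap keeps a flapping connection from backing off indefinitely while still easing load on the server during an outage.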
Step 6: Implement Basic Caching
Create caching utility (src/lib/cache.ts):
import { getRedis } from './redis';

const DEFAULT_TTL = 3600; // 1 hour

export async function getCache<T>(key: string): Promise<T | null> {
  try {
    const redis = getRedis();
    const cached = await redis.get(key);
    return cached ? JSON.parse(cached) : null;
  } catch (error) {
    console.error('Cache get error:', error);
    return null;
  }
}

export async function setCache<T>(
  key: string,
  value: T,
  ttl: number = DEFAULT_TTL
): Promise<boolean> {
  try {
    const redis = getRedis();
    await redis.setEx(key, ttl, JSON.stringify(value));
    return true;
  } catch (error) {
    console.error('Cache set error:', error);
    return false;
  }
}

export async function invalidateCache(pattern: string): Promise<number> {
  try {
    const redis = getRedis();
    // KEYS blocks Redis while it scans the entire keyspace. Acceptable for
    // small datasets; prefer SCAN-based iteration on large production instances.
    const keys = await redis.keys(pattern);
    if (keys.length === 0) return 0;
    return await redis.del(keys);
  } catch (error) {
    console.error('Cache invalidation error:', error);
    return 0;
  }
}

export async function incrementCounter(
  key: string,
  amount: number = 1
): Promise<number> {
  try {
    const redis = getRedis();
    return await redis.incrBy(key, amount);
  } catch (error) {
    console.error('Counter increment error:', error);
    return 0;
  }
}
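A common companion to the helpers above is a cache-aside wrapper: check the cache, fall back to a loader on a miss, and store the result. The sketch below is self-contained so it can run anywhere: getOrSet, the Cache interface, and the memCache stub are hypothetical names, with a Map standing in for Redis (TTL ignored for brevity). In application code you would wire it to the getCache/setCache helpers instead.

```typescript
// Minimal client surface the wrapper needs; Redis or any store fits it.
type Cache = {
  get: (key: string) => Promise<string | null>;
  set: (key: string, value: string, ttl: number) => Promise<void>;
};

// Cache-aside: return the cached value, or load, store, and return it.
async function getOrSet<T>(
  cache: Cache,
  key: string,
  ttl: number,
  loader: () => Promise<T>
): Promise<T> {
  const hit = await cache.get(key);
  if (hit !== null) return JSON.parse(hit) as T;
  const fresh = await loader(); // e.g. a database query
  await cache.set(key, JSON.stringify(fresh), ttl);
  return fresh;
}

// In-memory stand-in for Redis so the sketch runs without a server.
const store = new Map<string, string>();
const memCache: Cache = {
  get: async (k) => store.get(k) ?? null,
  set: async (k, v) => { store.set(k, v); },
};

async function demo() {
  let dbCalls = 0;
  const load = async () => { dbCalls++; return { id: 1, name: 'Ada' }; };
  await getOrSet(memCache, 'user:1', 3600, load); // miss -> loader runs
  await getOrSet(memCache, 'user:1', 3600, load); // hit  -> loader skipped
  console.log('db calls:', dbCalls); // db calls: 1
}
demo();
```

Keeping the loader as a callback means every cached read site states its own fallback query, so cache misses never silently return stale or empty data.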
Step 7: Use Caching in Express Routes
Example API endpoint with caching (src/routes/users.ts):
import express, { Request, Response } from 'express';
import { getCache, setCache, invalidateCache } from '../lib/cache';
import { db } from '../database';

const router = express.Router();

// Get user with caching
router.get('/users/:id', async (req: Request, res: Response) => {
  const { id } = req.params;
  const cacheKey = `user:${id}`;

  try {
    // Check cache first
    const cachedUser = await getCache(cacheKey);
    if (cachedUser) {
      return res.json({ data: cachedUser, source: 'cache' });
    }

    // Query database
    const user = await db.query(
      'SELECT id, email, username FROM users WHERE id = $1',
      [id]
    );

    if (!user.rows.length) {
      return res.status(404).json({ error: 'User not found' });
    }

    // Cache for 1 hour
    await setCache(cacheKey, user.rows[0], 3600);

    res.json({ data: user.rows[0], source: 'database' });
  } catch (error) {
    console.error('Error fetching user:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
});

// Create user with cache invalidation
router.post('/users', async (req: Request, res: Response) => {
  const { email, username } = req.body;

  try {
    const result = await db.query(
      'INSERT INTO users (email, username) VALUES ($1, $2) RETURNING *',
      [email, username]
    );

    // Invalidate user caches
    await invalidateCache('user:*');
    await invalidateCache('users:list:*');

    res.status(201).json(result.rows[0]);
  } catch (error) {
    console.error('Error creating user:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
});

export default router;
Step 8: Session Management with Redis
Install session middleware:
npm install express-session connect-redis
Setup sessions (src/middleware/session.ts):
import session from 'express-session';
import RedisStore from 'connect-redis';
import type { Express } from 'express';
import { getRedis } from '../lib/redis';

export function setupSession(app: Express) {
  const redisClient = getRedis();

  const redisStore = new RedisStore({
    client: redisClient,
    prefix: 'session:',
  });

  app.use(
    session({
      store: redisStore,
      secret: process.env.SESSION_SECRET || 'your-secret-key',
      resave: false,
      saveUninitialized: false,
      cookie: {
        secure: process.env.NODE_ENV === 'production',
        httpOnly: true,
        maxAge: 24 * 60 * 60 * 1000, // 24 hours
        sameSite: 'lax',
      },
    })
  );
}
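One detail worth checking here: cookie.maxAge is expressed in milliseconds, while Redis TTLs are in seconds, and connect-redis derives the session key's expiry from the cookie's maxAge when one is set (behavior as of v7; verify against the version you install). The conversion for the 24-hour cookie above:

```typescript
// Cookie maxAge is in milliseconds; Redis TTLs are in seconds.
function maxAgeToTtlSeconds(maxAgeMs: number): number {
  return Math.ceil(maxAgeMs / 1000);
}

const maxAge = 24 * 60 * 60 * 1000; // the 24-hour cookie above
console.log(maxAgeToTtlSeconds(maxAge)); // 86400
```

Keeping the two lifetimes in sync matters: if the Redis key outlived the cookie, stale sessions would accumulate; if it expired first, users would be logged out while their cookie still looked valid.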
Step 9: Rate Limiting with Redis
Create rate limiter (src/middleware/rateLimit.ts):
import { Request, Response, NextFunction } from 'express';
import { getRedis } from '../lib/redis';

// Note: the factory itself must not be async, or app.use() would
// receive a Promise instead of a middleware function.
export function rateLimit(
  windowMs: number = 60000, // 1 minute
  maxRequests: number = 100
) {
  return async (req: Request, res: Response, next: NextFunction) => {
    try {
      const redis = getRedis();
      const key = `rate-limit:${req.ip}`;

      const current = await redis.incr(key);
      if (current === 1) {
        // First request in this window: start the expiry clock
        await redis.expire(key, Math.ceil(windowMs / 1000));
      }

      res.setHeader('X-RateLimit-Limit', maxRequests);
      res.setHeader('X-RateLimit-Remaining', Math.max(0, maxRequests - current));

      if (current > maxRequests) {
        return res.status(429).json({
          error: 'Too many requests',
          retryAfter: await redis.ttl(key),
        });
      }

      next();
    } catch (error) {
      // Fail open: don't block traffic if Redis is unavailable
      console.error('Rate limit error:', error);
      next();
    }
  };
}
Use in Express:
import { rateLimit } from './middleware/rateLimit';
app.use('/api/', rateLimit(60000, 100)); // 100 requests per minute
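The middleware above is a fixed-window limiter: the first INCR in a window starts the expiry clock, and counts beyond maxRequests are rejected until the key expires. The same counting logic as a self-contained sketch, with hit, windows, and an injected now timestamp as illustrative stand-ins for the Redis calls:

```typescript
// Fixed-window counter mirroring the INCR/EXPIRE pattern above.
// `now` is passed in so windows can be simulated without real waiting.
type Window = { count: number; resetAt: number };
const windows = new Map<string, Window>();

function hit(ip: string, windowMs: number, max: number, now: number): boolean {
  const w = windows.get(ip);
  if (!w || now >= w.resetAt) {
    // First request of a fresh window: count = 1, schedule the reset.
    windows.set(ip, { count: 1, resetAt: now + windowMs });
    return true;
  }
  w.count++;
  return w.count <= max; // false -> the middleware would respond 429
}

// 3 requests allowed per 1000 ms window.
console.log(hit('1.2.3.4', 1000, 3, 0));    // true
console.log(hit('1.2.3.4', 1000, 3, 10));   // true
console.log(hit('1.2.3.4', 1000, 3, 20));   // true
console.log(hit('1.2.3.4', 1000, 3, 30));   // false (limit exceeded)
console.log(hit('1.2.3.4', 1000, 3, 1500)); // true (new window)
```

Fixed windows allow short bursts at window boundaries (up to 2x the limit across two adjacent windows); a sliding-window or token-bucket scheme smooths that out if it matters for your API.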
Step 10: Job Queue with Redis
Install Bull queue library:
npm install bull
Create job queue (src/lib/queue.ts):
import Bull from 'bull';

// Bull manages its own Redis connections, so it takes connection
// options directly rather than reusing the shared client from redis.ts.
const redisConfig = {
  redis: {
    host: process.env.REDIS_HOST || 'localhost',
    port: parseInt(process.env.REDIS_PORT || '6379', 10),
    password: process.env.REDIS_PASSWORD,
  },
};

export const emailQueue = new Bull('emails', redisConfig);
export const webhookQueue = new Bull('webhooks', redisConfig);

// Process email jobs
emailQueue.process(async (job) => {
  console.log('Processing email job:', job.id);
  const { to, subject, body } = job.data;

  // Placeholder: replace with your application's mailer
  await sendEmail(to, subject, body);

  return { success: true, emailId: job.id };
});

// Handle failures
emailQueue.on('failed', (job, error) => {
  console.error(`Job ${job.id} failed:`, error);
});

// Add job to queue
export async function queueEmail(
  to: string,
  subject: string,
  body: string
) {
  await emailQueue.add({ to, subject, body }, {
    attempts: 3,
    backoff: {
      type: 'exponential',
      delay: 2000,
    },
  });
}
Use in application:
router.post('/send-email', async (req, res) => {
  const { email, subject, message } = req.body;
  await queueEmail(email, subject, message);
  res.json({ message: 'Email queued for processing' });
});
Choosing Redis Data Types
| Data Type | Use Case | TTL Example |
|---|---|---|
| String | User profiles, API responses | 1 hour |
| Hash | Session data, object storage | 24 hours |
| List | Job queues, activity feeds | 7 days |
| Set | User tags, unique visitors | 30 days |
| Sorted Set | Leaderboards, rate limiting | Variable |
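To get a feel for sorted-set semantics, the sketch below models a leaderboard in plain TypeScript: zadd and topN are hypothetical stand-ins for what ZADD and a reverse ZRANGE with scores do on a live server (with node-redis these map to zAdd and zRangeWithScores, but check the client docs for exact signatures).

```typescript
// In-memory model of a Redis sorted set used as a leaderboard.
type Entry = { member: string; score: number };
const board: Entry[] = [];

function zadd(member: string, score: number): void {
  const existing = board.find((e) => e.member === member);
  if (existing) existing.score = score; // ZADD updates an existing member
  else board.push({ member, score });
}

function topN(n: number): Entry[] {
  // Highest score first, like a reverse ZRANGE with scores.
  return [...board].sort((a, b) => b.score - a.score).slice(0, n);
}

zadd('alice', 120);
zadd('bob', 95);
zadd('carol', 130);
zadd('bob', 140); // bob improves his score
console.log(topN(2).map((e) => e.member).join(', ')); // bob, carol
```

The real advantage of Redis here is that inserts and ranked reads stay fast (logarithmic) even with millions of members, which this array-based model obviously does not match.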
Monitoring and Maintenance
Check Redis memory usage:
redis-cli info memory
Monitor key operations:
redis-cli monitor
Clear all keys (use cautiously):
redis-cli flushall
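The stats section of INFO also reports keyspace_hits and keyspace_misses, from which you can compute your cache hit ratio. A small parser sketch over sample INFO-style output (the numbers below are made up for illustration):

```typescript
// Extract keyspace_hits / keyspace_misses from INFO output and
// compute the hit ratio. Redis separates INFO lines with CRLF.
const sampleInfo = [
  '# Stats',
  'total_connections_received:105',
  'keyspace_hits:9000',
  'keyspace_misses:1000',
].join('\r\n');

function hitRatio(info: string): number {
  const num = (field: string): number => {
    const m = info.match(new RegExp(`^${field}:(\\d+)`, 'm'));
    return m ? parseInt(m[1], 10) : 0;
  };
  const hits = num('keyspace_hits');
  const misses = num('keyspace_misses');
  return hits + misses === 0 ? 0 : hits / (hits + misses);
}

console.log(hitRatio(sampleInfo)); // 0.9
```

A persistently low ratio usually means TTLs are too short, keys are being evicted under memory pressure, or the access pattern simply isn't cacheable.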
Production Deployment with Docker
Dockerfile for Redis:
FROM redis:7-alpine

RUN apk add --no-cache \
    bash \
    curl

COPY redis.conf /usr/local/etc/redis/redis.conf

HEALTHCHECK --interval=5s --timeout=3s --start-period=5s --retries=5 \
  CMD redis-cli ping || exit 1

CMD ["redis-server", "/usr/local/etc/redis/redis.conf"]
Docker Compose:
services:
  redis:
    image: redis:7-alpine
    container_name: redis
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --requirepass ${REDIS_PASSWORD}
    restart: unless-stopped

volumes:
  redis_data:
Best Practices
- Set memory limits: Prevent unbounded growth with maxmemory
- Use TTL: Always set expiration times on cache keys
- Monitor memory: Watch for growth and adjust eviction policy
- Backup regularly: Enable AOF persistence for durability
- Use connection pooling: Reuse connections across requests
- Name keys consistently: Use prefixes like user:123, post:456
- Avoid large values: Keep cached objects reasonably sized
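The key-naming convention can be enforced with a small helper so every module builds keys the same way. cacheKey and its no-colon rule below are a suggested convention, not anything Redis itself requires:

```typescript
// Build namespaced cache keys like 'user:123' or 'users:list:page:2'.
// Rejecting ':' inside segments keeps patterns like 'user:*' unambiguous.
function cacheKey(...parts: Array<string | number>): string {
  if (parts.length === 0) throw new Error('cacheKey needs at least one part');
  for (const p of parts) {
    if (String(p).includes(':')) {
      throw new Error(`key segment must not contain ':' -> ${p}`);
    }
  }
  return parts.join(':');
}

console.log(cacheKey('user', 123));                // user:123
console.log(cacheKey('users', 'list', 'page', 2)); // users:list:page:2
```

Consistent prefixes also make targeted invalidation (e.g. deleting everything under user:*) safe and predictable.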
Conclusion
Redis transforms application performance through intelligent caching and session management. Start with simple caching patterns, then expand to sessions, job queues, and rate limiting as your needs grow.