
How I Reduced Database Load by 80% Using Redis Caching


🛑 The Problem — “Why Is the Database Server Always Busy?”

A client messaged me late at night:

“Our database CPU keeps spiking to 90–100%. The site works, but everything feels sluggish — especially during peak traffic.”

The tech stack was:

• Laravel backend
• Node.js microservice
• PostgreSQL database
• Redis already installed — but barely used

So the infrastructure was there… but the caching strategy wasn’t.

After checking metrics, I noticed something interesting:

❗ 70–80% of database traffic came from READ queries

Mostly:

• product lists
• dashboards
• user profile data

These weren’t changing every second — but they were being queried constantly.

So we had the perfect use-case for Redis.


🔍 Step 1 — Confirm the Issue With Logs

I enabled slow query logging and request profiling.

Some endpoints were called thousands of times:

```
GET /products
GET /users/profile
GET /dashboard/stats
```

And every single request hit the database.

Even when data hadn’t changed.

That’s like asking the same question over and over again… all day.

No wonder PostgreSQL was tired.
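If you want to reproduce this kind of profiling without extra tooling, a tiny in-memory counter in the Node.js service is enough to spot the hot routes. This is just a sketch (in a real setup you'd lean on slow-query logs or an APM), and `recordHit`/`topEndpoints` are names I'm inventing here:

```javascript
// Minimal endpoint-hit counter to find which routes dominate traffic.
const counts = new Map();

function recordHit(method, path) {
  const key = `${method} ${path}`;
  counts.set(key, (counts.get(key) || 0) + 1);
}

// Returns the n most-requested endpoints, busiest first.
function topEndpoints(n = 3) {
  return [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, n);
}
```

Wired into Express as `app.use((req, res, next) => { recordHit(req.method, req.path); next(); })`, a few minutes of traffic makes the read-heavy endpoints obvious.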


⚡ Step 2 — Decide What to Cache (This Part Matters)

Good caching is strategic, not random.

So I answered three questions:

1️⃣ Does the data change frequently?

→ No. Product list changes maybe a few times per day.

2️⃣ Does it get requested often?

→ Yes. Thousands of times per hour.

3️⃣ Is slightly stale data acceptable?

→ Yes. A 1–5 minute delay is fine.

Perfect candidate for Redis.


🛠 Step 3 — Implement Redis Cache in Laravel

Before caching, the controller looked like this:

```php
$products = Product::with('category')->get();

return response()->json($products);
```

Every call = full DB query.

So I wrapped it in Redis caching:

```php
use Illuminate\Support\Facades\Cache;

$products = Cache::remember('products_all', 300, function () {
    return Product::with('category')->get();
});

return response()->json($products);
```

Meaning:

✔ First request → query DB → store in Redis
✔ Next requests (for 5 minutes) → served instantly

The difference?

| Metric | Change |
| --- | --- |
| Database queries | 12,000/hour → 2,000/hour |
| Response time | 220ms → 40ms |
| CPU load | dropped by ~50% |

Users noticed. The site felt snappier.


🔁 Step 4 — Invalidate Cache When Needed

Caching means nothing if data becomes stale forever.

So when products update, we clear the cache.

Laravel example:

```php
Cache::forget('products_all');
```

I triggered this inside:

• Product update
• Product delete
• New product creation

That way:

✔ Cache is fresh when needed
✔ But database isn’t hammered constantly
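On the Node.js side the equivalent is `redis.del`. A small wrapper keeps the pattern consistent: run the write, then drop the affected keys so the next read rebuilds them. This is a sketch; `cache` only needs a `del(key)` method, so it works with an ioredis client or an in-memory stub:

```javascript
// Cache-aside invalidation: perform the write, then delete the cached keys
// it affects. The next read misses the cache and rebuilds from the database.
async function withInvalidation(cache, keys, writeFn) {
  const result = await writeFn();
  await Promise.all(keys.map((k) => cache.del(k)));
  return result;
}
```

With ioredis that looks like `withInvalidation(redis, ["dashboard_stats"], () => updateStatsInDb(payload))`, where `updateStatsInDb` stands in for whatever write you're doing.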


🟢 Step 5 — Using Redis in Node.js Too

The Node.js service generated dashboard statistics.

Originally it did:

```javascript
const stats = await getDashboardStatsFromDb();
res.json(stats);
```

So I added Redis using ioredis:

```javascript
const Redis = require("ioredis");
const redis = new Redis();

app.get("/dashboard", async (req, res) => {
  const cached = await redis.get("dashboard_stats");
  if (cached) {
    return res.json(JSON.parse(cached));
  }

  const stats = await getDashboardStatsFromDb();
  await redis.set("dashboard_stats", JSON.stringify(stats), "EX", 300);
  res.json(stats);
});
```

Now dashboards load instantly.


🧠 Step 6 — Avoid Common Redis Mistakes

I’ve seen developers misuse Redis and blame Redis later.

So here’s what we avoided:

❌ Do NOT cache sensitive personal data

Tokens, passwords, etc.

❌ Do NOT cache huge blobs

Redis keeps everything in RAM, so large values crowd out the keys that actually matter.

❌ Do NOT cache things that change every second

This only adds overhead.

❌ Do NOT forget expiry

Always set TTL.
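One habit that enforces the last rule automatically is funnelling every cache read through a helper that refuses to store anything without a TTL. A sketch, assuming an ioredis-style client (`get(key)` and `set(key, value, "EX", seconds)`), which also means an in-memory stub can stand in for tests:

```javascript
// Cache-aside read with a mandatory TTL, so nothing can be cached forever.
async function getOrSet(cache, key, ttlSeconds, loader) {
  if (!Number.isInteger(ttlSeconds) || ttlSeconds <= 0) {
    throw new Error("a positive TTL is required");
  }

  const cached = await cache.get(key);
  if (cached !== null && cached !== undefined) {
    return JSON.parse(cached); // cache hit: no database work
  }

  const fresh = await loader();
  await cache.set(key, JSON.stringify(fresh), "EX", ttlSeconds);
  return fresh;
}
```

Every endpoint then calls `getOrSet(redis, "products_all", 300, fetchProducts)` and forgetting the expiry becomes impossible rather than merely discouraged.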


🔐 Bonus — Redis as a Rate Limiter

We also used Redis to prevent abuse.

Example in Node.js:

```javascript
const key = `rate:${req.ip}`;
const count = await redis.incr(key);

if (count === 1) redis.expire(key, 60);
if (count > 100) return res.status(429).send("Too many requests");
```

Meaning:

✔ Max 100 requests per minute per IP
✔ Stops bots
✔ Protects backend
✔ Super lightweight

Redis is perfect for this use-case.
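The same fixed-window logic factors out into one small function, which makes it easy to unit-test and reuse across routes. A sketch, with one caveat: the separate INCR/EXPIRE pair can leave a key without a TTL if the process dies between the two calls, so under serious load you'd wrap this in a Lua script to make it atomic:

```javascript
// Fixed-window rate limit: allow up to `limit` hits per `windowSeconds`
// per key. `cache` needs incr(key) and expire(key, seconds), as in ioredis.
async function allowRequest(cache, key, limit, windowSeconds) {
  const count = await cache.incr(key);
  if (count === 1) {
    await cache.expire(key, windowSeconds); // first hit starts the window
  }
  return count <= limit;
}
```

In the route handler that becomes `if (!(await allowRequest(redis, `rate:${req.ip}`, 100, 60))) return res.status(429).send("Too many requests");`.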


🎯 Final Results (Real-World Impact)

After proper Redis integration:

🚀 Database load reduced by 80%
🚀 Average response time improved by 3–5x
🚀 CPU load on DB server dropped significantly
🚀 Fewer timeouts
🚀 Users reported faster experience

And the best part?

We changed very little code.

Caching is leverage.


📌 Key Takeaways

If you remember nothing else, remember this:

✅ Cache what is read often

✅ Cache what doesn’t change often

✅ Always set expiration

✅ Clear cache when updating

✅ Monitor performance

Redis isn’t magic, but when used properly…

…it feels like magic.
