May 22, 2025
Last updated: May 22, 2025

Scaling a Multi-Vendor E-Commerce Platform with Next.js

Case study on scaling a multi-vendor e-commerce platform to handle 50+ vendors and 10K+ products using Next.js ISR, edge caching, and a microservices backend with NestJS.

Tags

E-Commerce · Next.js · Scaling · Full Stack
10 min read


TL;DR

Incremental Static Regeneration combined with edge caching and a microservices backend let us scale a multi-vendor marketplace to thousands of products while keeping page loads fast. We used a polyglot database strategy — MySQL for transactional data, MongoDB for product catalogs — and built a NestJS backend with domain-separated services for vendors, inventory, orders, and payments.

The Challenge

The client was building a multi-vendor marketplace where independent sellers could list products, manage inventory, and fulfill orders through a shared storefront. Think of a regional Etsy — multiple vendors, each with their own branding and product catalog, selling through a unified shopping experience.

The technical requirements were demanding. Product pages needed to load fast for SEO and conversion. Inventory had to be accurate across vendors to prevent overselling. The checkout flow needed to split orders across multiple vendors, each potentially using a different fulfillment method. Payment splitting — where a single customer payment got distributed to multiple vendors minus platform fees — added financial complexity. And the platform needed to handle traffic spikes during promotional events without degrading performance.

The existing prototype was a Create React App frontend with a monolithic Express backend. Product pages were client-side rendered, which meant terrible SEO scores and slow initial loads. The single Express server handled everything from authentication to payment processing in one codebase. The database was a single MySQL instance with increasingly complex queries that joined vendor, product, order, and payment tables.

We needed to rebuild the frontend for performance, decompose the backend for scalability, and redesign the data layer for the multi-vendor domain.

The Architecture

Next.js Rendering Strategy

Not every page needed the same rendering strategy. We mapped each route type to the approach that best fit its access pattern and freshness requirements.

Product listing pages (category pages, search results) used ISR with on-demand revalidation. These pages changed when vendors added or updated products, which happened throughout the day but not in real time. ISR let us serve them as static HTML while revalidating in the background.

tsx
// app/products/[category]/page.tsx
// getTopCategories is assumed to live in the same API module
import { getProductsByCategory, getTopCategories } from '@/lib/api/products';
import { ProductGrid } from '@/components/ProductGrid';
 
export const revalidate = 300; // Revalidate every 5 minutes
 
export async function generateStaticParams() {
  const categories = await getTopCategories();
  return categories.map((cat) => ({ category: cat.slug }));
}
 
export default async function CategoryPage({
  params,
}: {
  params: { category: string };
}) {
  const products = await getProductsByCategory(params.category);
 
  return (
    <main>
      <h1>{products.categoryName}</h1>
      <ProductGrid products={products.items} />
    </main>
  );
}

Individual product pages used ISR with shorter revalidation plus client-side inventory checks. The static page showed product details, images, and description. A client component fetched real-time stock availability on mount:

tsx
// app/products/[category]/[slug]/page.tsx
import { getProduct } from '@/lib/api/products';
import { ProductDetail } from '@/components/ProductDetail';
import { InventoryChecker } from '@/components/InventoryChecker';
 
export const revalidate = 60; // More aggressive for product pages
 
export default async function ProductPage({
  params,
}: {
  // The route is [category]/[slug], so params carries both segments
  params: { category: string; slug: string };
}) {
  const product = await getProduct(params.slug);
 
  return (
    <main>
      <ProductDetail product={product} />
      <InventoryChecker
        productId={product.id}
        variants={product.variants}
      />
    </main>
  );
}
tsx
// components/InventoryChecker.tsx
'use client';
 
import { useEffect, useState } from 'react';
import { checkInventory } from '@/lib/api/inventory';
 
// Variant wasn't defined in the original; a minimal local shape:
type Variant = { id: string; name: string };
 
export function InventoryChecker({
  productId,
  variants,
}: {
  productId: string;
  variants: Variant[];
}) {
  const [stock, setStock] = useState<Record<string, number>>({});
 
  useEffect(() => {
    checkInventory(productId).then(setStock);
  }, [productId]);
 
  return (
    <div>
      {variants.map((v) => (
        <div key={v.id}>
          {v.name}: {stock[v.id] !== undefined
            ? stock[v.id] > 0
              ? `${stock[v.id]} in stock`
              : 'Out of stock'
            : 'Checking...'}
        </div>
      ))}
    </div>
  );
}
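The `checkInventory` helper used above isn't shown in the original. A minimal sketch, assuming a public `/api/inventory/:productId` endpoint; the endpoint path and the extracted `stockLabel` helper are illustrative assumptions, not code from the actual platform:

```typescript
// lib/api/inventory.ts — hypothetical client for the real-time stock check.
export async function checkInventory(
  productId: string,
): Promise<Record<string, number>> {
  // Bypass HTTP caching entirely: stock levels must always be fresh.
  const res = await fetch(`/api/inventory/${productId}`, { cache: 'no-store' });
  if (!res.ok) throw new Error(`Inventory check failed: ${res.status}`);
  return res.json();
}

// Pure helper mirroring the nested ternary in InventoryChecker, extracted so
// the display logic can be tested without rendering a component.
export function stockLabel(
  stock: Record<string, number>,
  variantId: string,
): string {
  const qty = stock[variantId];
  if (qty === undefined) return 'Checking...';
  return qty > 0 ? `${qty} in stock` : 'Out of stock';
}
```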

Cart and checkout pages were fully dynamic with SSR. These were personalized, contained sensitive data, and needed real-time accuracy. No caching.

The vendor dashboard was a client-side rendered section behind authentication. SEO didn't matter here, and the data was highly interactive (inventory management, order processing, analytics charts). We used React Query for data fetching and optimistic updates.
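A sketch of what the optimistic-update pattern looks like, with the cache merge extracted as a pure helper. The `InventoryRow` shape, query keys, and `updateStock` mutation are illustrative assumptions, not the dashboard's actual code:

```typescript
// Shape of a row in the cached inventory list (assumed for illustration).
interface InventoryRow {
  variantId: string;
  quantity: number;
}

// Apply an edit to the cached list before the server confirms it.
// Returns a new array so the cache update stays immutable.
export function applyStockUpdate(
  rows: InventoryRow[],
  variantId: string,
  quantity: number,
): InventoryRow[] {
  return rows.map((row) =>
    row.variantId === variantId ? { ...row, quantity } : row,
  );
}

// With React Query, the helper plugs into onMutate, and onError restores the
// snapshot taken before the optimistic write:
//
// useMutation({
//   mutationFn: updateStock,
//   onMutate: async ({ variantId, quantity }) => {
//     await queryClient.cancelQueries({ queryKey: ['inventory'] });
//     const previous = queryClient.getQueryData<InventoryRow[]>(['inventory']);
//     queryClient.setQueryData<InventoryRow[]>(['inventory'], (rows = []) =>
//       applyStockUpdate(rows, variantId, quantity),
//     );
//     return { previous };
//   },
//   onError: (_err, _vars, ctx) =>
//     queryClient.setQueryData(['inventory'], ctx?.previous),
// });
```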

Polyglot Database Design

A single database couldn't efficiently serve both the transactional and catalog needs of a multi-vendor marketplace. We split across two database engines.

MySQL handled transactional data: orders, payments, vendor accounts, user accounts, and inventory counts. These required ACID transactions, especially during checkout when we needed to atomically decrement inventory, create an order, and record payment information.

MongoDB stored the product catalog. Product documents varied significantly across vendors and categories. A clothing vendor's product had size and color attributes. An electronics vendor had specifications like voltage and wattage. MongoDB's flexible schema handled this naturally without requiring an ever-growing ALTER TABLE migration.

ts
// services/product-catalog/src/schemas/product.schema.ts
import { Prop, Schema, SchemaFactory } from '@nestjs/mongoose';
import { Document } from 'mongoose';
 
@Schema({ timestamps: true, collection: 'products' })
export class Product extends Document {
  @Prop({ required: true, index: true })
  vendorId: string;
 
  @Prop({ required: true })
  title: string;
 
  @Prop({ required: true })
  slug: string;
 
  @Prop({ required: true })
  categoryId: string;
 
  @Prop({ type: Object })
  description: {
    short: string;
    full: string;
    html: string;
  };
 
  @Prop({ type: [Object] })
  images: {
    url: string;
    alt: string;
    position: number;
  }[];
 
  @Prop({ type: [Object] })
  variants: {
    id: string;
    sku: string;
    name: string;
    price: number;
    compareAtPrice?: number;
    attributes: Record<string, string>;
  }[];
 
  @Prop({ type: Object })
  attributes: Record<string, unknown>; // Flexible per-category attributes
 
  @Prop({ type: Object })
  seo: {
    title: string;
    description: string;
    keywords: string[];
  };
 
  @Prop({ default: true })
  isActive: boolean;
}
 
export const ProductSchema = SchemaFactory.createForClass(Product);
ProductSchema.index({ slug: 1 }, { unique: true });
ProductSchema.index({ vendorId: 1, isActive: 1 });
ProductSchema.index({ categoryId: 1, isActive: 1 });
ProductSchema.index({ '$**': 'text' }); // Full-text search

The inventory count lived in MySQL, not MongoDB, even though it was conceptually related to products. This was deliberate — inventory changes needed transactional guarantees (decrement-and-check atomicity) that MongoDB's document-level locking didn't provide as cleanly as MySQL row-level locks.

Order Splitting and Payment Distribution

A single cart could contain products from multiple vendors. At checkout, we split the cart into sub-orders per vendor and processed payment as a single charge to the customer, then distributed funds.

ts
// services/order/src/order.service.ts
import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository, DataSource } from 'typeorm';
import { Order } from './entities/order.entity';
import { SubOrder } from './entities/sub-order.entity';
// CartDTO and CartItem weren't shown in the original; assumed DTO module:
import { CartDTO, CartItem } from './dto/cart.dto';
 
@Injectable()
export class OrderService {
  constructor(
    private dataSource: DataSource,
    @InjectRepository(Order)
    private orderRepo: Repository<Order>,
  ) {}
 
  async createOrder(cart: CartDTO, customerId: string): Promise<Order> {
    const queryRunner = this.dataSource.createQueryRunner();
    await queryRunner.connect();
    await queryRunner.startTransaction();
 
    try {
      // Group cart items by vendor
      const vendorGroups = this.groupByVendor(cart.items);
 
      // Create parent order
      const order = queryRunner.manager.create(Order, {
        customerId,
        status: 'pending',
        totalAmount: cart.totalAmount,
        currency: cart.currency,
      });
      await queryRunner.manager.save(order);
 
      // Create sub-orders per vendor
      for (const [vendorId, items] of Object.entries(vendorGroups)) {
        const subtotal = items.reduce((sum, item) => sum + item.price * item.quantity, 0);
        const platformFee = subtotal * 0.12; // 12% platform commission
 
        const subOrder = queryRunner.manager.create(SubOrder, {
          orderId: order.id,
          vendorId,
          items: JSON.stringify(items),
          subtotal,
          platformFee,
          vendorPayout: subtotal - platformFee,
          status: 'pending',
        });
        await queryRunner.manager.save(subOrder);
 
        // Decrement inventory atomically
        for (const item of items) {
          const result = await queryRunner.query(
            `UPDATE inventory
             SET quantity = quantity - ?
             WHERE variant_id = ? AND quantity >= ?`,
            [item.quantity, item.variantId, item.quantity]
          );
 
          if (result.affectedRows === 0) {
            throw new Error(`Insufficient stock for variant ${item.variantId}`);
          }
        }
      }
 
      await queryRunner.commitTransaction();
      return order;
    } catch (error) {
      await queryRunner.rollbackTransaction();
      throw error;
    } finally {
      await queryRunner.release();
    }
  }
 
  private groupByVendor(items: CartItem[]): Record<string, CartItem[]> {
    return items.reduce((groups, item) => {
      const group = groups[item.vendorId] || [];
      group.push(item);
      groups[item.vendorId] = group;
      return groups;
    }, {} as Record<string, CartItem[]>);
  }
}

The inventory decrement used a conditional UPDATE rather than a SELECT-then-UPDATE pattern. The WHERE quantity >= ? clause acted as an optimistic lock — if concurrent checkouts raced for the same product, one would succeed and the other would get affectedRows === 0, triggering a rollback. No separate locking mechanism needed.

Caching Strategy

We layered caching at multiple levels to handle traffic spikes without hitting the database for every request.

CDN edge caching for ISR pages. Next.js handled this natively — ISR pages were served from the CDN edge with stale-while-revalidate semantics.

Redis application cache for hot data. Product details, category trees, and vendor profiles were cached in Redis with TTLs matched to their ISR revalidation intervals. API routes checked Redis first, fell back to the database, and populated the cache on miss.

ts
// lib/cache/productCache.ts
import { Redis } from 'ioredis';
// fetchProductFromDB and Product aren't shown in the original; assumed modules:
import { fetchProductFromDB } from '@/lib/db/products';
import type { Product } from '@/lib/types';
 
const redis = new Redis(process.env.REDIS_URL!);
const PRODUCT_TTL = 60; // Match ISR revalidation
 
export async function getCachedProduct(slug: string): Promise<Product | null> {
  const cached = await redis.get(`product:${slug}`);
  if (cached) return JSON.parse(cached);
 
  const product = await fetchProductFromDB(slug);
  if (product) {
    await redis.setex(`product:${slug}`, PRODUCT_TTL, JSON.stringify(product));
  }
  return product;
}
 
export async function invalidateProduct(slug: string): Promise<void> {
  await redis.del(`product:${slug}`);
  // Trigger ISR on-demand revalidation
  await fetch(`${process.env.NEXT_URL}/api/revalidate?path=/products/${slug}`, {
    headers: { 'x-revalidate-token': process.env.REVALIDATE_SECRET! },
  });
}
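The `/api/revalidate` endpoint that `invalidateProduct` calls isn't shown above. A hedged sketch of what such a route handler might look like — the header name and env vars match the cache code, the token check is pulled out as a pure helper, and everything else is an assumption:

```typescript
// app/api/revalidate/route.ts (sketch) — pure helper for the token check.
export function isAuthorized(token: string | null, secret: string): boolean {
  // A production endpoint should use a timing-safe comparison;
  // plain equality keeps the sketch short.
  return token !== null && token === secret;
}

// In the actual route file, the helper wires into a Next.js route handler:
//
// import { revalidatePath } from 'next/cache';
// import { NextRequest, NextResponse } from 'next/server';
//
// export async function GET(req: NextRequest) {
//   const token = req.headers.get('x-revalidate-token');
//   if (!isAuthorized(token, process.env.REVALIDATE_SECRET!)) {
//     return NextResponse.json({ error: 'unauthorized' }, { status: 401 });
//   }
//   const path = req.nextUrl.searchParams.get('path');
//   if (!path?.startsWith('/products/')) {
//     return NextResponse.json({ error: 'invalid path' }, { status: 400 });
//   }
//   revalidatePath(path); // regenerate the ISR page on next request
//   return NextResponse.json({ revalidated: true });
// }
```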

No caching for inventory checks, cart operations, and checkout. These hit the database directly because stale data here meant overselling or incorrect prices.

Key Decisions & Trade-offs

ISR over full SSR. Full SSR would have given us perfectly fresh data on every request but at the cost of server load and response time. With thousands of product pages, SSR would have required significant server capacity during traffic spikes. ISR gave us CDN-speed delivery with acceptable staleness (60 seconds for product pages, 5 minutes for category pages). The client-side inventory check covered the one piece of data that couldn't tolerate staleness.

MySQL + MongoDB over PostgreSQL with JSONB. PostgreSQL's JSONB column type could have handled the flexible product attributes without needing a second database. We chose the polyglot approach because the query patterns were fundamentally different. Product catalog queries were mostly reads with flexible filtering — MongoDB's aggregation pipeline handled this naturally. Transactional operations needed joins, foreign keys, and multi-table transactions — MySQL's strengths. Running both added operational overhead, but each database operated in its sweet spot.

Sub-orders over single orders. Splitting a purchase into sub-orders per vendor added complexity to the order model but simplified everything downstream. Each vendor saw only their sub-orders in the dashboard. Fulfillment tracking was per sub-order, so one vendor's shipping delay didn't block another vendor's shipment. Payment distribution mapped cleanly to sub-orders. The alternative — a single order with multi-vendor items — would have required complex partitioning logic in every downstream system.

Platform-managed payments over direct vendor payments. We collected payment as the platform and distributed to vendors, rather than using Stripe Connect's direct charge model. This gave the platform more control over refunds, disputes, and vendor payouts. The trade-off was that the platform held funds temporarily, which required compliance considerations and clear payout schedules.

Optimistic inventory locking over pessimistic locks. Pessimistic locks (SELECT FOR UPDATE) would have prevented overselling more aggressively but created lock contention during high-traffic events. Our conditional UPDATE approach was lock-free — concurrent writes didn't block each other. The worst case was a failed checkout that the user could retry, which was a better UX than a checkout that hung waiting for a lock to release.

Results & Outcomes

Page load performance improved dramatically compared to the original CRA-based prototype. Product pages served from the CDN edge loaded in a fraction of the time the client-rendered version took. Core Web Vitals scores moved from poor to good across the board, which directly impacted search ranking.

The vendor onboarding experience improved because the flexible product schema didn't force vendors into rigid attribute structures. A jewelry vendor could add "material" and "gemstone" attributes while a food vendor added "allergens" and "shelf life." This flexibility reduced support requests from vendors struggling to list their products.

The platform handled promotional traffic spikes without degradation. During a launch event with several vendors running simultaneous promotions, the CDN absorbed the read traffic while the backend only handled inventory checks and checkouts. The database never became a bottleneck because the hot path (browsing products) was served from cache.

Order accuracy improved significantly. The atomic inventory decrement eliminated overselling incidents that had plagued the prototype. When stock ran out, the checkout flow caught it immediately with a clear error rather than accepting the order and failing during fulfillment.

The vendor dashboard gave sellers real-time visibility into their sales, inventory, and payouts. Several vendors reported that the dashboard alone saved them hours per week compared to the manual tracking they did before joining the platform.

What I'd Do Differently

I'd use Stripe Connect from the start rather than building custom payment distribution. Our platform-managed payment flow worked but required significant custom logic for handling refunds, partial refunds, and vendor payout schedules. Stripe Connect's built-in features for marketplace payments would have eliminated months of payment-related development and reduced our PCI compliance scope.

I'd implement event-driven inventory updates using a message queue (like RabbitMQ or AWS SQS) between the product catalog and the inventory service. Our synchronous approach worked but created tight coupling between services. An event-driven model would have made the system more resilient and easier to extend with features like low-stock notifications, auto-reorder triggers, and inventory analytics.
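A sketch of what that event-driven flow could look like. The event name, fields, and RabbitMQ wiring are hypothetical, not something the platform shipped:

```typescript
// Shape of the event the order service would publish after committing its
// inventory decrement (illustrative assumption).
export interface StockDecrementedEvent {
  type: 'stock.decremented';
  variantId: string;
  quantity: number;  // units removed by this order
  remaining: number; // stock left after the decrement
  occurredAt: string;
}

export function buildStockEvent(
  variantId: string,
  quantity: number,
  remaining: number,
  now: Date = new Date(),
): StockDecrementedEvent {
  return {
    type: 'stock.decremented',
    variantId,
    quantity,
    remaining,
    occurredAt: now.toISOString(),
  };
}

// With amqplib, the order service publishes after its transaction commits,
// and consumers (low-stock alerts, analytics, auto-reorder) subscribe:
//
// const conn = await amqp.connect(process.env.RABBITMQ_URL!);
// const ch = await conn.createChannel();
// await ch.assertExchange('inventory', 'topic', { durable: true });
// ch.publish(
//   'inventory',
//   event.type,
//   Buffer.from(JSON.stringify(event)),
//   { persistent: true },
// );
```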

I'd also evaluate edge-side personalization using middleware. Our category pages were the same for everyone, but adding personalized product recommendations or region-specific pricing at the edge (using Next.js Middleware) could have improved conversion without sacrificing the caching benefits of ISR.

FAQ

How does ISR help with e-commerce at scale?

ISR regenerates product pages in the background on a time-based or on-demand schedule. This means pages are served as static HTML for fast loads, but product data like prices and stock levels stay fresh without requiring full rebuilds of thousands of pages. In our marketplace, product pages revalidated every 60 seconds and category pages every 5 minutes. When a vendor updated a product through the dashboard, we also triggered on-demand revalidation so the change appeared quickly. The key insight is that most product data — descriptions, images, specifications — changes infrequently, so serving slightly stale static pages is acceptable. For the one piece of data that must be real-time (inventory), we layered a client-side check on top of the static page. This hybrid approach gave us CDN-speed page loads for thousands of product pages without a powerful server fleet.

How do you handle inventory across multiple vendors?

Each vendor manages their own inventory through a vendor portal. A centralized inventory service aggregates stock levels and uses an optimistic concurrency check to prevent overselling during concurrent purchases across the shared marketplace. Specifically, inventory lived in MySQL with a conditional UPDATE query: UPDATE inventory SET quantity = quantity - ? WHERE variant_id = ? AND quantity >= ?. The WHERE quantity >= ? clause ensured that if two customers bought the last unit simultaneously, one checkout succeeded and the other failed gracefully with an "out of stock" message. No row-level locks were held, so concurrent checkouts for different products weren't blocked. Vendors saw real-time inventory in their dashboard and received low-stock alerts. The inventory service also exposed a public API endpoint that the Next.js frontend called client-side to show real-time stock availability on product pages, ensuring customers always saw accurate availability even when the ISR-rendered page content was a few seconds old.

What database architecture works for multi-vendor e-commerce?

We used a polyglot approach: MySQL for transactional data (orders, payments, inventory, accounts) and MongoDB for the product catalog. MySQL provided ACID transactions essential for checkout flows where inventory decrements, order creation, and payment records needed to succeed or fail atomically. MongoDB's flexible document model handled the varying product structures across vendor categories without schema migrations — a clothing vendor's product document had different attributes than an electronics vendor's, and both coexisted naturally. Redis served as the caching layer for hot product data and session management. Each vendor's data was logically separated using vendor IDs rather than physical database separation, keeping operational overhead manageable while maintaining data isolation through application-level access controls and query scoping. This architecture let each database engine handle its strengths: MySQL for consistency-critical writes, MongoDB for flexible reads, and Redis for speed.


Article Author

Sadam Hussain

Senior Full Stack Developer

Senior Full Stack Developer with over 7 years of experience building React, Next.js, Node.js, TypeScript, and AI-powered web platforms.
