Rebuilding my Auction Website - Some Node.js questions regarding WebSockets

I have an auction website that works great with PHP and a scheduler. The scheduler, I guess, works much the same as polling.
This has served me well and is still doing the job. BUT I want to level up and provide a better platform.

My question would be about WebSockets and Node.js.
Will WebSockets handle 300+ bidders (it will hardly ever get over 500) in real-time, all at the same time, with ease on one server?
The data (payloads) it needs to update would be very small, e.g. the bidding price and a few other things, maybe 1 - 12 KB.
Do you require load balancers at a certain number of people?

What would the best approach be with WebSockets? (*my description might not make 100% sense)

  • Create rooms - so per auction ID - isolate data per auction and avoid broadcasting to everyone unnecessarily
  • Do I need to use Redis for future scaling?
  • Because if I have 5000 items from previous auctions, I don't want to "query" all of those, only the data for that auction ID
  • So would one be able to "group" the WebSocket data?
  • I'm just looking at this Realtime Data Update with NodeJS and WebSockets (but I only see emit - are there more functions?)
  • But refreshing all of the data could then cause a bottleneck?
  • And then only emit to that room? Does Wappler support rooms, or does one need to use a different approach with an auction?

Please can anyone help me with some sort of plan on how to approach this, as I have no knowledge of WebSockets...

As you @Teodor are the expert on this, I would really value your input in this matter - your head works in a way I can only dream of. Thank you so much, and I'm looking forward to everybody's input.

As no one has answered yet, and not having the knowledge myself, I decided to ask Copilot, which gave this answer:

You're thinking in the right direction! WebSockets with Node.js can absolutely handle 300–500 bidders in real-time, especially since your payloads are small (1–12KB). However, there are a few considerations to ensure smooth performance:

Key Considerations for WebSockets in Auctions

  1. Single Server vs. Load Balancing

    • A single server can handle hundreds of connections if optimized properly.
    • If you scale beyond 500+ concurrent bidders, a load balancer (like Nginx or HAProxy) may be needed to distribute connections.
  2. Using WebSocket Rooms for Auctions

    • Creating rooms per auction ID is a great approach!
    • This prevents unnecessary broadcasting and ensures only relevant bidders receive updates.
    • Libraries like Socket.io support rooms natively.
  3. Redis for Scaling & Efficient Querying

    • Redis is useful for storing active auction data and reducing database queries.
    • Instead of querying all 5000 past auctions, store only active auctions in Redis.
    • You can use Redis Pub/Sub to sync WebSocket messages across multiple servers (see the sketch after this list).
  4. Avoiding Bottlenecks in Data Refreshing

    • Instead of refreshing all auction data, only emit updates for the relevant auction room.
    • Use event-driven updates rather than polling.
  5. Does Wappler Support WebSocket Rooms?

    • Wappler supports WebSockets, but room-based grouping may require manual implementation.
    • You might need Socket.io or a custom WebSocket server for better control.
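
For illustration, a minimal sketch of points 3 and 4 together, assuming the official @socket.io/redis-adapter and node-redis packages (npm install redis @socket.io/redis-adapter); the key names are just placeholders:

// Sketch only: wires a Socket.io server to Redis for Pub/Sub and caching.
const { createClient } = require('redis');
const { createAdapter } = require('@socket.io/redis-adapter');

async function attachRedis(io) {
    const pubClient = createClient({ url: 'redis://localhost:6379' });
    const subClient = pubClient.duplicate();
    await Promise.all([pubClient.connect(), subClient.connect()]);

    // Pub/Sub adapter: every Node instance behind a load balancer sees the
    // same room events, so a bid reaches bidders connected to any server.
    io.adapter(createAdapter(pubClient, subClient));

    // Keep only *active* auction state in Redis so a bid never has to
    // touch the thousands of historic auction rows in the database.
    async function cacheCurrentBid(auctionId, bidAmount) {
        await pubClient.hSet(`auction:${auctionId}`, 'currentBid', String(bidAmount));
    }

    return { cacheCurrentBid };
}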

Recommended Approach

  • Use Node.js + WebSockets (Socket.io) for real-time bidding.
  • Implement auction rooms to isolate data per auction.
  • Store active auctions in Redis for fast access.
  • Use event-driven updates instead of polling.
  • Consider load balancing if scaling beyond 500 bidders.

Would you like a sample WebSocket implementation to get started? :rocket:

Yes!

Great! Here’s a basic WebSocket implementation for an auction system using Node.js and Socket.io:

1. Install Dependencies

Run this in your terminal:

npm install express socket.io

2. Create a WebSocket Server

const express = require('express');
const http = require('http');
const socketIo = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = socketIo(server);

io.on('connection', (socket) => {
    console.log('A user connected');

    // Each auction gets its own room, keyed by auction ID
    socket.on('joinAuction', (auctionId) => {
        socket.join(auctionId);
        console.log(`User joined auction room: ${auctionId}`);
    });

    // Broadcast a new bid only to the bidders in that auction's room
    socket.on('placeBid', ({ auctionId, bidAmount }) => {
        io.to(auctionId).emit('newBid', { bidAmount });
        console.log(`New bid in auction ${auctionId}: $${bidAmount}`);
    });

    socket.on('disconnect', () => {
        console.log('User disconnected');
    });
});

server.listen(3000, () => {
    console.log('WebSocket server running on port 3000');
});

3. Client-Side WebSocket Connection

const socket = io('http://localhost:3000');

socket.emit('joinAuction', 'auction123'); // Join a specific auction room

socket.on('newBid', (data) => {
    console.log(`New bid received: $${data.bidAmount}`);
});

function placeBid(amount) {
    socket.emit('placeBid', { auctionId: 'auction123', bidAmount: amount });
}

Key Features

:white_check_mark: Rooms per auction to isolate bidding data
:white_check_mark: Real-time bid updates only for relevant users
:white_check_mark: Efficient broadcasting using io.to(room).emit()

Would you like to integrate Redis for scaling or explore load balancing strategies? :rocket:
You can also check out this GitHub repo for more WebSocket examples!


Thanks for that Ben. I would like to stay away from "manually" coding anything, especially the "implement auction rooms to isolate data per auction" part, and I'm hoping one can do this directly in Wappler's interface/UI.

But at least I guess it answers 99% of the questions :slight_smile:

I'm betting Wappler's implementation of WebSockets is somewhat different from the AI suggestions above. I haven't worked with Wappler WebSockets, but from what I've read I get the feeling the WebSockets exist just to notify of updates, and then you run a Server Connect to fetch the changes. No idea about the per-room notifications.
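
If that's the pattern, the browser-side logic would conceptually look something like this plain Socket.io sketch (Wappler wires this up through its UI, so the event name and the helpers here are just assumptions):

// Illustrative only: the socket message carries no bid data itself,
// it just signals "something changed" for a given auction.
socket.on('auctionUpdated', (msg) => {
    if (msg.auctionId === currentAuctionId) {   // hypothetical current-auction variable
        refreshBids();                          // stands in for re-running a Server Connect query
    }
});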


I have an app in mind myself, but I've been concerned about how to implement it, and time has kept me away from this. Here's a little something that might help:

https://docs.wappler.io/t/websockets-realtime-messaging/30847

Searching https://docs.wappler.io/ for the keyword "sockets" brings up 4 articles that should give you a solid start.


Hi,
We use sockets at a scale above your headline requirements without issue.
It can support everything you've outlined in your requirements without effort. And the socket payload can be tailored and extensive - and room/auction based without any issues.

Given your numbers, I'd recommend including Valkey (a Redis replacement) as your session/cache. It will save your DB and improve performance.
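
For illustration only, a minimal sketch of what that cache layer could look like from Node, assuming the standard node-redis client (Valkey speaks the Redis protocol, so the same client works; the host and key names are placeholders):

// Read-through cache in front of the database: the DB is only hit on a cache miss.
const { createClient } = require('redis');
const cache = createClient({ url: 'redis://valkey-host:6379' });
cache.connect().catch(console.error);

async function getAuction(auctionId, loadFromDb) {
    const key = `auction:${auctionId}`;
    const cached = await cache.get(key);
    if (cached) return JSON.parse(cached);

    const auction = await loadFromDb(auctionId);                 // hypothetical DB lookup
    await cache.set(key, JSON.stringify(auction), { EX: 30 });   // 30-second TTL
    return auction;
}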

For a basic and low-cost setup, I'd suggest a basic load balancer with a couple of app servers (2x $20 droplets). The total cost of a fully managed service from DigitalOcean is circa $100+ pcm, and it saves you from thinking about the ops issues too much.

John


Meanwhile in Teodor's head:
(Homer Simpson thinking GIF)


Mmm, sounds interesting... and thanks for "confirming" your use case and numbers... it's great to know. I'll have to look into Valkey, but it might be above my pay grade... but I'll persevere...

I will also not be able to go with DigitalOcean :frowning: :frowning: the latency to South Africa is madness... Our only "option" here for proper scaling is AWS... eish...

Thanks for that reply... I will just have to build out small modules for testing, and then, in theory, if they work, I can scale it out... that is according to my head... hahaha

Just one more question @scalaris... did you have to do lots of custom coding, or could you do most of the WebSocket work via Wappler's tools?

You can do this without any coding involved!

There isn't much information available, but what there is should point you in the right direction! :slight_smile:

As mentioned by @Apple, you simply send a message about the update and, on success, trigger the WebSocket update! This way, only users within the scope receive the message and get an update.

In terms of servers, I am not familiar with datacenter providers in South Africa, but you can always try Coolify. I was looking to make a guide on how to launch your application via a simple CI/CD pipeline from a Gitea or GitHub repo, or from an image.

Also, you can launch Dragonfly, a Redis alternative that is much faster, with a single click of a button!

You can also launch your own Docker Swarm in the server settings by simply going through their tutorial!

Also, please bear in mind that if you're diving into the world of Node with multiple instances behind a load balancer, and you have any scheduled cron actions within Wappler, they WILL trigger in each of the instances.

So if you have, let's say, 3 servers running, each of them exactly the same, simply under the same load balancer, and you have a scheduled action that runs every day at 2 PM announcing today's auctions to your subscribers, the same email will be sent 3 times. The easiest way to handle this, if for example you used Coolify, is to add an environment variable that specifies which server is the scheduler node and apply a condition to your scheduled action, so only that particular server triggers your scheduled actions.

I guess it's not a Node.js-specific issue, as the exact same thing could happen with any application running cron jobs, but it's just an observation for when you're running multiple servers under a load balancer! :slight_smile:
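
In plain Node terms that guard is just an environment check - a rough sketch, assuming node-cron and a made-up variable name (Wappler's scheduler condition expresses the same idea):

const cron = require('node-cron');

// Only the instance started with IS_SCHEDULER_NODE=true runs the daily job;
// the other instances behind the load balancer skip it entirely.
if (process.env.IS_SCHEDULER_NODE === 'true') {
    cron.schedule('0 14 * * *', () => {       // every day at 2 PM
        announceTodaysAuctions();              // hypothetical helper that emails subscribers
    });
}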


@Evaldas do you use Coolify for Wappler? Wappler makes it really hard :laughing:

P.S.: @Mozzi you don't have to spend time learning about Valkey or Dragonfly, you're more than welcome to stick with Redis... Those decisions won't have an impact at 500 concurrent users...


Haha, indeed I do! Wait, by saying that Wappler makes it very hard, do you mean that it has all the tools out of the box? :smiley: Don't get me wrong, Wappler provides all the tools you need to publish it yourself, but once you handle the database migrations yourself, you simply forget about the database manager. It's great, but lacking in features. Once you automate your deployments, there's no need for single-instance deployments either :man_shrugging: :smiley:

I'll continue my Coolify talk with @Evaldas in private so as not to derail this topic. Eventually, once the stuff is ironed out, a topic could be created.

Edit: It was a short talk:

How do you deploy in Coolify? Git? And how do you configure the DB credentials? Do you get them from environment variables?

Yup, a simple Git deployment. You can also deploy automatically via a registry! In Coolify, you can simply set your variables per environment, so it's super easy to separate dev and prod databases, etc.
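
For reference, the application side of that is just reading those variables at startup - a minimal sketch assuming Knex with Postgres (the variable names are whatever you define per environment in Coolify):

// Same image, different environment: the platform injects DB_HOST, DB_USER, etc.
// per environment, so dev and prod point at different databases with no code change.
const db = require('knex')({
    client: 'pg',
    connection: {
        host: process.env.DB_HOST,
        user: process.env.DB_USER,
        password: process.env.DB_PASSWORD,
        database: process.env.DB_NAME,
    },
});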