
The year 2025 was special for us at Mantra Ideas. The long-awaited Nepal Premier League Auction was finally happening, and we were given the responsibility to build the real-time auction system that thousands of sports enthusiasts, franchise owners, and players would be watching closely.

At first glance, it sounds straightforward — build a system to handle bids, update prices, and show information live. But in reality, it was a tightrope walk of technology and precision, under the watchful eyes of managers, broadcasters, and fans. This was not just a website project; it was a real-time broadcast application, where every millisecond mattered.

We built everything on ReactJS, ExpressJS, WebSockets, and MySQL, designed unique device-specific dashboards, ran the entire system locally (not in the cloud) to prevent delays, and even powered the footer ticker display during live TV broadcasts through port forwarding.

Let’s go behind the scenes.

Why MySQL and Not MongoDB

At the start, we considered MongoDB for its flexibility, but after careful thought, MySQL became our final choice.

Why? Because:

  1. Relational Dependence: Player information, purse values, and bidding histories all had strong relational dependencies. For example, linking a player’s bid history directly to team purse updates worked elegantly with SQL tables.
  2. Transaction Safety: In an auction, data integrity is paramount. If two teams placed a bid at the same time, we couldn’t afford inconsistent writes. MySQL’s transactions gave us guaranteed consistency (see the sketch below).
  3. Performance: Since we were running the entire system locally on secured, power-optimized machines, MySQL’s lightweight deployment fit our requirements perfectly.

Instead of building for global scalability, we built for speed, stability, and zero lag in a closed environment.
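
To make the transaction-safety point concrete, here is a minimal sketch of what an atomic bid-and-purse write can look like with Node’s mysql2 driver. The table names, columns, and credentials are illustrative placeholders, not our production schema.

```typescript
// Minimal sketch of a transactional bid write with mysql2/promise.
// Table and column names are illustrative, not the production schema.
import mysql from "mysql2/promise";

const pool = mysql.createPool({
  host: "localhost",
  user: "auction",
  password: "secret",
  database: "npl_auction",
});

export async function recordWinningBid(teamId: number, playerId: number, amount: number) {
  const conn = await pool.getConnection();
  try {
    await conn.beginTransaction();

    // Lock the team's purse row so two simultaneous writes can't both read a stale balance.
    const [rows] = await conn.execute(
      "SELECT purse_remaining FROM teams WHERE id = ? FOR UPDATE",
      [teamId]
    );
    const purse = (rows as any[])[0].purse_remaining;
    if (purse < amount) throw new Error("Insufficient purse");

    // Record the bid and deduct the purse in the same transaction.
    await conn.execute(
      "INSERT INTO bids (player_id, team_id, amount) VALUES (?, ?, ?)",
      [playerId, teamId, amount]
    );
    await conn.execute(
      "UPDATE teams SET purse_remaining = purse_remaining - ? WHERE id = ?",
      [amount, teamId]
    );

    await conn.commit();
  } catch (err) {
    await conn.rollback();
    throw err;
  } finally {
    conn.release();
  }
}
```

The row lock taken by `SELECT ... FOR UPDATE` is what forces two simultaneous writes to queue up instead of both reading the same stale purse balance.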

Why Everything Was Done Locally

Unlike typical SaaS apps, our platform was never deployed to the cloud. Instead, we ran it completely on local servers inside the event venue.

  • This decision was made to minimize latency. Every time a team pressed the “Bid” button, we wanted the response to be instant and visible across every screen, without depending on external internet routing.
  • Running locally also gave us full control if internet connectivity wavered — which is not uncommon in large halls and event centers in Nepal.
  • Backups and scripts were run on-location to ensure nothing depended on cloud sync.

It was almost like hosting a private, closed network “mini internet” dedicated just for the Nepal Premier League Auction.
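
As a rough illustration of the kind of on-location backup script we ran, the sketch below dumps the database to a timestamped file on a timer. The paths, credentials, database name, and interval are placeholders, not our actual setup.

```typescript
// Rough sketch of an on-location backup loop: dump the auction database
// to a timestamped file every few minutes. Paths, credentials, and the
// interval are placeholders.
import { execFile } from "node:child_process";
import { mkdirSync } from "node:fs";

const BACKUP_DIR = "/backups/npl-auction";
mkdirSync(BACKUP_DIR, { recursive: true });

function backupOnce(): void {
  const stamp = new Date().toISOString().replace(/[:.]/g, "-");
  const outFile = `${BACKUP_DIR}/auction-${stamp}.sql`;
  execFile(
    "mysqldump",
    ["--user=auction", "--password=secret", "npl_auction", `--result-file=${outFile}`],
    (err) => {
      if (err) console.error("Backup failed:", err);
      else console.log("Backup written to", outFile);
    }
  );
}

// Run immediately, then every five minutes.
backupOnce();
setInterval(backupOnce, 5 * 60 * 1000);
```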

The Multi-Layered Interfaces

Admin Dashboard (Control Room):
This was our command center. From here, organizers could:

  • Push out player details (profiles, base price, stats).
  • Track and adjust purse sizes.
  • Start and stop auction cycles.
  • Trigger “SOLD” announcements.
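
To sketch how an admin action flowed through the system, here is an illustrative Express route for the “SOLD” trigger: persist the result, then fan the event out over WebSockets to every connected screen. The route path, payload shape, and ports are assumptions for the example, not our exact API.

```typescript
// Illustrative Express route for the admin "SOLD" action: persist the result,
// then notify every connected screen. Names and payload shape are assumptions.
import express from "express";
import { WebSocketServer, WebSocket } from "ws";

const app = express();
app.use(express.json());

const wss = new WebSocketServer({ port: 8081 });

// Push a JSON message to every connected dashboard, tablet, and display.
function broadcast(message: object): void {
  const data = JSON.stringify(message);
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(data);
  }
}

app.post("/admin/sold", (req, res) => {
  const { playerId, teamId, finalPrice } = req.body;
  // (Database write omitted here; see the transaction sketch above.)
  broadcast({ type: "SOLD", playerId, teamId, finalPrice });
  res.json({ ok: true });
});

app.listen(8080, () => console.log("Admin API on :8080, WebSocket on :8081"));
```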

Franchise Tablets/iPads:
Each team got a tablet that showed:

  • Live player data as soon as the admin released it.
  • Their team budget (purse) shrinking in real-time when bids were placed.
  • Competitor teams’ spending, for strategic context.
  • A bid placement interface synced to the WebSocket server.
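
On the tablet side, the bid placement interface boils down to a WebSocket client along these lines; the message types and server address below are illustrative, not the real protocol.

```typescript
// Illustrative franchise-tablet client: listen for live updates and send bids.
// Message types and the server address are assumptions for this sketch.
const socket = new WebSocket("ws://auction-server.local:8081");

socket.addEventListener("message", (event) => {
  const msg = JSON.parse(event.data);
  // In the real app this fed React state; here we simply log the update.
  if (msg.type === "PLAYER_RELEASED" || msg.type === "BID_UPDATE" ||
      msg.type === "PURSE_UPDATE"   || msg.type === "SOLD") {
    console.log("Live update:", msg);
  }
});

// Called when the team taps the "Bid" button.
export function placeBid(playerId: number, amount: number): void {
  socket.send(JSON.stringify({ type: "PLACE_BID", playerId, amount }));
}
```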

Player Display Screen (Stage):
The big public-facing screen showed:

  • Which player was currently being auctioned.
  • The respective team logo that was raising the bid.
  • A celebratory SOLD animation when a player was sold.
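
As a simplified sketch of the stage screen’s SOLD moment, a React component can listen for the event and flash the celebratory overlay; the message shape, server address, and class name here are assumptions.

```tsx
// Simplified stage-screen component: show a celebratory overlay when a
// "SOLD" message arrives. Message shape and server address are assumptions.
import { useEffect, useState } from "react";

export function SoldOverlay() {
  const [sold, setSold] = useState<{ playerName: string; teamName: string } | null>(null);

  useEffect(() => {
    const socket = new WebSocket("ws://auction-server.local:8081");
    socket.addEventListener("message", (event) => {
      const msg = JSON.parse(event.data);
      if (msg.type === "SOLD") {
        setSold({ playerName: msg.playerName, teamName: msg.teamName });
        // Hide the overlay again after the celebration.
        setTimeout(() => setSold(null), 8000);
      }
    });
    return () => socket.close();
  }, []);

  if (!sold) return null;
  return (
    <div className="sold-animation">
      {sold.playerName} SOLD to {sold.teamName}!
    </div>
  );
}
```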

Everything synced seamlessly — admin → WebSocket server → MySQL → live displays.

Powering the Broadcast Footer Ticker

One of the “hidden but crucial” tasks we handled was the footer ticker shown in the live TV broadcast.

  • This ticker displayed the current player’s information and whether or not they had been sold, directly on the broadcast.
  • To make this work, we used port forwarding from our system to the broadcasters’ mixing desk.
  • Whenever data was updated (like a new top bid), it wasn’t just sent to tablets or the stage screen but also pushed directly into the ticker software in real-time.

This gave TV and YouTube audiences the same thrilling, instant updates that franchise owners and in-venue viewers experienced.
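
We won’t reproduce the ticker software’s exact input format here, but as a hedged sketch, assuming it accepted JSON over the forwarded port, the push amounted to something like this (address, port, and payload shape are placeholders):

```typescript
// Rough illustration of pushing updates toward the broadcast ticker through a
// forwarded port. The address, port, and JSON payload format are assumptions;
// the actual ticker software's input protocol is not described here.
import { WebSocket } from "ws";

const TICKER_ENDPOINT = "ws://192.168.1.50:9000"; // placeholder forwarded address
const ticker = new WebSocket(TICKER_ENDPOINT);

export function pushToTicker(update: {
  playerName: string;
  status: "ON_AUCTION" | "SOLD" | "UNSOLD";
  topBid?: number;
  teamName?: string;
}): void {
  if (ticker.readyState === WebSocket.OPEN) {
    ticker.send(JSON.stringify(update));
  }
}
```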

WebSockets – The Real-Time Backbone

WebSockets tied everything together by:

  1. Broadcasting each bid instantly to all interfaces (admin, tablets, player screen, footer ticker).
  2. Handling reconnections seamlessly in case of local network drops.
  3. Allowing bi-directional communication — admins controlling flows, and franchises placing instant bids.

Unlike REST polling, which introduces delays, WebSockets gave us true live synchronization.
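
Point 2 above was largely a client-side concern. A small reconnecting wrapper along the lines below, with a placeholder URL and retry delay, is enough to ride out brief local network drops.

```typescript
// Minimal reconnecting WebSocket wrapper: if the local network blips, reopen
// the connection after a short delay. URL and delay are placeholders.
type MessageHandler = (msg: unknown) => void;

export function connectWithRetry(url: string, onMessage: MessageHandler): void {
  const socket = new WebSocket(url);

  socket.addEventListener("message", (event) => {
    onMessage(JSON.parse(event.data));
  });

  socket.addEventListener("close", () => {
    // Local network drop: try again after two seconds.
    setTimeout(() => connectWithRetry(url, onMessage), 2000);
  });

  socket.addEventListener("error", () => {
    socket.close(); // triggers the "close" handler above
  });
}

// Usage: every interface (tablet, stage screen, admin) connects the same way.
connectWithRetry("ws://auction-server.local:8081", (msg) => {
  console.log("Live update:", msg);
});
```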

Last-Minute Challenges

Of course, there was chaos in the final 24 hours:

  • Players’ profile datasets along with their respective images were delivered late, requiring last-minute uploads.
  • Some purse sizes from franchises were adjusted by organizers just before the event.
  • Backups had to be run multiple times to ensure no critical auction data would be lost.

Thanks to the local MySQL setup and buffered writes, we managed these high-pressure updates without breaking the live system.
