The WeaveVM Gateway Stack: Fast, Reliable Access to WeaveVM Data

February 19, 2025

All storage chains have the same issue: even if the data storage is decentralized, retrieval is handled by a centralized gateway. One solution is simply to make it easy for anyone to run their own gateway – and if you're building an application on WeaveVM, that's a great way to ensure content is rapidly retrievable from the blockchain.

When relic.bot – a photo sharing dApp that uses WeaveVM bundles for storage – started getting traction, the default WeaveVM gateway became a bottleneck for the Relic team. The way data is stored inside bundles (hex-encoded, serialized, compressed) can make it resource-intensive to decode and present media on demand, especially when thousands of users are doing so in parallel.
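To see why, here is a rough sketch of the decode path a gateway runs on every uncached request (TypeScript; the envelope layout and the use of gzip compression are assumptions for illustration, not WeaveVM's exact wire format):

// Hypothetical per-request decode path for a bundle payload.
// The gzip codec and payload layout are assumptions for illustration.
import { gunzipSync } from "node:zlib";

function decodeBundlePayload(hexPayload: string): Buffer {
  // 1. Strip the 0x prefix and decode the hex string into raw bytes.
  const raw = Buffer.from(hexPayload.replace(/^0x/, ""), "hex");
  // 2. Decompress the payload.
  const bytes = gunzipSync(raw);
  // 3. The result still has to be deserialized and split into individual
  //    envelopes before a single image can be served to a user.
  return bytes;
}

Doing all of this on demand, for thousands of concurrent feed requests, is where the default gateway struggled.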

In response, we developed two new open source gateways: a JavaScript-based, cache-enabled gateway and one written in Rust.

The WeaveVM Gateway Stack introduces a powerful new way to access data from WeaveVM bundles, combining high performance with network resilience. At its core, it’s designed to make bundle data instantly accessible while contributing to the overall health and decentralization of the WeaveVM network.

The first two gateways are live at gateway.wvm.rs and resolver.bot. (The comparison image below is itself served from the resolver.bot cache.)

Retrieval time in seconds. The legacy gateway.wvm.dev underperforms gateways built on the new stack, especially with caching enabled.

Why we built the WeaveVM gateway stack

The gateway stack solves several critical needs in the WeaveVM ecosystem:

Rapid data retrieval

Through local caching with SQLite, the gateway cuts load times for frequently accessed bundle data by 4-5x. No more waiting for remote data fetches – popular content is served instantly from the gateway node.

For relic.bot, this slashed feed loading times from 6-8 seconds to near-instant.
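Conceptually, the cache is a simple cache-aside layer in front of the decode path. A minimal TypeScript sketch using better-sqlite3 (the actual table layout in bundles-gateway may differ):

// Minimal cache-aside layer backed by SQLite; the schema here is illustrative.
import Database from "better-sqlite3";

const db = new Database("cache.db");
db.exec(
  "CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, mime TEXT, body BLOB, created_at INTEGER)"
);

// Look up a previously decoded envelope by its cache key (e.g. txHash/index).
function getCached(key: string): { mime: string; body: Buffer } | undefined {
  return db.prepare("SELECT mime, body FROM cache WHERE key = ?").get(key) as
    | { mime: string; body: Buffer }
    | undefined;
}

// Store a decoded envelope so the next request skips fetch + decode entirely.
function putCached(key: string, mime: string, body: Buffer): void {
  db.prepare(
    "INSERT OR REPLACE INTO cache (key, mime, body, created_at) VALUES (?, ?, ?, ?)"
  ).run(key, mime, body, Date.now());
}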

Network health

By making it easy to run your own gateway, the stack promotes a more decentralized network. Each gateway instance contributes to network redundancy, ensuring data remains accessible even if some nodes go offline.

Running a WeaveVM gateway

Running your own WeaveVM gateway is pretty straightforward. The gateway stack is designed for easy deployment, directly to your server or inside a Docker container.

With Docker, you can have a gateway up and running in minutes:

git clone https://github.com/weavevm/bundles-gateway.git  
cd bundles-gateway  
docker compose up -d

For rustaceans, rusty-gateway is deployable on a Rust host like shuttle.dev – get the repo here and Shuttle deployment docs here.

The technical side

Under the hood, the gateway stack features:

  • SQLite-backed persistent cache
  • Content-aware caching with automatic MIME type detection
  • Configurable cache sizes and retention policies
  • Application-specific cache management
  • Automatic cache cleanup based on age and size limits
  • Health monitoring and statistics
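The sizing, retention, and cleanup behavior boils down to a handful of settings. Here's a hypothetical example of what such a policy might look like (TypeScript; these keys are illustrative, not the gateway's actual configuration schema):

// Illustrative cache policy; key names are hypothetical.
const cachePolicy = {
  maxSizeBytes: 10 * 1024 * 1024 * 1024, // evict oldest entries once the cache passes 10 GB
  maxAgeSeconds: 7 * 24 * 60 * 60,       // drop entries not touched for a week
  perAppQuotas: {
    "relic.bot": 5 * 1024 * 1024 * 1024, // per-application cache budget
  },
};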

The gateway exposes a simple API for accessing bundle data:

GET /bundle/:txHash/:index

This endpoint handles data retrieval, caching, and content-type detection behind the scenes.
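For example, fetching a single envelope through the hosted gateway looks like this (TypeScript; the transaction hash is a placeholder to be replaced with a real bundle transaction):

// Fetch one envelope from a bundle via the gateway; works the same against
// a self-hosted instance by swapping the base URL.
async function fetchEnvelope(txHash: string, index: number): Promise<Buffer> {
  const res = await fetch(`https://gateway.wvm.rs/bundle/${txHash}/${index}`);
  if (!res.ok) throw new Error(`gateway returned ${res.status}`);
  // The gateway sets the detected MIME type on the response.
  console.log(res.headers.get("content-type"));
  return Buffer.from(await res.arrayBuffer());
}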

Towards scalability

The WeaveVM gateway stack was built in response to problems of scale – great problems to have as a new network gaining traction. WeaveVM bundle data is now more accessible, resilient and performant. By running a gateway, you’re not just improving your own access to WeaveVM data – you’re contributing to a more robust, decentralized network.

Whether you’re building applications on WeaveVM or simply interested in supporting the network, learn how to deploy a gateway in a few minutes here.

Test the new gateways:

  • gateway.wvm.rs
  • resolver.bot