Content Caching for Node.js with Redis


You can now enable a Redis caching service with Docker and select which Node.js pages and API Actions should be cached. Instead of being processed fully on each request, they are stored in the fast Redis in-memory cache and served much faster to the clients!
This is the perfect server-side caching for all your web visitors and can make cached pages many times faster to serve!

Note: This requires Node.js and Docker, and for now it is available only under Wappler's Experimental Features, so make sure you enable those first.

Enable Caching

First we need to enable Redis caching. Click the project settings button:

Open Targets and select your target. Scroll down the target options and Enable Redis:

There are two options available. Persistent saves a snapshot of the cached data from memory to disk every N seconds; this is what you will need in most cases.
The Append Only option is needed only in cases where you require full durability: it logs every write as it happens.
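These two modes correspond to Redis's own persistence settings. As an illustration only (not necessarily the exact configuration Wappler generates), the equivalent redis.conf directives look like this:

```conf
# Persistent (RDB snapshots): dump memory to disk when at least
# 1000 keys have changed within 60 seconds
save 60 1000

# Append Only (AOF): log every write, synced to disk once per second
appendonly yes
appendfsync everysec
```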

You can learn more about these two options here:

Click Save when you are done:

Server-side caching is now enabled for our Docker target.

Caching Node.js pages

Server-side caching is useful for pages whose HTML does not change often. On every request your page HTML is served from the server cache instead of being rendered again, which makes loading the page much faster!

In order to enable caching for a Node.js page, open the Routing Panel:

Select your page route and you will see the Cache Time option in the properties options for this route:

Enter the time in seconds that this page should be cached. We enter 604800, which equals 7 days:

And you are done. For the next 7 days your page will be served from the server cache to all the clients loading it.
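Wappler wires all of this up for you, but the idea behind the route cache can be sketched in plain Node.js. In the sketch below an in-memory Map stands in for Redis, and `render` is a hypothetical page-render function; a real setup would store the HTML in Redis with an expiry instead.

```javascript
// Minimal sketch of server-side page caching with a TTL.
// An in-memory Map stands in for Redis here so the example is
// self-contained; Wappler manages the real Redis client for you.
const cache = new Map();

function cachePage(ttlSeconds, render) {
  return function (url) {
    const hit = cache.get(url);
    if (hit && Date.now() < hit.expires) {
      return hit.html;            // cache hit: no re-render
    }
    const html = render(url);     // full render only on a miss
    cache.set(url, { html, expires: Date.now() + ttlSeconds * 1000 });
    return html;
  };
}

// Usage: cache the rendered HTML for 7 days (604800 seconds)
let renders = 0;
const getPage = cachePage(604800, () => { renders++; return '<h1>Hello</h1>'; });
getPage('/home');
getPage('/home');
console.log(renders); // 1 — the second request was a cache hit
```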
Don’t forget to Deploy your project to the selected target!

Caching API Actions

You can also cache API Actions whose data does not change that often. This way the server won't query the database on every request just to fetch and return the same data to the users.

First open the Server Connect Panel and select your API Action:

Click Settings, and you will see the Cache Time option in the Properties Panel:

Enter the amount of time in seconds that this API Action should be cached and click the Save button:

And you are done!
For the next 3600 seconds, your users will load the API Action data from the server cache.
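The effect of the Cache Time setting can be sketched as a get-or-compute wrapper. This is an illustration only, not Wappler's actual implementation: a Map stands in for Redis, and `fetchUsers` is a hypothetical database query.

```javascript
// Illustrative sketch of API Action caching: return the cached result
// while it is still fresh, otherwise run the action and store its result.
const store = new Map();

async function cachedAction(key, ttlSeconds, run) {
  const hit = store.get(key);
  if (hit && Date.now() < hit.expires) return hit.data; // cache hit: no DB query
  const data = await run();                             // cache miss: run the action
  store.set(key, { data, expires: Date.now() + ttlSeconds * 1000 });
  return data;
}

// Usage: with a 3600-second TTL, only the first call reaches the "database"
let queries = 0;
const fetchUsers = async () => { queries++; return [{ id: 1, name: 'Ann' }]; };

(async () => {
  await cachedAction('api/users', 3600, fetchUsers);
  await cachedAction('api/users', 3600, fetchUsers);
  console.log(queries); // 1
})();
```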


@Teodor - this looks brilliant, especially the ability to run multiple node servers.

Quick novice question, for this to function, do we need to have our DB within the Docker environment? Or will it work with a remote DB + docker handling the application?

Node.js scaling up applies just to the Docker Node instances and is indeed database independent, so you can have them all connect to the same database as you do now.

@George amazing feature :slight_smile: Will this only work if I use MySQL? I am running a test project using MariaDB and I can't see the Redis option in the project setup, and experimental features are switched on. Perhaps I need to set it up in my Docker container desktop app?
thanks in advance

Redis is actually database independent. You just need to enable it in the Docker target options and then use it for caching.

Just make sure experimental options are on indeed.

I want to cache an API Action step (request to a third-party API), not the whole Server Action workflow. Is this possible?

I would suggest creating a Cache block, where you could then place the API Action (or other steps) inside it. All return variables inside such a block would be put into an object and stored in Redis. On subsequent runs, the Cache block would pull the object from Redis instead of running all the steps inside it. I would also recommend adding a "Cache Time" option and an "Erase cache" button to the Cache block properties.

@George @Teodor


I haven’t tried this, but maybe as a workaround you can add the API call to a library and cache just that?

Edit: nope because it’s the whole route that is being cached, right?

Is Redis caching still limited to Docker deployments? I’ve looked through the forum and can’t find a definitive answer other than this, so I want to confirm if this is still up to date.

Redis is not only for Docker deployments. You can use your own installed Redis or an external one. Just enter its address in the server connection options.

Docker in general is just used for quick and easy installation of services, but if you already have them you can use your own.


Hello @George! Is it possible to connect externally to the Wappler-made Redis container in Docker? I mean, it has no exposed ports, and I know it's conveniently possible to connect to it via REDIS-CLI and REDIS LOGS in the Wappler UI.

What I'm trying to do is use Redis Commander to connect to the Redis Docker container, but it seems to be impossible because the ports are not exposed.

I already use Redis Commander to successfully connect to another Redis container I deployed for other purposes on the same Docker CE server.


Currently the Redis container is indeed for internal use only and doesn't expose any external access, except through the Redis CLI.

In our upcoming Server Manager in Wappler 5, we will be adding much more control to the docker servers and the services they run. We will also be adding more global docker services that you can use in multiple projects.
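For anyone managing their own compose file in the meantime (rather than relying on the one Wappler generates), exposing the Redis port is a one-line change. The service name `redis` and the binding below are assumptions for illustration:

```yaml
services:
  redis:
    image: redis
    ports:
      - "127.0.0.1:6379:6379"  # host:container; bind to localhost so Redis is not public
```

Redis Commander could then connect to 127.0.0.1:6379. Keep in mind that exposing an unauthenticated Redis instance beyond localhost is unsafe.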


That’s great to hear, thanks so much for the W5 update anticipation :slight_smile:

I hope they add it in Wappler 5 too.