Understanding server connect and accumulative DB connections

Hi all,

I recently started running into some errors with too many DB connections.

So I was wondering how this works, i.e. if I have say 6 Server Connects on one page load, will this count as 6 DB connections?

Every database connection step in your server actions counts as a database connection.
The server action loaded on page load does not count as a database connection if it doesn’t contain any database connection steps.

Can you share the exact error message you see, and explain what you have in your server actions that causes it and when you see it?

Hi Teodor, I’ll try and recreate this this morning and share screenshots.

It’s just on my dev environment (both local and also remote), so there are no other users using it.

This perplexes me a little: one part of this app is a public page that is shareable and could have hundreds or thousands of visitors. Each time they view the page it will call the DB for the page’s information, which seems unscalable. I wonder what the normal way to handle this would be?
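One common way to make a heavily shared public page scale is to cache the query result for a short TTL, so many visitors share one DB round trip instead of each triggering their own query. A minimal sketch in plain JavaScript (the `fetchPageData` function and the 30-second TTL are hypothetical, standing in for whatever query the page actually runs):

```javascript
// Minimal time-based cache: wrap a data-fetching function (synchronous
// here, for brevity) so repeated calls within `ttlMs` reuse the
// previous result instead of hitting the database again.
function cached(fetchFn, ttlMs) {
  let value;
  let expires = 0; // timestamp after which the cached value is stale
  return (...args) => {
    const now = Date.now();
    if (now >= expires) {
      value = fetchFn(...args); // only this call touches the DB
      expires = now + ttlMs;
    }
    return value;
  };
}

// Usage: a thousand page views within 30 seconds now cost one query.
let queries = 0;
const fetchPageData = () => { queries += 1; return { title: 'Shared page' }; };
const getPageData = cached(fetchPageData, 30000);
getPageData();
getPageData();
// queries === 1
```

This trades a little freshness for load: if the page data changes rarely, even a short TTL removes almost all of the per-visitor DB traffic.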

@Teodor this is the error I received, 1 user, simply clicking around the pages, loading lists etc.

{"status":"500","code":"53300","message":"too many connections for role \"ddqdnqmojabwqz\"","stack":"error: too many connections for role \"ddqdnqmojabwqz\"\n at Parser.parseErrorMessage (/var/www/vhosts/pitchflare.com/httpdocs/node_modules/pg-protocol/dist/parser.js:278:15)\n at Parser.handlePacket (/var/www/vhosts/pitchflare.com/httpdocs/node_modules/pg-protocol/dist/parser.js:126:29)\n at Parser.parse (/var/www/vhosts/pitchflare.com/httpdocs/node_modules/pg-protocol/dist/parser.js:39:38)\n at TLSSocket.<anonymous> (/var/www/vhosts/pitchflare.com/httpdocs/node_modules/pg-protocol/dist/index.js:8:42)\n at TLSSocket.emit (events.js:200:13)\n at addChunk (_stream_readable.js:294:12)\n at readableAddChunk (_stream_readable.js:275:11)\n at TLSSocket.Readable.push (_stream_readable.js:210:10)\n at TLSWrap.onStreamRead (internal/stream_base_commons.js:166:17)"}

Maybe @patrick can help here more.

This is a connection limit issue. Heroku Postgres limits connections to 20 or 22 by default, depending on the plan, so I used KillAll, which has fixed it.
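For reference, the KillAll approach can also be done by hand against Postgres, using the standard `pg_stat_activity` view and `pg_terminate_backend()` function. A sketch that only builds the SQL strings (run them through any Postgres client; the role name is the one from the error above):

```javascript
// Build the SQL to count connections held by a role, and to terminate
// them. pg_stat_activity and pg_terminate_backend() are standard
// Postgres; pid <> pg_backend_pid() keeps your own session alive.
// Only interpolate trusted role names, never user input.
function countConnectionsSql(role) {
  return `SELECT count(*) FROM pg_stat_activity WHERE usename = '${role}';`;
}

function terminateConnectionsSql(role) {
  return (
    `SELECT pg_terminate_backend(pid) FROM pg_stat_activity ` +
    `WHERE usename = '${role}' AND pid <> pg_backend_pid();`
  );
}

const role = 'ddqdnqmojabwqz'; // role name from the error message
const countSql = countConnectionsSql(role);
const killSql = terminateConnectionsSql(role);
```

This frees the slots immediately, but it only treats the symptom; if the app keeps opening pools, the limit will be hit again.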

If you or @Teodor could help, however, and advise on this question:

“This perplexes me a little: one part of this app is a public page that is shareable and could have hundreds or thousands of visitors. Each time they view the page it will call the DB for the page’s information, which seems unscalable. I wonder what the normal way to handle this would be?”

@Teodor so, if I have 5 server connects on one page, each with a DB Connection step - and one user loading that page, will that be ‘5’ new DB connections in terms of allocation, or just 1 new DB connection - but the user is connecting 5 times?

I see you use Node.js, which uses the knex library for the database connections. Knex uses a connection pool (http://knexjs.org/#Installation-pooling), which defaults to min: 2 and max: 10. So I think that when you load 5 connection files at the same time, each file opens at least 2 connections, which is 10 connections in total.
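If that is what's happening, the arithmetic runs up against the Heroku limit quickly. A quick sketch, assuming each connection file really does create its own knex pool:

```javascript
// Each knex() instance keeps its own pool, so separate connection
// files multiply: files * pool.min idle connections at minimum, and
// up to files * pool.max under load.
function poolConnections(files, pool) {
  return { min: files * pool.min, max: files * pool.max };
}

const knexDefaults = { min: 2, max: 10 }; // knex's documented defaults
const usage = poolConnections(5, knexDefaults);
// usage.min === 10 (already half of a ~20-connection Heroku limit)
// usage.max === 50 (well past the limit under load)
```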

To optimize, we could make the connection pool use only 1 connection by setting both min and max to 1. You can do this in the connection JSON file. The other thing we could do is reuse the same connection pool each time, but I’m not sure if that will cause any other problems.
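The reuse idea would look roughly like this: memoize the pool factory so every connection file gets the same instance. A sketch in plain JavaScript (the `createPool` argument stands in for whatever actually builds the knex instance; how Wappler's dbconnector would wire this up is an assumption here):

```javascript
// Memoize a pool factory: the first caller creates the pool, every
// later caller gets the same one, so the process holds a single pool
// no matter how many files ask for a connection.
function makeSharedPool(createPool) {
  let pool = null;
  return function getPool() {
    if (pool === null) {
      pool = createPool(); // runs exactly once per process
    }
    return pool;
  };
}

// Usage: count how many times the factory actually runs.
let created = 0;
const getPool = makeSharedPool(() => { created += 1; return { id: created }; });
const a = getPool();
const b = getPool();
// created === 1, and a and b are the same pool object
```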

So try limiting the pool by editing the connection file, located in app/modules/connections. The file will look like this:

{
  "name": "db",
  "module": "dbconnector",
  "action": "connect",
  "options": {
    "client": "mysql",
    "connection": {
      "host": "db",
      "port": 3306,
      "user": "db_user",
      "password": "CvP10qzj",
      "database": "node_project_mysql"
    }
  }
}

After adding the pool properties it would look like:

{
  "name": "db",
  "module": "dbconnector",
  "action": "connect",
  "options": {
    "client": "mysql",
    "connection": {
      "host": "db",
      "port": 3306,
      "user": "db_user",
      "password": "CvP10qzj",
      "database": "node_project_mysql"
    },
    "pool": {
      "min": 1,
      "max": 1
    }
  }
}

Thanks Patrick. I tried this, but the schema refresh just hangs and then fails, so it didn’t work for me.

I made a note in another post in coffee lounge, but you may be the right person.

I’ve connected to Digital Ocean fine with the DB, both a dev and a production DB and updated the DB schema by applying dev changes.

I have now set up a connection pool in DigitalOcean and connected to it. When I then try to apply changes, nothing happens: it says they’re applied, but I’m left with a blank schema.

By the schema, do you mean the Database Manager in Wappler? Do you use a direct connection there or the Server Connect connection?

I use the Server Connect connection.

And by schema I mean: I set up the entire DB structure, all tables and all fields, in the dev environment, connecting to a ‘dev’ version of the DB on DigitalOcean. Then, in the production environment, I connected to a production version of the DB on DigitalOcean and simply applied the changes, and all the DB structure and fields were correctly updated in the production version. Perfect.

Then I created a connection pool to try to solve the DB connection issues I’m experiencing with Wappler. When I do that and try the same schema-update process, the update says it’s successful, but the structure doesn’t change; it just has one folder, ‘tables’.

Note that DigitalOcean uses PgBouncer for its connection pooling.