Long server action freezes up the entire website

I have one server action that takes about 2 minutes, which is expected as far as I know; it goes through a lot of steps.

But while this server action is running, no other traffic is possible on my site: all pages stay “pending” until it finishes. Maybe this is supposed to happen, but is there a way to enable simultaneous requests? It’s probably some server-related setting, but I’m not sure what it’s called.

That’s quite a lot of time :slight_smile:
What’s your server action doing for a whole 2 minutes?

For long back-end processes, you’ll want a way to hand off the processing and notify the user when it completes.

For example, you can drop an entry in a database that has the necessary information for running, and have a cron job run every minute looking for entries that need to be processed. The cron job can simply call a Server Connect file to do the processing. You then mark the entry as complete so it doesn’t run twice.
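The queue-plus-cron pattern Ken describes can be sketched roughly as follows. This is a minimal, hypothetical illustration: the `jobs` table, column names, and statuses are made up, and SQLite is used here only to keep the example self-contained.

```php
<?php
// Hypothetical sketch of the queue pattern: the web request only inserts a
// job row; a cron job later claims pending rows, processes them, and marks
// them done so they never run twice.
$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE jobs (
    id INTEGER PRIMARY KEY,
    payload TEXT,
    status TEXT NOT NULL DEFAULT 'pending'
)");

// 1. The server action handling the user's request only enqueues the work
//    (here: a JSON payload with whatever the processing step will need).
$db->prepare('INSERT INTO jobs (payload) VALUES (?)')
   ->execute([json_encode(['customer_id' => 42])]);

// 2. The cron job, run every minute, claims and processes pending entries.
$pending = $db->query("SELECT id FROM jobs WHERE status = 'pending'")
              ->fetchAll(PDO::FETCH_COLUMN);
foreach ($pending as $id) {
    // Mark as running first so the next cron tick doesn't pick it up again.
    $db->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
       ->execute([$id]);

    // ... call the Server Connect API file here (e.g. via curl) ...

    $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
       ->execute([$id]);
}
```

The key design point is that the user’s request returns immediately after the insert; the slow work happens detached from it.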

–Ken


In short:

  1. API call to retrieve amount of pages
  2. Repeat those pages
  3. API call to retrieve records of each page
  4. API call to “enrich” each individual record with data from another API

This is how I’ve built most of my server actions, which I use to “synchronize” data from, let’s say, my customer service software to our own database. I’m using Stitchdata.com as an intermediary to get the data into our data warehouse (PostgreSQL).

I built it like this with help from you, Teodor, some months ago already. Apparently these actions just take time.

As a side note, I’ve had someone else build one of my server actions in pure PHP, and that version was quite a bit faster, by a factor of 2 or 3 I think. I might be able to build the server action more efficiently, but I’m fairly sure I already did it in the most efficient way.

  1. Repeat 10 pages (each page has 50 records)

  2. Repeat the records of each page (10 * 50 records)

  3. Another API call (based on the order ID) to enrich the data that wasn’t available in the repeat records step (500 API calls)
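The loop above can be sketched like this. The `api_*` functions are stand-ins I made up for the real HTTP calls; the point is the request count: with 10 pages of 50 records each, the per-record enrichment step alone is 500 sequential round-trips, which is what dominates the two minutes.

```php
<?php
// Stubbed-out sketch of the sync loop: page through the order list, then
// make one extra API call per order to enrich it. The api_* functions are
// hypothetical placeholders for the real HTTP requests.
function api_page_count(): int { return 10; }           // 1 call
function api_page(int $page): array {                   // 10 calls total
    return array_map(fn ($i) => ['order_id' => $page * 50 + $i], range(0, 49));
}
function api_order_details(int $orderId): array {       // 500 calls total
    return ['order_id' => $orderId, 'shipping_address' => '...'];
}

$orders = [];
for ($page = 0; $page < api_page_count(); $page++) {
    foreach (api_page($page) as $record) {
        // One extra API round-trip per record: this is the expensive part.
        $orders[] = api_order_details($record['order_id']);
    }
}
```

If the upstream API ever offers a bulk or “expanded” endpoint, replacing the inner call with one batched request per page would cut the request count from ~511 to ~11.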

Could I have done anything more efficient @Teodor ? If not, can you confirm that it should take this long? Or is there anything that can be done on Wapplers side to speed things up?

Thanks for your help again Ken. What exactly do you mean by “necessary information”? Do you mean like to store the command…

curl http://example.com/dmxConnect/API/script.php

…in the database, and let another Cronjob check it each minute?

I just mean anything you might no longer have since the request is detached from the execution.

For example, if this long process relies on POST inputs, then you would want to store them and pass into the execution.

As for the curl command, that depends on how many of these you have. If it is just one process, you could simply hard-code the cron path. If there are many, then maybe you want only one cron job that fires off different API calls. It all depends on your needs.
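For the single-process case, the hard-coded variant can be as simple as a crontab entry like the one below (the filename and schedule are just examples, not an actual Wappler path):

```shell
# Hypothetical crontab entry: every minute, call the Server Connect file
# that looks for pending jobs and processes them.
* * * * * curl -s http://example.com/dmxConnect/API/process_jobs.php > /dev/null
```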

Hi Nevil,
Do you actually need the output option enabled for all the steps where it’s enabled?
I mean, do you use/show this data on some frontend page, or does this server action just need to run in the background without displaying/outputting any data on the page?

You should minimize the API calls; 500 is really a lot. I don’t know the API being used, but isn’t it possible to get all the pages with the records already in them, like a database join? That would speed up the action enormously.

Hi Nevil,

What you are experiencing is actually session locking in PHP. It just blocks other requests from your browser until the PHP action is done; other browsers and users of your website run just fine.

See for more info:

@patrick will see if we can improve Server Connect session management, so that the locking time is minimized and you will be able to run multiple server actions concurrently.
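For context, the usual PHP-level workaround for session locking is to release the lock yourself before the long-running work starts. A minimal sketch (not the Server Connect implementation, just the standard `session_write_close()` pattern):

```php
<?php
// Standard PHP workaround for session locking: read what you need from the
// session, then release the file lock with session_write_close() so other
// requests from the same browser aren't blocked while the long work runs.
session_start();
$userId = $_SESSION['user_id'] ?? null;  // read session data up front
session_write_close();                   // lock is released from here on

// ... long-running API calls / sync work here (no $_SESSION writes) ...

// To write to the session later, reopen it briefly and close again:
session_start();
$_SESSION['last_sync'] = time();
session_write_close();
```

This keeps the session locked only for the short read/write windows instead of for the whole two-minute request.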

Here is an update you can test. Replace the Session.php in dmxConnectLib/lib/core with the one from the zip file.

Session.zip (470 Bytes)

I do not show it on the frontend. If I disable the output of the variables, the final API destination doesn’t get any data. I use a {{repeat}} to fetch an entire JSON array and use it within the final API step.

They have an order list, but it contains only the bare minimum information (repeater: qls_data). That’s why I need to query each order ID separately to get things like the shipping address, the products ordered, etc.

Will try and let you know!

@brian, @ben, @mebeingken, @Dave, @sid

Could you check the new Session file from Patrick above, to see if it works well with your servers and session handling remains intact?

This will actually greatly increase the performance of Server Connect, as no session locking will occur now! Your page can have multiple concurrent Server Connect actions running at the same time!

You will see this especially when you want to load multiple Server Connect datasets initially from the same server.

Works beautifully!! I can open a page on the website while the server action is running. What have you changed?

Well, we limited the session locking to only when it’s needed: initially when the sessions are read, and also on each session set/remove action.

Thanks @George and @patrick

Will upload now and see how everything goes and report back.

I have tested the update without a problem. To be honest, with the simple sites that I build, I have never had a problem as described by @nevil, hence I cannot comment on performance.

Great to see that @nevil is happy, which makes me happy too.

Great work @patrick, :congratulations: on a job well done.


All good here, fresh as a daisy! :slight_smile:


Hi. The big projects I work on are on ASP.NET.
Tried on one of the relatively larger PHP projects I have (e-commerce project in development)… no issues so far. :+1:
I haven’t had the requirement of such long-running repeats yet either, but now I can probably make use of that in the future. :sweat_smile: