Very Long Running PHP Scripts - How to Make Them Work With Wappler

Hi.
I found this post relevant to the topic I wish to get some insights on: Long server action freezes up the entire website

Similar to what is described in that post, I too have created an extremely long-running SA which synchronises my DB copy of Shopify product data. It may take anywhere from 3 to 10 minutes to complete.
The concept is pretty much the same; a rough sketch follows the list:

  1. Fetch a set of product IDs from DB.
  2. For each ID, hit Shopify API to fetch product variant information.
  3. Upsert the response in another DB table.
  4. Keep repeating for each ID until done.
  5. Run this script periodically using CRON jobs.
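For reference, here is roughly what that loop boils down to in plain PHP (a minimal sketch - connection details, table/column names, the store URL, and the API version are all placeholders; the real logic lives in a Wappler SA):

```php
<?php
// Sketch only: fetch product IDs, pull variants from Shopify, upsert locally.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// 1. Fetch the set of product IDs
$ids = $pdo->query('SELECT shopify_id FROM products')->fetchAll(PDO::FETCH_COLUMN);

// 3. MySQL-style upsert, prepared once and reused
$stmt = $pdo->prepare(
    'INSERT INTO variants (variant_id, product_id, title)
     VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE title = VALUES(title)'
);

foreach ($ids as $id) {
    // 2. Hit the Shopify Admin API for this product's variants
    $ctx  = stream_context_create(['http' => [
        'header' => "X-Shopify-Access-Token: <token>\r\n",
    ]]);
    $json = file_get_contents(
        "https://<store>.myshopify.com/admin/api/2023-10/products/{$id}/variants.json",
        false,
        $ctx
    );

    foreach (json_decode($json, true)['variants'] ?? [] as $v) {
        $stmt->execute([$v['id'], $id, $v['title']]);
    }
}
```

With thousands of IDs and one HTTP round trip each, it is easy to see how this crosses the 3-10 minute mark.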

While running this, I discovered that after 120 seconds the server returned a “connection timeout” error.
To test, I ran this SA as an API call directly in the browser/Postman, and also via cURL in the terminal.
After a long discussion with my hosting provider, I discovered a few things:

  1. There is something called the webserver timeout, separate from PHP’s max execution time setting.
  2. As a recommended practice, an API call should not last more than 120 seconds (2 minutes). There is nothing inherently wrong with a longer call, but it blocks resources that other API calls and requests on the webserver could use.
  3. I played around with script timeout settings in the Wappler SA, as well as PHP INI settings, but in the end it was the webserver timeout that made the difference (illustrated in the snippet below).
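To make the distinction concrete: PHP’s limits can be lifted from inside the script, but the webserver’s own timeout (e.g. Apache’s `Timeout` or nginx’s `proxy_read_timeout`/`fastcgi_read_timeout`) lives outside PHP and still cuts the connection:

```php
<?php
// Raising PHP's limits - none of this touches the webserver timeout.
set_time_limit(0);                   // remove PHP's execution time limit
ini_set('max_execution_time', '0');  // the same limit, set via ini
ini_set('memory_limit', '512M');     // long loops often need extra headroom too
```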

So here’s what I would like from the team and community to help me with:

  1. Has anyone had any experience with running long scripts in PHP via cron job or API call with Wappler or otherwise? What do you recommend?
  2. Is there any way to run a SA PHP file directly from the terminal, without having to use cURL? SAs provide an extremely easy and secure way of building out complex stuff, so I wish to use them to build my set of tasks rather than core PHP.
  3. Is there any other way to handle synchronising data from a 3rd-party API (like Shopify) into my own DB, other than a long-running PHP script? I have thought about breaking it into batches, but could not figure out a Wappler-based way of doing it.

Using PHP & Wappler are constants - I would not go around this unless the other method is extremely rewarding.

Hey Sid,

The first thing that comes to mind is using webhooks from the external source, such as Shopify, to populate the data in real time. Of course, you may already be doing this but are still looking for a way to clean up any issues that might result from missed hooks, etc.
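For example, a Shopify `products/update` webhook receiver boils down to verifying the HMAC header and upserting the payload (a minimal sketch - the secret handling and the upsert step are assumptions):

```php
<?php
// Shopify signs the raw request body with HMAC-SHA256, base64 encoded.
$secret = getenv('SHOPIFY_WEBHOOK_SECRET') ?: '';
$body   = file_get_contents('php://input');
$hmac   = $_SERVER['HTTP_X_SHOPIFY_HMAC_SHA256'] ?? '';

$calculated = base64_encode(hash_hmac('sha256', $body, $secret, true));
if (!hash_equals($calculated, $hmac)) {
    http_response_code(401); // signature mismatch: reject
    exit;
}

$product = json_decode($body, true);
// ...upsert $product into the local DB here, same as the batch job would...
http_response_code(200);
```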

As for running a SA PHP file directly from the terminal - I don’t think so.

As for breaking the task into batches - I think this is the best approach, and it can be done with Wappler IF it is just a daily clean-up combined with webhooks. I say IF because this method won’t run quickly, but it does meet your requirement of being Wappler-only.

Cron job: build_batch – run once per day at the beginning of the “processing window” (e.g. from 1am to 2am)

  1. Fetch a set of product IDs from DB.
  2. For each ID, make an entry in a queue table (perhaps as simple as a list of IDs) - sketched below
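A sketch of what `build_batch` could boil down to (table names are hypothetical):

```php
<?php
// build_batch: once per day, queue every product ID for processing.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$pdo->exec('TRUNCATE TABLE sync_queue');  // start the window with a clean queue
$pdo->exec('INSERT INTO sync_queue (shopify_id)
            SELECT shopify_id FROM products');
```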

Cron job: run_batch – every minute during the “processing window”

  1. Query the queue table for a single entry (see the sketch after this list)
  2. Hit Shopify API to fetch product variant information.
  3. Upsert the response in another DB table.
  4. Remove ID from queue table
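And a matching sketch of `run_batch` (again hypothetical names; `sync_one_product` stands in for the fetch-and-upsert step from the first sketch):

```php
<?php
// run_batch: every minute, process a single queued ID.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// Hypothetical helper: call the Shopify API for $id and upsert the variants.
function sync_one_product(PDO $pdo, string $id): void
{
    // ...same fetch-and-upsert logic as in the long-running version...
}

// 1. Take a single entry off the queue
$id = $pdo->query('SELECT shopify_id FROM sync_queue LIMIT 1')->fetchColumn();
if ($id === false) {
    exit; // queue is empty - nothing to do this minute
}

// 2 + 3. Fetch from Shopify and upsert locally
sync_one_product($pdo, $id);

// 4. Remove the processed ID so the next run picks up the next one
$pdo->prepare('DELETE FROM sync_queue WHERE shopify_id = ?')->execute([$id]);
```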

Basically, the first cron builds a table of entries to be processed, and the second cron processes one entry each minute. This obviously slows down your process, which is why it really could only be used in conjunction with webhooks, which would serve as the primary source of updates.

If you remove the requirement of Wappler only, you really need something like RabbitMQ. Sounds like a great module to be created!
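Outside Wappler, the queue table would become a real message queue. A producer using the `php-amqplib` library might look like this (a sketch - queue name, credentials, and the ID source are placeholders):

```php
<?php
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$productIds = [1001, 1002, 1003]; // would come from the DB in practice

// Publish one product ID per message; a separate consumer drains the queue.
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();
$channel->queue_declare('shopify_sync', false, true, false, false); // durable queue

foreach ($productIds as $id) {
    $msg = new AMQPMessage((string) $id, [
        'delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT, // survive broker restart
    ]);
    $channel->basic_publish($msg, '', 'shopify_sync'); // default exchange -> queue
}

$channel->close();
$connection->close();
```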

Hello @sid
I’ve done a little work with Shopify before and had the same problem. You need to know the relational structure of the tables in the Shopify database, and that is a big cause of the slowdown.

Thanks for taking the time to answer.

I am looking for a universal kind of solution, as I have many other 3rd-party services to integrate.
I am not yet using webhooks, but will look into them too.

That’s not a bad idea, to be honest. I am not looking for extremely real-time data. I can just let the cron run every 5 seconds or so, as that is the maximum amount of time it would take to process a single ID.

I still haven’t been able to understand how RabbitMQ works or how to integrate it into a project/server. I believe I haven’t been successful partly because I mostly live on shared or managed hosting - I don’t understand much about the raw hosting side of the web.
Would be a great project indeed.

I think cron maxes out at once per minute… but still, if real-time doesn’t matter and it keeps things simple, then great!

Hi Serhat,
Shopify is indeed a vast pool of data. Luckily, I just need a few of the data points to be synchronized.

Running a PHP file is not an issue as such. The problem here is running the SA PHP file - SAs are designed to run as APIs, not as direct PHP scripts.
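One possible middle ground (an assumption, not a built-in Wappler feature): have cron run a tiny CLI wrapper with `php`, and let the wrapper call the SA endpoint over HTTP using PHP’s curl extension. That avoids the cURL binary, though the request still passes through the webserver, so its timeout still applies:

```php
<?php
// cron_runner.php - invoked from cron as `php cron_runner.php`.
// The URL and the auth header are placeholders for your own setup.
$ch = curl_init('https://example.com/dmxConnect/api/shopify/sync_products.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);                              // no client-side timeout
curl_setopt($ch, CURLOPT_HTTPHEADER, ['X-Cron-Secret: <secret>']); // hypothetical auth
echo curl_exec($ch);
curl_close($ch);
```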

Ah! Yes. Did not know about that, or maybe it just slipped my mind.
I can probably configure a bigger batch to be processed rather than 1 ID at a time.
Or try to configure an infinite script with sleep.
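The “infinite script with sleep” idea could look like this: one long-lived CLI worker that drains the queue, run under a process supervisor (e.g. supervisord) so it restarts if it exits. A sketch, reusing the hypothetical helper from the `run_batch` example:

```php
<?php
// worker.php - start once with `php worker.php`; table names are hypothetical.
set_time_limit(0); // the CLI usually has no limit, but be explicit

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// Hypothetical helper, same as in the run_batch sketch.
function sync_one_product(PDO $pdo, string $id): void { /* fetch + upsert */ }

while (true) {
    $id = $pdo->query('SELECT shopify_id FROM sync_queue LIMIT 1')->fetchColumn();
    if ($id === false) {
        sleep(60);  // queue empty: check again in a minute
        continue;
    }
    sync_one_product($pdo, $id);
    $pdo->prepare('DELETE FROM sync_queue WHERE shopify_id = ?')->execute([$id]);
    sleep(5);       // ~5 s per ID, as mentioned above
}
```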
