Issue with FTP site publish and large number of files

Wappler 7 Beta 11 and 6.8.0
Windows 11
PHP Local server
Classing this as a bug, as I guess it is, but I accept these circumstances are pretty unusual.

I am revamping aspects of a site developed by someone else.
I have replicated the site exactly, including data, locally where I test it, and wish to deploy it to a staging server for online testing.

The site holds a huge amount of user data: over 376,000 files, amounting to 38 GB.

The folder is defined within the project settings as the user uploads folder.

I have tried to publish this to a staging server

I get the normal message initially

Then after about 35 seconds, Wappler crashes.


I assume this is related to some sort of pointer/heap overflow situation caused by the number of files?

Is there a specific limit on number of files for publishing?
Can this be increased?

How many files are you trying to upload? All new files?

First deploy, so all files. I assume the uploads folder will be ignored during upload, but is publish still counting its files prior to upload?

If ignored, the uploads folder shouldn't be counted.

Maybe just retry? Does it happen each time?

Tried about 10 times, same result each time.

I will remove the contents from the folder temporarily and retry.

20 mins while I move the data elsewhere.

Have you considered using rsync @Hyperbytes? A lot faster for such a large project.
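For reference, a typical rsync invocation for this kind of deploy might look like the following. The host and paths are placeholders, not anything from this project; `--exclude` keeps the uploads folder out of the transfer entirely, so its 376K files are never even listed:

```shell
# Sync the local project to the staging server over SSH.
# -a preserves permissions/timestamps, -z compresses in transit,
# --delete removes remote files no longer present locally.
# The leading "/" anchors the exclude to the transfer root.
rsync -avz --delete \
  --exclude '/admin/repository/' \
  ./ user@staging.example.com:/var/www/site/
```

This is only a sketch of the approach, not a drop-in replacement for Wappler's publish step.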

Maybe try to produce a debug log.

I still have to determine if the problem is parsing the large ftp output or displaying it on screen for confirmation.

OK

So I removed the repository folder, which holds all the user data, and publish works, finding all files almost instantly.

So if I may, here is my thought process.

The repository folder is defined as the user uploads folder in the project definition.

By definition, this folder should NOT be uploaded when publishing, effectively being an excluded folder.

However, when hitting publish with a large amount of data in that folder, the publish process seems to scan the entire uploads folder regardless (hence the 35-second wait, then crash).

So maybe, I thought, the files in the uploads folder are being scanned and their filenames listed.

So I added a test file to the repository folder, hit publish again, and I can see that file in the files list.

This appears to confirm that when publishing, Wappler does scan all the files in the site INCLUDING the user uploads folder even though those files should not be uploaded.

I assume the crash results from Wappler trying to list the 376K+ files in the user uploads folder even though the folder is not going to be uploaded. Not really surprising :grinning:

Does that make sense?
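If the exclusion were applied during the directory walk rather than after it, the excluded folder would never be descended into, so scan time would no longer depend on its contents. A minimal Python sketch of that pruning behaviour (all names are hypothetical illustrations, not Wappler's actual code):

```python
import os

def scan_project(root, excluded=("admin/repository",)):
    """Walk a project tree, pruning excluded folders before descending."""
    excluded_paths = {os.path.normpath(os.path.join(root, e)) for e in excluded}
    files = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune in place: os.walk never enters a removed directory,
        # so a huge uploads folder costs nothing to skip.
        dirnames[:] = [
            d for d in dirnames
            if os.path.normpath(os.path.join(dirpath, d)) not in excluded_paths
        ]
        files.extend(os.path.join(dirpath, f) for f in filenames)
    return files
```

The key detail is mutating `dirnames` in place, which tells `os.walk` not to recurse into those subdirectories at all.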

Restoring the data, then I will see if I can get a debug file.

Hmm, will check if this folder is really excluded for FTP.

I know for sure that it is excluded for Docker, but FTP is a different story.

Thanks @Cheese, I will take a look but would prefer to use a Wappler-only solution if possible. I don't need the user data to be actually uploaded on publish, I just need it not to crash Wappler.

That could cause some real problems, as development data could overwrite production data. That would effectively make publish in an FTP environment unusable.

You can always add the uploads folder to the global FTP exclusion folders in Wappler's FTP options.

Bug report attached. It has captured some useful stuff.

wappler.zip (1.8 KB)

Meanwhile, I will try the workaround of a global exclusion.

I need to exclude the /admin/repository folder.
Is this correct? Because Wappler still crashes on publish.

(also tried removing leading "/")
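One possible reason a leading "/" might matter: if exclusion patterns are compared against project-relative paths, an anchored "/admin/repository" would never match anything. A hedged illustration of that kind of matching (my guess at the mechanism, not Wappler's implementation):

```python
def is_excluded(relative_path, pattern):
    """Check a project-relative path against an exclusion pattern,
    normalising any leading slash so both spellings behave the same."""
    norm = pattern.lstrip("/")
    rel = relative_path.lstrip("/")
    # Excluded if the path is the folder itself or anything beneath it.
    return rel == norm or rel.startswith(norm + "/")
```

Without the `lstrip("/")` normalisation, `"/admin/repository"` and `"admin/repository"` would behave differently, which matches the symptom of having to try both spellings.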

Fixed in Wappler 7 beta 12

Wow, great work guys

Do we have an answer to this? Because it is quite a big issue.

The folder was excluded from FTP publishing, but there was a problem when a webroot folder was also specified. The escaping of the folder name wasn't done well either.

So should be all fixed now.
