Wondering if anyone has a smart way to find all images across multiple folders and subfolders, nested recursively to any depth, in a server action.
My thinking is a Folder List at the root and then some kind of while loop that only recurses when type = dir.
I have tried a few different ways but can't quite figure this one out. Has anyone done something like this, or have a smart idea?
I would like to end up with the paths to all images in a single list if possible.
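To show the logic I am trying to reproduce, here is a rough Node sketch (the uploads path and extension list are just placeholders, not my real setup) that walks every nested folder and returns one flat array of image paths:

```ts
import { promises as fs } from "fs";
import * as path from "path";

// Example extension list only - adjust to whatever counts as an "image".
const IMAGE_EXTS = new Set([".jpg", ".jpeg", ".png", ".gif", ".webp"]);

// Recursively walk a directory and collect the full path of every image.
async function collectImages(dir: string, found: string[] = []): Promise<string[]> {
  const entries = await fs.readdir(dir, { withFileTypes: true });
  for (const entry of entries) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      // Recurse into subfolders, however deep the nesting goes.
      await collectImages(full, found);
    } else if (IMAGE_EXTS.has(path.extname(entry.name).toLowerCase())) {
      found.push(full);
    }
  }
  return found;
}

// e.g. collectImages("./public/uploads").then(list => console.log(list));
```

The recursion handles the nesting for me, so it would not matter how deep a user buries their uploads; the question is how to get that same shape out of a server action.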
I think you're probably right. I was hoping to do it in vanilla Wappler, but I just can't seem to find the right combination to do it.
I almost had it perfect using the arrays, but removing things from arrays always causes me issues, so I got stuck trying to do that through Wappler too.
Another way you could do it is to dump the directory structure (paths) and files into a CSV via the terminal. If I remember correctly you're on a Mac though, @psweb? It is quite well documented for Windows and Linux, but I'm sure you could find a simple script out there for Mac usage... Then import it into your database as a new table. This would then give you a table with all your files and their corresponding directories. Or am I missing the point? Which is quite often the case as I go off on one...
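If the terminal one-liner proves elusive on Mac, a throwaway Node script along these lines would produce the same CSV to import (the column headings and folder are just a guess):

```ts
import { promises as fs } from "fs";
import * as path from "path";

// Write one "directory,filename" row per file, ready to import as a table.
async function dumpCsv(root: string, outFile: string): Promise<void> {
  const rows = ["directory,filename"];
  const walk = async (dir: string): Promise<void> => {
    for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
      const full = path.join(dir, entry.name);
      if (entry.isDirectory()) await walk(full);
      else rows.push(`${dir},${entry.name}`);
    }
  };
  await walk(root);
  await fs.writeFile(outFile, rows.join("\n"));
}

// e.g. dumpCsv("./public/uploads", "files.csv");
```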
As you can see, the primary directory can contain a combination of files and/or folders, and each subdirectory could contain files only, or files and folders. The issue is that some folders may nest 2 directories deep, while others may nest 5 directories deep. I am just not sure where the user may choose to upload files into.
In the end I would like one big list of all the file paths returned, regardless of which folder each image resides in.
Lol, that would certainly work, and you are correct, I am on Mac, but sadly that would only be a one-off solution. This is something I will need to run a couple of times a day, so I need the script to look through the live uploads directory to fetch the list of files, so it can then go and add watermarks either to images that have never been watermarked, or only to newly uploaded files, which each user uploads daily to unknown subdirectories.
We use Amazon S3 and its immense API to do all types of wonderful things. May be worth investigating. The basics are already provided for in Wappler but there are many many features that are not.
I tried S3 a few years back, but it looked like only Einstein could understand such a complicated thing. Maybe I will take another look into it; I feel like I have gotten smarter over the last few months, so maybe I am ready for it.
A couple of YouTube videos will springboard you into action, Paul. Permissions can be a little annoying to configure initially, but Wappler has you covered for the most part. Also, using the AWS CLI you could probably bulk import your current structure and files directly over to S3. We did this when we went from our dedicated servers over to AWS; it took five minutes to transfer GBs of files! Plenty of guides are available on that side though, should you wish to undertake this method.
Actually, playing about with it, ChatGPT will write you a script to recursively scan your folders and subfolders and write the results to a DB using Knex, but it may be easier, I think, to return the results as an array and parse that in Wappler.
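For what it's worth, what it spits out looks roughly like this (connection details and the table name are just what I asked for, so adjust to suit); drop the insert and you are left with the plain array for Wappler to parse instead:

```ts
import { promises as fs } from "fs";
import * as path from "path";
import knex from "knex";

// Example connection only - swap for your own client/credentials.
const db = knex({
  client: "pg",
  connection: { host: "localhost", user: "app", password: "secret", database: "uploads" },
});

// Recursively gather every file path under a directory.
async function scan(dir: string, found: string[] = []): Promise<string[]> {
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) await scan(full, found);
    else found.push(full);
  }
  return found;
}

async function run(): Promise<void> {
  const paths = await scan("./public/uploads");
  // Either write the paths to a table ("file_paths" is just an example name)...
  if (paths.length) await db("file_paths").insert(paths.map((p) => ({ path: p })));
  // ...or simply return/log the array and handle it in Wappler.
  console.log(paths);
  await db.destroy();
}

run().catch(console.error);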
It contains the subfolders of the directory you are in, but if you then want to recursively check inside each folder and grab all the filenames, as well as follow the nested subfolders inside those subfolders, it becomes a bit tricky. I mean, I could do something like 4 levels of repeats, but I just figured there should be a way to do it with fewer repeats, or a while loop, and still be able to go even 10 nested subfolders deep if needed.
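This is the while-loop shape I have in mind, sketched in Node (the folder path is just an example): one loop, a queue of folders still to visit, and no fixed nesting limit.

```ts
import { promises as fs } from "fs";
import * as path from "path";

// One while loop instead of nested repeats: keep a queue of folders
// still to visit, and push any subfolder we meet back onto it.
async function listAllFiles(root: string): Promise<string[]> {
  const pending: string[] = [root];
  const files: string[] = [];
  while (pending.length > 0) {
    const dir = pending.pop()!;
    for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
      const full = path.join(dir, entry.name);
      if (entry.isDirectory()) {
        pending.push(full); // goes as deep as the nesting happens to be
      } else {
        files.push(full);
      }
    }
  }
  return files;
}

// e.g. listAllFiles("./public/uploads").then(console.log);
```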
Thank you Brian, what a massive help this is. I have now added it to 5 different projects in various ways, replacing multiple repeat structures that are no longer required with this in place.
I am just about finished with the file filtering in the directory-only search, but I think I have found a bug: if you remove the file type list, the removed values are still remembered and used.