How would you look for JPGs in multiple folder/subfolder structures recursively

Wondering if anyone has a smart way to find all images across multiple folders and subfolders, nested recursively, in a server action.

My thinking is a Folder List step at the root and then some kind of while loop that only descends where type = dir.

I've tried a few different ways but can't quite figure this one out. Has anyone done something like this, or have a smart idea?

I would like to end up with the paths to all images in a single list if possible.

Sounds like a great use of RunJS...the internet will likely give you a nice script that can do this.

I think you're probably right. I was hoping to do it in vanilla Wappler, but just cannot seem to find the right combination to do it.
I almost had it working using arrays, but removing things from arrays always causes me issues, so I got stuck trying to do that through Wappler too.

Maybe provide an example structure and what you want to achieve. I think it's perfectly doable using a folder list and repeats.

Another way you could do it is to dump the directory structure (paths) and files into a CSV via the terminal. If I remember correctly you're on a Mac though, @psweb? It is quite well documented for Windows and Linux, but I'm sure you could find a simple script out there for Mac usage... Then import it into your database as a new table. This would give you a table with all your files and corresponding directories. Or am I missing the point? Which is quite often the case as I go off on one... :smiley:

Thanks Teodor, at the moment I have a folder structure like this:

As you can see, the primary directory can contain a combination of files and/or folders, and each subdirectory could contain files only, or files and folders. The issue is that some folders may nest 2 directories deep, while others may nest 5 directories deep. I am just not sure where the user may choose to upload files.

I would like to have all the file paths returned in one big list in the end, regardless of which folder the image resides in.

Lol, that would certainly work, and you are correct, I am on Mac. Sadly though, that would only be a one-off solution, and this is something I will need to run a couple of times a day, so I need the script to look through the live uploads directory to fetch the list of files. It can then go and add watermarks either to images that have never been watermarked, or only to newly uploaded files, which each user uploads daily to unknown subdirectories.

We use Amazon S3 and its immense API to do all types of wonderful things. May be worth investigating. The basics are already provided for in Wappler but there are many many features that are not.

Ahh, makes sense. This isn't even a DO Space, so I'm using Docker volumes to store uploads at the moment.

S3 is CHEAP too! We rarely go above €4 a month with tens of thousands of files hosted in hot and cold storage. Yay I made a good idea! :rofl:

I tried S3 a few years back, but it looked like only Einstein could understand such a complicated thing. Maybe I will take another look into it; I feel like I have gotten smarter over the last few months, so maybe I am ready for it.

Custom extension based on this?


const fs = require('fs');
const path = require('path');

function listFiles(dir, fileList = []) {
    const files = fs.readdirSync(dir);
    files.forEach(file => {
        const filePath = path.join(dir, file);
        if (fs.statSync(filePath).isDirectory()) {
            listFiles(filePath, fileList);
        } else {
            fileList.push(filePath);
        }
    });
    return fileList;
}

// Replace './your-directory' with the path to the directory you want to scan
const files = listFiles('./your-directory');

// Write the results to a text file
fs.writeFileSync('file-list.txt', files.join('\n'), 'utf-8');
console.log('File list saved to file-list.txt');

(From ChatGPT)

A couple of YouTube videos will springboard you into action, Paul. Permissions can be a little annoying to configure initially, but Wappler has you covered for the most part. Also, using the AWS CLI you could probably bulk import your current structure and files directly over to S3. We did this when we went from our dedicated servers over to AWS; it took five minutes to transfer GBs of files! Plenty of guides are available on that side, should you wish to undertake this method.

Thanks Brian, will give that a try and see how it goes.

Thanks Cheese, happy to hear Wappler takes over some of the process. That normally makes things a little easier.

Actually, playing about with it, ChatGPT will write you a script to recursively scan your folders and subfolders and write the results to a DB using Knex, but it may be easier, I think, to return the results as an array and parse that in Wappler.

Does the Wappler folder list step not contain subfolders?

It contains subfolders of the directory you are in, but if you then want to recursively check inside each folder and grab all the filenames, as well as follow the nested subfolders inside those subfolders, it becomes a bit tricky. I mean, I could do something like 4 levels of repeats, but I just figured there should be a way to do it with fewer repeats, or a while loop, and still be able to go even 10 nested subfolders deep if needed.

For the info of anyone following, I have developed a Node custom extension for this for Paul, which I will be releasing soon as an npm package.

Thank you Brian, what a massive help it is. I have now added it to 5 different projects in various ways, replacing multiple repeat structures that are no longer required with this in place.

I am just about finished with the file filtering in the directory-only search, but I think I have found a bug: if you remove the file type list, the removed values are still remembered and used.