Let’s say I have a number of images, e.g. www.example.com/avatarimage_1.jpg. I want to download these images, store them in my own S3 bucket, and then save my own path to each one in the DB.
Is there anything native to Wappler that would allow me to download the image from a URL like the above, so that I can then upload it to my S3 storage?
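For anyone wanting to see the flow outside Wappler, here is a minimal Python sketch of the idea: fetch the image from its URL, stream it into S3, and keep the object key for the DB. The bucket name, the `avatars` key prefix, and the helper names are my own assumptions for illustration; it assumes AWS credentials are already configured for boto3.

```python
from urllib.parse import urlparse
from urllib.request import urlopen
from posixpath import basename

def s3_key_for(url: str, prefix: str = "avatars") -> str:
    """Derive an S3 object key from the filename in the image URL."""
    name = basename(urlparse(url).path)
    return f"{prefix}/{name}"

def mirror_image(url: str, bucket: str) -> str:
    """Download the image and stream it straight into S3.

    Returns the object key, which is what you would store in the DB.
    """
    import boto3  # assumption: AWS credentials configured in the environment
    key = s3_key_for(url)
    s3 = boto3.client("s3")
    with urlopen(url) as resp:
        # upload_fileobj streams the HTTP response body without
        # buffering the whole image in memory
        s3.upload_fileobj(resp, bucket, key)
    return key
```

For example, `mirror_image("https://www.example.com/avatarimage_1.jpg", "my-bucket")` would store the object under `avatars/avatarimage_1.jpg` and return that key.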
You can drop all the images into the Bucket in a few seconds that way. We did 15GB like this in around a minute a few days ago; no drops, very fast.
Otherwise you could create a Server Action that copies all the images on the host to a directory, then add an S3 upload repeat to push them to your Buckets. It shouldn’t be too difficult as long as everything is on the same server and within the Project’s directory; you can’t traverse out of the Project’s parent directory using the SC file actions, and that would be the only thing blocking this approach. If you’re using PHP you may also have to raise your timeout and memory allocation by adjusting your PHP variables.
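The directory-plus-repeat approach above can be sketched like this in Python: list the image files in a folder (the repeat step), then upload each to S3. The extension list, folder layout, and `uploads` prefix are assumptions of mine, not anything Wappler-specific; the boto3 call again assumes credentials are configured.

```python
from pathlib import Path

# assumed set of extensions worth uploading
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def image_files(folder: str) -> list:
    """List image files in a folder, mirroring the Server Action's repeat step."""
    return sorted(p for p in Path(folder).iterdir()
                  if p.is_file() and p.suffix.lower() in IMAGE_EXTS)

def push_folder_to_s3(folder: str, bucket: str, prefix: str = "uploads") -> int:
    """Upload every image in the folder to S3; returns how many were pushed."""
    import boto3  # assumption: AWS credentials configured in the environment
    s3 = boto3.client("s3")
    count = 0
    for path in image_files(folder):
        s3.upload_file(str(path), bucket, f"{prefix}/{path.name}")
        count += 1
    return count
```

Keeping the listing separate from the upload loop also makes it easy to dry-run the repeat before touching the Bucket.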
A quick search turns up a tool called ParseHub which would allow you to scrape those images… Not sure if it is ideal; maybe there is a SC way of doing it. Worth asking @George
I’ve not used it myself, as we wrote our own scraper for similar purposes for property images (a long-gone project). We pretty much scraped all of Booking.com Spain and Portugal a few years back.