How to upload an image with a URL in the path

I have a file which I need to serve from a subdomain, e.g. images.thesite.com.
The upload itself is done under another domain, e.g. staff.thesite.com.
The website is running under the main domain, e.g. thesite.com.

The domain staff.thesite.com is locked down so it can only be accessed from a certain IP address, which is why I need to keep this separate.
However, when I add https://images.thesite.com/public/assets/images as the path in the file upload, it fails with an incorrect path.

Is there any way around this to get it to work?

Are these separate codebases?

Yes, they are separate Node.js sites.

The issue I have is that staff.thesite.com needs to be locked down to certain IP addresses only, because of the data it holds.

But I need the images showing on the public side of the website, which is why I came up with the idea of serving the images from a separate subdomain.
That way, if someone views the source, staff.thesite.com is not shown in the code but images.thesite.com is.

The industry-standard approach is to send the image to the images app through an API Action. I know Wappler API Actions and images don’t play that well together, but that’s another discussion.

A less recommended approach (but probably faster to achieve) is to create a symlink, so any files of the website under staff.thesite.com/public/assets/images are symlinked to images.thesite.com/public/assets/images.

Assuming you’re using a shared hosting control panel, a symlink can be created through its file manager or over SSH.
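For reference, this is roughly what the symlink route looks like over SSH. A minimal sketch only: the `link_images` helper is something I've made up for illustration, and the commented-out paths are hypothetical aaPanel-style web roots, not your actual directory layout.

```shell
# link_images SRC DST: create a symlink DST -> SRC, so files uploaded
# into SRC are also reachable under DST.
link_images() {
  src="$1"; dst="$2"
  mkdir -p "$(dirname "$dst")"   # make sure the parent directory exists
  ln -sfn "$src" "$dst"          # -n replaces an existing link in place
}

# Hypothetical web roots -- adjust to wherever each site actually lives:
# link_images "/www/wwwroot/staff.thesite.com/public/assets/images" \
#             "/www/wwwroot/images.thesite.com/public/assets/images"
```

One caveat if the sites run in Docker: a symlink only resolves inside a container if both the link and its target live on a host path that is bind-mounted into that container, so both sites would need to mount the shared images directory.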

No, not on shared hosting. We are running our own server: Ubuntu with Docker, and the control panel is aaPanel.

OK, then you have to resort to either passing the image through an API Action (unsupported by Wappler), using a service like S3, or even deploying your own S3-compatible storage with the MinIO Docker image.
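If you go the MinIO route, a minimal sketch of starting the official image looks like the following. The credentials, host ports, and data path here are placeholders, not a production setup; you would point your site (or a subdomain like images.thesite.com) at the API port behind your reverse proxy.

```shell
# Run a self-hosted S3-compatible store using the official MinIO image.
# Port 9000 is the S3 API, 9001 is the web console.
docker run -d --name minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=changeme \
  -e MINIO_ROOT_PASSWORD=change-me-too \
  -v /www/minio-data:/data \
  minio/minio server /data --console-address ":9001"
```

This is a deployment fragment rather than runnable sample code, so treat it as a starting point and harden the credentials and TLS before exposing it.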

Yeah, I figured it would not be easy. Now, where to start on the S3, lol. Cheers for your help, I’ll have to start looking into it.


We were in a similar position, Peter, and went with Amazon S3, hosting a huge number of images at very low cost. You can also get two birds stoned with one joint by implementing a handy little CloudFormation script to handle all resizing dynamically within the URL, something like https://gunnery.s3.eu-west-2.amazonaws.com/14824342/320x240/image1.png, or an alternative size as simply as https://gunnery.s3.eu-west-2.amazonaws.com/14824342/1920x1024/image1.png. You can simply store the thumb and large dimensions in your db and call them as required. S3 is CHEAP even for GBs of data, even TBs, as one of our clients currently has stored.
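As a tiny illustration of the URL scheme above: the resized-image URL is just the dimensions from your db interpolated into the path. The bucket host and asset id below are copied from the example URLs; the exact layout depends on how your resizing handler is deployed.

```shell
# Build resized-image URLs from dimensions stored in the database.
# Values are illustrative, taken from the example URLs above.
base="https://gunnery.s3.eu-west-2.amazonaws.com"
asset_id="14824342"
thumb_size="320x240"     # "thumb" dimensions stored in your db
large_size="1920x1024"   # "large" dimensions stored in your db

thumb_url="${base}/${asset_id}/${thumb_size}/image1.png"
large_url="${base}/${asset_id}/${large_size}/image1.png"
echo "$thumb_url"
echo "$large_url"
```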

S3 is so simple to set up, but you will need a valid card to register for an account. Several GBs should still see you eligible for the free tier too! More space is cheap as chips, mate. Wappler’s integration with S3 is probably one of the easiest things to do API-wise: literally add a key, create a bucket, and point to it.

Here is the CloudFormation script we use for the dynamic resizing:

https://docs.aws.amazon.com/solutions/latest/serverless-image-handler/template.html

And here is an easy-to-follow video on how to set it all up:

Makes life so easy, taking care of all those image sizes, mate! Want a new resolution? Simply include it in the path within the URL! Really is that easy, Sir! Tiny cost to the client but a massive headache solved (opt for Virginia for the cheapest rate, although it’s only cents’ difference from other regions for S3)...



But how will that work with images that are uploaded with the File Upload in Wappler? Can it handle file name changes?
I’ve set up a bucket, but I can’t see where to access it from in Wappler; either that or I’m having a dumb moment.