Advice on moving pictures to a host for CDN

Hi Wappler nation. Just asking a bunch of Qs - please answer what you can. Thanks!

NOTE: Our app, while accessible worldwide, is mainly used in the southeastern United States. It will be hit heavily during a one-week period in October. Continued use may occur afterward, but nothing like during the event, when we want to provide speedy access to pictures. That’s why I’m thinking of a CDN. We will have between 2,500 and 3,500 images that will be viewed repeatedly.

Currently, our WordPress site allows users to upload car images (pics) as part of a form entry.
Those pics are stored on my host (1and1) in the normal WordPress “uploads” structure.
Shared hosting like this crams lots of websites onto the same servers, so I’m at the mercy of their bandwidth and CPU. That’s normal.
But I’d like to explore moving my pics into a more formal CDN, using Fastly or Imgix, for example.
(The main reason is dynamic sizing, so I’m not managing multiple thumbnails.)

From what I read, Fastly/Imgix sits “between” my raw pics and the end user.

  1. Does this mean that my “raw” pic has been copied (duplicated) from 1and1 and now resides on Fastly? If so, then I SHOULD NOT be worried that the server might be under heavy load (i.e. slow), as that won’t affect content delivery to an end user, right? So pics could stay on 1and1.

  2. Since I’m now using Digital Ocean for my live Wappler site, should I move all my pics to DO?
    If I’m going to use Imgix, do I even need to do this?

  3. What is this service on DO called? I see I can add more block storage to increase my droplet storage. But where/how do I copy a bunch of pictures to DO? I have storage containers in Azure but don’t see a similar thing in DO. (again, I may not need to do this based on the CDN)

  4. I’ve always heard that AWS was more expensive than most other options. Yet people here talk about S3. What’s the benefit of my pics being in an S3 bucket?

Hi! I’m using Digital Ocean Spaces. It’s cheap and easy to use with Wappler’s S3 connector. It has a CDN as well.

Thanks for replying.
With respect to pictures, does it have scaling and cropping functionality like Imgix?
Can you point me to some documentation on this? When I search for images on Digital Ocean, I get info on “disk images”, not picture images.

It’s just to store files such as images and deliver them; no cropping etc.

The service is called “Spaces” in Digital Ocean
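Spaces is S3-compatible, which is why Wappler’s S3 connector works with it. Purely as an illustration, here is a minimal boto3 sketch for pushing a pic into a Space; the region, keys, and names below are placeholders, not anything from this thread:

```python
# Minimal sketch: uploading a local image to DigitalOcean Spaces,
# which speaks the S3 protocol. Region, Space name, keys and paths
# are placeholders -- substitute your own.
import boto3

session = boto3.session.Session()
client = session.client(
    "s3",
    region_name="nyc3",                                  # your Spaces region
    endpoint_url="https://nyc3.digitaloceanspaces.com",  # region-specific endpoint
    aws_access_key_id="SPACES_KEY",                      # from the DO API page
    aws_secret_access_key="SPACES_SECRET",
)

client.upload_file(
    "uploads/car-123.jpg",   # local file
    "my-space",              # Space (bucket) name
    "cars/car-123.jpg",      # key inside the Space
    ExtraArgs={"ACL": "public-read", "ContentType": "image/jpeg"},
)
```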

AWS storage is cheap as chips, unlike a lot of their other services, which can really mount up. We moved tens of thousands of images from one server to AWS using the AWS CLI: we simply pointed it at the images we wanted to move (the full contents of the directory in this case), pointed it at the AWS bucket we had set up, and hit enter. It took a few minutes to transfer the entire directory contents. A simpler option is to do it locally (example link on how to do this here, but there are many guides available on this subject): download the images to your local machine first, which avoids issues with various hosts’ permissions/security/firewall rules/policies etc. We then used a simple CloudFormation script on AWS to carry out the re-sizing of images on the fly; video here, which will guide you through, and explain, the process and what is going on…
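For anyone scripting the transfer rather than using the CLI (`aws s3 sync ./uploads s3://your-bucket/` is the one-liner being described), a rough boto3 equivalent would look something like this; the directory and bucket names are placeholders:

```python
# Rough Python equivalent of the "aws s3 sync" step described above.
# Walks a local directory and uploads every file to the bucket,
# preserving the relative paths as keys.
import mimetypes
from pathlib import Path

import boto3

s3 = boto3.client("s3")
BUCKET = "my-image-bucket"            # placeholder bucket name
SOURCE = Path("wp-content/uploads")   # placeholder source directory

for path in SOURCE.rglob("*"):
    if not path.is_file():
        continue
    key = path.relative_to(SOURCE).as_posix()
    content_type = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
    s3.upload_file(str(path), BUCKET, key, ExtraArgs={"ContentType": content_type})
    print(f"uploaded {key}")
```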

The advantages of this approach: storage is very cheap, there’s no need for complex re-sizing operations, and you maintain the integrity of the original image. The script is free to use, and the processing doesn’t cost a penny.

Thx
I just read the Serverless Image Handler guide.
Have not watched that video yet.
Are you saying that you created all the necessary thumbs that you require, rather than using the “fit-in” option on the image URL?

I might have only 2,500 images, but those are gonna be accessed 5,000-10,000 times each during my event. It definitely seems like pre-making the thumbnails is better than sizing them on demand over and over.

In either case, this appears to make Imgix/Fastly unnecessary, as you said.

It’s as simple as specifying the bucket URL and a size in the path, e.g. /1024x768/, or whatever size you want. You can store these values in the DB for thumbs etc. 5,000-10,000 requests a day is not that much in the grand scheme of things where AWS is concerned.
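As a sketch of what building those URLs can look like, assuming the Thumbor-style paths the Serverless Image Handler accepts (the CloudFront domain, key, and sizes below are placeholders, e.g. values pulled from the DB):

```python
# Sketch of composing a resize URL in the style described above.
# The Serverless Image Handler accepts Thumbor-style paths such as
# /fit-in/1024x768/<key>; the domain and key here are placeholders.

CDN_DOMAIN = "https://dxxxxxxxxxxxx.cloudfront.net"  # your distribution

def image_url(key: str, width: int, height: int) -> str:
    """Return a fit-in resize URL for an image stored in the bucket."""
    return f"{CDN_DOMAIN}/fit-in/{width}x{height}/{key}"

# e.g. a thumbnail size stored in the DB alongside the image record
print(image_url("cars/car-123.jpg", 1024, 768))
# -> https://dxxxxxxxxxxxx.cloudfront.net/fit-in/1024x768/cars/car-123.jpg
```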

We don’t manipulate any images. We use it for property inventory images; the originals need to be stored as they can be called upon in a legal sense if there are disputes, and besides, we didn’t want to be creating and storing tens of thousands of thumbnails every time images are uploaded…

Just an idea that I thought may help you out.

“We then used a simple CloudFormation script on AWS to carry out the re-sizing of images on the fly”

Ahh, I got a broken link when I went there to view the “template”, so I didn’t fully understand.
And yes, I’m not keen on making thumbnails, so what you describe is fine with me: requesting a size on the image URL, similar to what you would specify with those other options.

I’ll check this out today.

The video goes with the template and explains how to set everything up. It’s not that difficult; on a scale of 1 (easy) to 10 (difficult), it is probably a 1-2 :slight_smile:

To grab the template, simply click on the View Template button. As I mentioned, the video explains everything. It really is a nice, seamless solution. We store any resolutions we want to use along with the image data in the database, and we usually store the S3 bucket URL in the database as well. Keeps everything dynamic.

Personally I like Gumlet.
Simple setup, works really well, and serves scaled, optimized images in WebP, a good format for SEO.

I use Digital Ocean with Docker. I don’t use Spaces most of the time; I just have an upload volume and make the container size large enough to hold all the images.
An upload server action can crop and resize to whatever size you want, generally the biggest size.
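Wappler’s upload step handles that resize in its UI; purely as an illustration of the same idea in plain Python, here is a Pillow sketch that shrinks an uploaded image to a maximum size before storing it on the upload volume (the paths and sizes are placeholders, not anything Wappler-specific):

```python
# Illustration: resize an uploaded image down to a maximum size
# before saving it, so only one "biggest" version is ever stored.
# Paths and sizes are placeholders.
from pathlib import Path

from PIL import Image

MAX_SIZE = (1920, 1920)          # biggest size you ever serve
UPLOADS = Path("/app/uploads")   # the Docker upload volume

def save_resized(tmp_path: str, filename: str) -> Path:
    """Shrink the image (preserving aspect ratio) and store it."""
    dest = UPLOADS / filename
    with Image.open(tmp_path) as img:
        img.thumbnail(MAX_SIZE)  # no-op if already smaller
        img.save(dest, quality=85)
    return dest
```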

Gumlet’s free version is often enough to suit my needs, because it serves images scaled to the browser width. Even if you have 10 MB of images on the page for 5K retina and a person with a 13-inch screen views your site, Gumlet will still serve them 1 MB of images at full quality.

The way it works is that it downloads all your original images from your server, then serves multiple sized, cached versions through its CDN. So your bandwidth usage is cut by up to 80%.

It depends on the size of your images. Personally, we work with really, really large images (we are a Photoshop company, so image editing).

The approach I’ve taken with storage can be bloody expensive when using large file formats (and lots of them), so I’m moving toward Wasabi ($5 a terabyte), which is S3-compatible, and then fronting it with a volume CDN via Bunny (not super low latency, but good for large files and much, much cheaper).