All the questions about docker you wanted to ask but felt stupid asking

I did warn you yesterday @George, so here they are!

Let’s call it “All the questions about docker you wanted to ask but felt stupid asking”, or let’s just call this “the idiot’s guide to docker deployment”.

I can’t find any clear guides on this. There are vague references to these concepts but no clear guide. Perhaps some clarity may help lift the suspicion and trust issues some have with docker.

So there are lots of discussions about the benefits of docker but some issues seem to lack a good quality explanation of what happens

So we start with a basic site and do an initial upload to a server. (I use Digital Ocean)

So all the appropriate images are uploaded, along with the database and its contents from your local container

That’s the simple bit

Now you update the code and re-deploy to your server

So what happens now…

Let’s start with your images

It’s a CMS, your live site has lots of images uploaded by users

My belief is that they are overwritten by the local copy?

So the solution is to use a mapped drive so the images are not overwritten?

Is there an alternative? (it’s another $10 per month)

If so, how?

Is there any tutorial on this?

How do we deal with the new paths in components like image upload?

How do I set this up?

Assuming the developer has kept the database structures in sync using the database manager, the structures of your development and production databases will be identical

You have lots of new database entries in your live site from users uploading information.

So does the deploy overwrite this with the local database?

If so, how do you preserve your production data and prevent it from being overwritten?

I could probably play with this using test cases and work this out but I am sure someone can answer this easily and the answers will be of great benefit to the community (and me!)

I will be honest, I am not sure I will ever trust docker 100%. I am too much of a control freak to leave these things to chance. I guess I will be obsessive about site/data backups before every docker deploy; FTP is an old technology, I get that, but with FTP I could control every single element of what was uploaded and know that the database entries would be untouched. Docker, not so confident.
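For what it’s worth, the pre-deploy backup routine described here can be scripted with plain docker commands. This is only a sketch under assumptions not stated in the thread: a MySQL database container named `db`, a site container named `mysite`, and uploads living at `/app/public/assets/uploads` — substitute your own names and paths.

```shell
# Hedged sketch of a pre-deploy backup. All container names and paths
# below are hypothetical examples, not taken from this thread.
STAMP=$(date +%Y%m%d-%H%M%S)

# Dump the database from the (assumed) MySQL container to a
# timestamped file on the host.
docker exec db sh -c 'exec mysqldump -uroot -p"$MYSQL_ROOT_PASSWORD" mydb' \
  > "backup-db-$STAMP.sql"

# Archive the uploads folder out of the running site container,
# streaming the tarball to the host.
docker exec mysite tar -czf - /app/public/assets/uploads \
  > "backup-uploads-$STAMP.tar.gz"
```

Running something like this before every deploy gives you the same "I can always roll back" safety net FTP workflows relied on.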

Sometimes I totally screw up a page (yes even I do that :wink:)

At present, if really bad, I would simply download the previous version from the server and start again

How would I do something like that with docker deployment, or do I also have to use Git to get older versions?


Really, not even a comment?

I have a lot… just putting them in order :wink:

I suppose that questions, to be effective, must be very short and on a single point.

My first question:

How do I map a folder outside docker in a droplet to avoid losing files at each deployment?

I was looking for answers, not more questions :wink:


Hi Brian,
I thought the answers to all these questions can be found in:

I’ll try to answer them separately, referring to these docs.

You just define a user uploads folder in the project options. It will be used by docker to create user upload folders that remain persistent.
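Under the hood this is the standard Docker persistent-volume pattern: a named volume mounted over the uploads path lives outside the container, so redeploying the image never touches it. A minimal sketch — the volume, image, and path names below are made up for illustration and are not the actual config Wappler generates:

```shell
# Create a named volume; its data is stored outside any container.
docker volume create site_uploads

# Run the (hypothetical) site image with the volume mounted over the
# uploads folder.
docker run -d --name mysite \
  -v site_uploads:/app/public/assets/uploads \
  mysite:latest

# "Redeploy": remove the old container and start a new one from the
# new image. The volume, and every user upload in it, is reattached
# untouched.
docker rm -f mysite
docker run -d --name mysite \
  -v site_uploads:/app/public/assets/uploads \
  mysite:v2
```

The key point: containers are disposable, volumes are not — which is exactly why setting the uploads folder in the project options protects user images across deploys.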

Deploying only deploys the files. If you want to apply changes to your db schema, use the database manager > apply changes …
George explains it here:

Not sure if this fully answers your questions, but I believe all the answers are in the links I sent.
Also deploying with docker is much easier/faster than dealing with FTP and custom hosting providers :slight_smile:

That is exactly what I wanted @Teodor, it pulls so much together in a single reference.
I think the problem was that the information is scattered in small snippets across several posts, and so much has changed since some of those original posts.
From the feedback I have been getting, I am not alone in needing this definitive guide

I did re-watch the gathering about a week ago but it didn’t really cover the docker issues I was concerned about. Data loss is my greatest fear. I am comfortable with the database manager and growing to like it; my concern was whether docker would also interfere with online data
So to summarise

  1. If I set a user uploads folder in the project settings, that folder’s contents will remain persistent on the server through a deploy
  2. Hitting deploy only changes the files; it does not change the database structure or contents
  3. Database structure is changed via the database manager, not by a docker deploy. Docker does not change your data or structure in any way when deploying
  4. A separate mapped drive is not necessary to preserve images on the server from being overwritten if the method at (1) is used

If so then docker is a tool I can use with more confidence, not trepidation
Perhaps my search for a new host can end, D.O. may be my salvation after all :partying_face:


One issue @Teodor, if I may

The site I am working on is a node makeover of an old site so there are many images to bring over from the old site.

So I have image uploads now being uploaded to /public/assets/uploads/… which is the path I will set in the image uploads setting

I need to do a one-off initial upload to that folder before setting it as the persistent uploads folder in the project settings…

I tried setting the uploads folder to a dummy folder (the uploads folder can’t actually be deleted through the UI) and this uploads the images as required.

However, if I then set the project settings to the /assets/uploads folder, the images are removed.
I am guessing docker deleted the persistent volume created when pointed to the “dummy” folder, and when I re-deploy it creates a new empty volume and maps this, so the previous images are not seen

How do I do a one-off population of the /assets/uploads folder to a persistent volume?
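One common way to seed a named volume with existing files is to mount both the volume and a host folder into a throwaway container and copy across. A sketch, assuming a volume called `site_uploads` and a local folder of migrated images — both names are hypothetical, so check your actual volume name with `docker volume ls` first:

```shell
# Mount the named volume at /dest and the host folder (read-only) at
# /src in a temporary alpine container, then copy everything across.
# The --rm flag removes the helper container when the copy finishes;
# the volume keeps the files.
docker run --rm \
  -v site_uploads:/dest \
  -v "$(pwd)/old-site-images:/src:ro" \
  alpine sh -c 'cp -a /src/. /dest/'

# Alternatively, copy into a running container whose uploads path is
# backed by the volume:
#   docker cp ./old-site-images/. mysite:/app/public/assets/uploads/
```

Either route writes the files into the volume itself, so they survive subsequent deploys the same way fresh user uploads do.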

@Hyperbytes I recommend you store your user uploads on S3 (as public), then store the file URL in the database.

I described the process here.
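The S3 approach above can be sketched with the AWS CLI — upload the file publicly, then keep only the resulting URL in your database. The bucket and key names here are invented for illustration, and the `--acl public-read` flag assumes your bucket allows public ACLs:

```shell
# Upload one file to a (hypothetical) bucket and make it publicly
# readable. Requires configured AWS credentials.
aws s3 cp ./uploads/photo.jpg s3://my-site-uploads/photo.jpg \
  --acl public-read

# The value to store in the database would then be the object URL,
# something like:
#   https://my-site-uploads.s3.amazonaws.com/photo.jpg
```

This sidesteps the volume question entirely: the images never live inside the container, so a deploy cannot touch them.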

Sorted via a file copy API action