Can you only "update/upload" one or two files in a container or do you have to re-deploy the complete project?

Docker / Digital Ocean

I have a Node project… say about 600 files in total on the complete project. Now if I make, say, 3 changes in Server Connect, can I only upload those files to the server, or do you have to re-deploy the complete project? Or will Docker know which files are new, e.g. compare and only upload the changed ones?

Also… I have an uploads folder on the server, with content already loaded in it. When I click on deploy, will it then overwrite that complete folder, thus losing the already uploaded files?

Hi @Mozzi

You highlight exactly why I stopped using Docker:

  1. It’s an all-or-nothing deploy.
  2. To manage persistent images on the production target it is pretty much essential to use a separate store and S3 uploads (such as Spaces).
  3. I also dislike not being able to easily download files from the remote back to development should I wish to. I like to keep the development and production targets in sync where possible, and I also like the ability to revert from the production target (yes, I know you can use Git, but that’s just another layer added, and reliant on remembering to commit every change).

That’s not really the case.
You just define a user uploads folder in the project options. Docker will use it to create upload folders that remain persistent.
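Under the hood, that persistence comes from Docker volumes. Here is a rough sketch of the mechanism (the volume, container, and image names are made up for illustration, not what Wappler actually generates):

```shell
# Sketch: a named volume survives container re-deploys.
# "uploads", "site" and "my-node-image" are hypothetical names.
docker volume create uploads                 # created once, lives outside any container
docker run -d --name site -v uploads:/app/public/uploads my-node-image

# ...later, re-deploy with a fresh image:
docker rm -f site
docker run -d --name site -v uploads:/app/public/uploads my-node-image:v2
# Files previously written to /app/public/uploads are still there,
# because the volume is only removed by an explicit "docker volume rm".
```

Anything written outside the volume path lives in the container's writable layer and is lost on re-deploy, which is why the uploads folder has to be declared in the project options.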

As far as I understand it, you have to re-deploy the complete project.

I would have thought most people used to FTP would find this quite a disadvantage. I would be interested to know if anyone has suggestions or is familiar with solutions such as docker-filezilla.

You can define an uploads folder, yes, but the issue of syncing with development remains. Also, to be honest, relying on a droplet’s storage as the sole repository for site downloads/images leaves me very uncomfortable.

My scenario is…

I have one “base/master” project that serves 4 different clients (each of these clients has their own stylesheets, logos and uploads). With PHP I just FTP’d to each space and uploaded the updated pages, as I don’t want to deploy 600+ files every time I make a small change! That is insane…

And the pages get updated quite often, and updates only get pushed to the other clients once the first site is up and running.

@Hyperbytes… how did you go about this? Did you use FTP to the server?
Or did you not use DigitalOcean? Did you use a dedicated server with Node installed?

Thank you

I use a dedicated VPS running Node and FTP. I am an “if it works, don’t fix it” type of guy.

I’m sure I saw @psweb post an article on how to set this up; it was quite detailed.
Just have to look for it, as it might be the way to go and not use DO, as this seems to be the solution for my scenario. If my project was different then DO would work wonders. At least I then have control over the FTP of the files. Thanks for sharing your solution @Hyperbytes

Paul’s tutorial was written using my server. He helped me with the setup!

Where is this server hosted? London?

As it would be interesting to see the load speed and compare it to a local provider (South Africa), e.g. Afrihost. @psweb gave me a good explanation about it. Maybe, if possible, we can load a file on @psweb’s server and a file on your server @Hyperbytes, and see the speed here in South Africa and compare them.

PS! I love how simple the DO container is and how easy it is to deploy, but I do need the flexibility to only FTP certain files to a server.

I am happy to assist. I do seem to recall Paul said my UK cloud-based servers were faster than his VPS, but things may have changed.

That is awesome 🙂 thanks @Hyperbytes. Yes, as @psweb Paul told me the same 🙂 lol… local servers not as fast as international, that is crazy. Welcome to Africa 🙂

Just looking for the post he made. If you can find it before me, then please post the link here and I’ll do the same.

Oohh… I see you have loads of experience with this topic.

Interesting… thanks, that answers all of my questions.

I’m still using 20i. They offer both managed and unmanaged VPS. The post was written after trying their managed offering, which they couldn’t get to work with Wappler. Their unmanaged servers, however, work perfectly when set up via the process detailed in the post.

After researching alternatives I realised their costs were actually very good.

@Hyperbytes… just another interesting article. I’ll give this a go; maybe this will help.

It does beg the question: if the solution is to bypass Docker and use FTP, then why use Docker in the first place?

I do understand Docker’s use if it’s a small site: press the “deploy” button and it’s done, so it’s super cool and fast.

But I do like the fact that I can control a “bigger” site’s content. Imagine you have a 50 MB site, 2000+ files (PDFs, images, etc.), and then you have to deploy a fresh copy all the time. That does not make sense to me if I only need to push 1 or 2 file updates.

So here is my two cents’ worth, from a person who actually did not want to go the Docker route originally.

I used to use shared hosting on Afrihost, and for local development I had MAMP installed and running.
Scenario 1: a client asks me to take over a site called apple.com, as a silly example.
I set up a local environment in MAMP that the client cannot see, so I cannot show them anything unless I set up a remote staging domain, which cannot be called apple.com. I always found this part quite a pain: a local environment, a fake remote for the staging server, then the real server when the site was actually replaced.

Scenario 2: have your own private/dedicated server, where you can make as many fake domain accounts as you like, regardless of whether you own the domain or not.
Well, it was easier as it saved me having to do all the MAMP stuff, but it was still a pain showing the client: I either had to show them via the server IP and a strange URL, or ask them to edit their hosts file to add a dummy record.

New scenario: use Docker. It saves me the entire MAMP process and the entire VPS or dedicated server setup; in fact I do not even bother with domain names at all till the very end. So I use Wappler, set up a container, and in about 5 minutes I have a full local and remote development environment and can give my client an IP address where they can see it too.
At the very end of the process, when all is approved, I can alter DNS records and point the correct domain name to the IP address of my container, which also means I never needed access to their existing cPanel, I never had to alter their server files, etc. So if something goes terribly wrong, one DNS record change gets them back to where they were instantly.

I can still FTP or SSH into the Docker container to make single-file changes if needed, and Docker volumes work great for user uploads.
Honestly, the only thing I miss, specifically with NodeJS rather than Docker as such, is the inbuilt mail server. All the other parts of the process I find better at this stage.
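On the single-file point: assuming you can reach the Docker host over SSH, `docker cp` can push one file into a running container without a re-deploy. The container name and paths below are hypothetical; adjust for your setup. Note that files copied this way live in the container’s writable layer and will be lost on the next re-deploy unless the path sits on a volume:

```shell
# Hypothetical container name "site" and app path /app.
# Copy one changed file from the host into the running container:
docker cp ./views/index.ejs site:/app/views/index.ejs

# If the app doesn't watch files for changes, restart it to pick the file up:
docker restart site
```

So it works as a quick patch, but the deploy remains the source of truth: the next full deploy will replace anything patched this way.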