Securing Public User-Uploaded Files

Hi.
I have a project with a NodeJS/Digital Ocean/Docker setup.
In my public user-uploads folder, I have a folder “downloads”, which contains files/zips that are generated on demand and can be downloaded by a logged-in user.
But the issue is that these files are not really secure.
Even if I use the server-side file download mechanism, the actual files are still exposed.
Anyone who can guess the path of said files can download them.
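
To illustrate the problem, this is roughly the shape of the setup (a simplified sketch, not my actual code; folder names are examples):

```js
// server.js - simplified sketch of the exposed setup
const express = require('express');
const app = express();

// Everything under "public" is served statically, including
// public/user-uploads/downloads. Any generated zip is therefore
// reachable at e.g. /user-uploads/downloads/report-123.zip by
// anyone who guesses or shares the URL, logged in or not.
app.use(express.static('public'));

app.listen(3000);
```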

I have handled this before using an .htaccess file, but it does not seem to be working here.
This is also the method suggested in a few old posts on the community.

I also tried creating a custom route, but the actual file path gets resolved before my custom route, hence the file gets downloaded. This is not really a solution anyway, since security restrictions, DB access etc. are not directly available in custom routes, but I was just trying to see if it works.

Can anyone please help me figure out how to secure these files from direct access?

Hi Sid

I solve this problem by uploading files to a folder located outside the public area. I described the protection and file-access scheme here, using images as an example: How to display a dynamic image in nodejs?

But it works with any files.
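
In plain Express terms, the idea is roughly this (a minimal sketch, not Wappler's actual implementation; the session check and folder names are assumptions):

```js
const express = require('express');
const path = require('path');
const app = express();

// "storage" lives in the project root, OUTSIDE public/,
// so express.static never exposes it directly.
const STORAGE_DIR = path.join(__dirname, 'storage', 'downloads');

app.get('/download/:name', (req, res) => {
  // Assumed auth check - replace with your real session/DB logic.
  if (!req.session || !req.session.userId) {
    return res.status(401).send('Not logged in');
  }
  // path.basename strips any "../" so users cannot escape the folder.
  const file = path.join(STORAGE_DIR, path.basename(req.params.name));
  res.download(file); // streams the file; responds 404 if it is missing
});

app.listen(3000);
```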

Our file names are generated randomly (never use anything like image1.jpg, image2.jpg etc.). For important or private information we add the files to password-protected .zip files (these can be brute-forced, but the passwords are long and random), and .htaccess host-only access is applied by default to all directories on the host (anti-leeching). PDFs are always secured with passwords prior to upload, then zipped and password-protected upon upload. For very sensitive information we simply don’t upload or store it on our ‘public’ environment; these items are stored in a secure repository off site and in alternative locations.
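
For the random-naming part, something like this is enough in Node (a sketch; keeping the original extension is just one possible convention):

```js
const crypto = require('crypto');
const path = require('path');

// 16 random bytes -> 32 hex characters: effectively unguessable,
// unlike sequential names such as image1.jpg, image2.jpg.
function randomFileName(originalName) {
  const ext = path.extname(originalName); // keep the real extension
  return crypto.randomBytes(16).toString('hex') + ext;
}

// e.g. randomFileName('report.pdf') -> '9f1c...e2ab.pdf'
```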

Hi @Mr.Rubi. If I upload the files outside the public area, each deployment deletes all such auto-generated files, and I need them to persist.
Knowing that, I tried configuring this via the File Download step… but it throws a 404 for files outside the public folder.
So on both fronts, this idea failed for me.

Interesting. Any ideas on how to do that with Docker/Digital Ocean?

In my case, it’s not very sensitive info, so there’s no point in the extra work of an off-site configuration like S3.

This is easily solved by adding the folder to the volumes list in the docker-compose file.
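
For example (a sketch; the service name and container paths are assumptions, so mirror whatever your generated file already uses):

```yaml
services:
  web:
    volumes:
      - ./public/user-uploads:/app/public/user-uploads  # existing public uploads
      - ./downloads:/app/downloads                      # new private, persistent folder
```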

How are you trying to access a file that is outside of the public folder? If the special server action step called “File Download” is used for access, there should be no problems.

Could you set up a second, private Docker container which only allows internal connections from the public Docker container? Theoretically, only connections coming from the public container could then access the private container. Easy on Amazon, as there are plenty of guides; not so sure about Digital Ocean though.
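
In docker-compose terms, the general idea could look something like this (a sketch; image and network names are made up):

```yaml
services:
  public-web:
    image: my-public-app     # hypothetical image
    ports:
      - "80:3000"            # published to the internet
    networks:
      - frontend
      - backend
  private-files:
    image: my-file-server    # hypothetical image
    networks:
      - backend              # nothing published to the host at all

networks:
  frontend:                  # reachable from outside via the published port
  backend:
    internal: true           # no external connectivity; only public-web can reach private-files
```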

Explained here:

Edit:
S3 storage is incredibly cheap @sid and very configurable, well integrated with Wappler, and has many advanced security options. It also removes the pain-in-the-arse (obviously a personal opinion) requirement to be within your project’s working directory to make use of Wappler file/image upload/manipulation actions. We store GBs of data on S3 for a couple of dollars a month.

I am still a novice with Docker. Can you please share some docs or a code sample?

No. It’s in the same domain.

That is exactly what I used. Maybe if I do the first step - adding to docker-compose - it would start working.

It all sounds easy theoretically. But not for me right now.

The data I need to store is only about 100-200 MB at most, so I’m trying to avoid a call outside the server.
But I will definitely keep this in mind for the future. Thanks. :slight_smile:

Then that would be eligible for, and well within, the free tier offering, Sid.

I’m sorry, Sid, that I’m answering a little late; it was a fun weekend, and I came to my senses only quite recently… :smile:

Wappler can solve this problem without touching the docker-compose file at all. If you open the project settings, you will see the “folders for saving files” field:

Note that the folder is located in the root of the project, and not inside the public folder:

After that, a volume binding record will be automatically added to the docker-compose file:
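Something along these lines (a sketch; the exact container path depends on your project):

```yaml
services:
  web:
    volumes:
      - ./downloads:/app/downloads  # binding added automatically by Wappler
```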

The need for manual intervention in docker-compose arises only if you have already created, and are using, one folder for download files but need another one. In that case, you will have to add the volume-binding entry to the docker-compose file yourself.

The only way to reliably protect files from public access is to place these files outside the public folder. Therefore, you need to create a folder for download files outside of the public folder.

Yes, your train of thought is correct. 404 is a “Not Found” error. Most likely it occurs because the folder with the files does not have a volume binding in docker-compose: after you deploy the project to your host, the files from this folder are not transferred there, so it is empty. For this reason a 404 is returned, because there are simply no files.

I made little sense of these entries in docker-compose before; it seems a bit clearer now with your explanation.
I do need the user-uploads folder to be a different one, inside public, so I can’t change that.
But adding another folder in volume in this way is something I will try out soon.
Thanks for taking the time to explain. :slight_smile:

Thanks @Mr.Rubi. Finally got to trying this and it worked.

I am writing out the exact steps and marking this post as the solution, just because it has a few more details. But the solution to the original problem is as suggested by Mr.Rubi.

A gist of the requirement: have a folder in your project root which you want to access internally, without exposing it to the public, and keep it persistent across deployments as well.

For Local Docker:

  1. Open the docker-compose.yml file in .wappler/targets/xxx/
  2. Locate the volumes section and add your folder, downloads in this case, as shown in the sketch after this list.
  3. You should now be able to read and write to the downloads folder in Server Actions, but there is no direct URL with which content inside this folder can be accessed.
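
A sketch of what step 2 looks like (the container path is an assumption; mirror your existing entries):

```yaml
services:
  web:
    volumes:
      - ./public/user-uploads:/app/public/user-uploads  # existing entry
      - ./downloads:/app/downloads                      # added in step 2
```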

For Remote Docker:

  1. The docker-compose.yml file location remains the same, but the structure looks a bit different.
  2. The volumes section in web would have just your user_uploads folder’s definition.
  3. Add the downloads folder there, as shown in the sketch after this list.
  4. You also need to declare the volume in a separate volumes section, as shown in the same sketch.
  5. Now, when you deploy, another volume by the name of downloads will get created, and you have the same access as described above.
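
A sketch of steps 3 and 4 (names and container paths are assumptions; mirror your existing user_uploads entries):

```yaml
services:
  web:
    volumes:
      - user_uploads:/app/public/user-uploads  # existing named-volume binding
      - downloads:/app/downloads               # step 3: bind the new volume

volumes:          # step 4: declare the named volumes themselves
  user_uploads:
  downloads:
```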

This solution works well for me so far. If anyone has a better solution, please do share.
