Self-Hosting NocoDB

If we download the NocoDB software and completely self-host the NocoDB server ourselves, either on our own server or with a cloud service provider, can we then

  1. Map and store the images in our own cloud bucket (like AWS S3 or Google Cloud Storage)?
  2. Store all data records in our own PostgreSQL instance hosted by us in the cloud? (I am sure this part is feasible.)

Thanks in advance for your response/suggestions.

Hope this helps…

  1. You can enable a storage plugin in the App Store, and all attachments will then be stored there.

  2. Data sources overview | NocoDB

Thanks, Rajanish. Appreciate your quick response.

Thanks @sairam.kuppusamy for posting your question here from the email.

To further clarify @rajanish's answer with context from your email: if you intend to map an existing file in S3/Azure to a row, that's not possible. However, you can easily download your files and upload them back to the rows after having set the storage credentials (search for S3-related plugins).
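As a rough illustration of the download half of that workflow, here is a minimal Python sketch. It assumes you already have each row's attachment metadata as a list of dicts with `title` and `url` keys; the exact shape depends on how you export attachment data from your NocoDB instance, so treat the field names as placeholders:

```python
import os
import urllib.request

def download_attachments(attachments, dest_dir):
    """Download each attachment ({'title': ..., 'url': ...}) into dest_dir.

    Returns the list of local file paths written, in input order.
    """
    os.makedirs(dest_dir, exist_ok=True)
    paths = []
    for att in attachments:
        # Keep only the basename so an odd title can't escape dest_dir.
        name = os.path.basename(att["title"])
        path = os.path.join(dest_dir, name)
        urllib.request.urlretrieve(att["url"], path)
        paths.append(path)
    return paths
```

Re-uploading into the S3/Azure-backed rows would then go through the NocoDB UI or API once the storage credentials are in place.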

Thanks Navi.

We are trying to overcome the limitation on storage size (25 GB), as our business deals with high-resolution CGI images/videos rendered from 3D files, so we quickly consume a lot of space…

If I understand this correctly, I will be able to do the following:

  1. Set up the environment variables (storage) to point to an S3/Azure bucket/folder. This setting will control where attachments are stored.
  2. Then manually upload images into attachment fields, which will get stored in the (S3/Azure) destination configured above.
  3. I need to do a one-time exercise of downloading all current attachments and re-uploading them via a script, automation, or manually. (As there is NO solution to map attachments to existing rows.)
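For step 1, if you prefer environment variables over the App Store UI, NocoDB's Docker image accepts S3 settings along these lines (variable names as documented in NocoDB's environment-variable reference at the time of writing; bucket, region, and key values are placeholders):

```shell
# Sketch: point NocoDB's attachment storage at your own S3 bucket.
# Replace the placeholder values with your own. For Google Cloud Storage,
# the storage plugin is instead configured from the App Store UI.
docker run -d --name nocodb \
  -p 8080:8080 \
  -e NC_S3_BUCKET_NAME="my-nocodb-attachments" \
  -e NC_S3_REGION="us-east-1" \
  -e NC_S3_ACCESS_KEY="<ACCESS_KEY_ID>" \
  -e NC_S3_ACCESS_SECRET="<SECRET_ACCESS_KEY>" \
  nocodb/nocodb:latest
```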

I could see your reply mentions only S3 and Azure. So is Google Cloud Storage supported?

Thanks a lot

It appears we support it. Did you try it?

Hi Navi, Thanks for your response.

I am still stuck on the following step:

  1. I have created a PostgreSQL instance/database (root_db) on Google Cloud Platform (Cloud SQL).
  2. Then I try to create a service using Google Cloud Run (using the Docker image); I also add the two environment variables NC_DB and NC_AUTH_JWT_SECRET and then create the service.

Step 2 gets the following error:
Error: Error: Configuration missing meta db connection at NcConfig.create (/usr/src/app/docker/main.js:2:1871941)

If there is any guidance/write-up on achieving the following, please share links. Thanks in advance.

We want the database to be our own (PostgreSQL on Google Cloud SQL).
We want the images/files to be stored in Google Cloud Storage.


It looks like the error occurs when the database name is missing from the connection parameters. Can you double-check the NC_DB value and, if possible, share a sample connection URL similar to yours with dummy values?
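For reference, NC_DB takes a connection URL of roughly the following shape, where the `d=` query parameter carries the database name; in my experience, leaving `d=` out is one way to hit the "missing meta db connection" error above (host, user, and password here are dummy values):

```shell
# Dummy values; the d= parameter (database name) is required.
NC_DB="pg://10.0.0.5:5432?u=postgres&p=secret&d=root_db"
```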