[Self hosted] Using existing PostgreSQL database as NocoDB primary database (NC_DB)

Hi NocoDB community,

I’m evaluating NocoDB for my company data management needs and have a specific question about using PostgreSQL as the primary database via the NC_DB environment variable.

Question:
Instead of connecting as an external data source (which only sees one schema), I want to use my existing PostgreSQL as NocoDB’s primary database using:

NC_DB="pg://myhost:5432?u=username&p=password&d=existing_database"
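For context, this is how I would pass it to a self-hosted Docker deployment (a sketch; host, credentials, and database name are placeholders):

```shell
# Start NocoDB with an existing PostgreSQL database as its primary (metadata) store.
# The NC_DB value below uses placeholder credentials.
docker run -d --name nocodb \
  -p 8080:8080 \
  -e NC_DB="pg://myhost:5432?u=username&p=password&d=existing_database" \
  nocodb/nocodb:latest
```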

What I need to understand:

  1. Metadata storage: When NocoDB uses PostgreSQL as primary DB, what schema does it create for its metadata? Will it be something like nocodb_meta or similar?
  2. Schema visibility: Will NocoDB automatically discover and make available ALL existing schemas in the database, or only the default schema? What will the default schema be?
  3. Cross-schema relationships: Will NocoDB properly detect and maintain foreign key relationships that span across different schemas in one database?
  4. Migration considerations: Any gotchas when pointing NocoDB to a production database with existing data and schemas?

Any insights would be greatly appreciated! I have not found any relevant information in the official documentation.

Thanks!

In short, the above is neither the intended use nor possible in NocoDB.

By default, NocoDB will use the public schema for NC_DB if a schema is not specified. It won’t discover all the schemas by default, so the rest of your questions are moot given that the first two are not possible.

It is not clear from the documentation what is and isn’t possible regarding DB management.

How do I specify the schema during initialization? By adding an additional argument to NC_DB?

You can use NC_DB_JSON_FILE instead and specify your connection as below. Also, please note that 'my_schema' should be empty for the metadata to be populated.

{
  "client": "pg",
  "connection": {
    "host": "localhost",
    "port": "5432",
    "user": "nocodb",
    "password": "nocodb00",
    "database": "mydatabase",
    "schema": "my_schema"
  },
  "searchPath": ["my_schema"]
}
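In a Docker deployment, this JSON file has to be made available inside the container and referenced by path (a sketch; the host path and mount point are placeholders):

```shell
# Mount the connection config into the container and point NocoDB at it
# via NC_DB_JSON_FILE. Paths here are illustrative.
docker run -d --name nocodb \
  -p 8080:8080 \
  -v /path/on/host/nc_db.json:/usr/app/data/nc_db.json \
  -e NC_DB_JSON_FILE="/usr/app/data/nc_db.json" \
  nocodb/nocodb:latest
```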

And I gather from your question that you may be thinking of NocoDB as a pgAdmin-like tool that will discover all schemas in Postgres. It is not. We create a virtual NocoDB schema from the actual external database, and if you run any DDL on your external Postgres, the metadata has to be synced.

@Navi, thanks for the information!

So is it a production-viable approach to use an existing PG DB to host the NocoDB schema? Is anybody using it this way, as far as you are aware?

I have already experimented with an external data source connection to our PG (having deployed NocoDB with MySQL for metadata storage), but it does not work for our use case, because it does not read the cross-schema relationships we have in our DB (I guess you treat them as cross-database joins, since each external connection to a specific schema on the same DB is treated as a separate DB connection).

But we do have many cross-schema relationships in our PG DB, so we need to read them somehow. The other questions I raised are not as important as the ability to read cross-schema relationships.

My assumption is that if we set NC_DB to our existing DB, we would get this capability (since there would be only one database connection).

I had the same problems with NocoDB.

  1. I solved the problem of multiple schemas with views: I created a separate schema with views like SELECT * FROM original_schema.table, then connected NocoDB to that schema instead of the underlying tables. Note that simple views like these are updatable in PostgreSQL.
  2. I don’t have foreign keys from the very beginning, since I use table inheritance, so all my keys are logical only. I managed to generate all the necessary records in NocoDB’s tables to describe the relationships, and now it (more or less) works.
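The view-based workaround from step 1 can be sketched as follows (the facade schema and table names are hypothetical; run it against your own database):

```shell
# Flatten tables from several schemas into one facade schema of simple views,
# then point NocoDB at the facade schema. All names here are illustrative.
psql "$DATABASE_URL" <<'SQL'
CREATE SCHEMA IF NOT EXISTS nc_facade;

-- Simple single-table SELECT * views are automatically updatable in PostgreSQL,
-- so NocoDB can still insert, update, and delete rows through them.
CREATE OR REPLACE VIEW nc_facade.customers AS
  SELECT * FROM crm.customers;

CREATE OR REPLACE VIEW nc_facade.orders AS
  SELECT * FROM sales.orders;
SQL
```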

IMO it’s not how it should work, but I haven’t found anything better yet. For instance, Mathesar supports schemas but still requires physical foreign keys. So, necessity is the mother of invention.

I believe that multiple-schema support is a valid use case, and patches to implement it should be welcome. But for now I don’t have enough time to delve into this.