Trouble connecting to gcloud postgres instance

This seems really basic. I whitelisted NocoDB’s IP address in my gcloud instance, and I’m providing my DB’s IP address and other connection info. I have successfully connected this way with other clients, but when I try it in NocoDB it times out and shows “Network Error”. Am I missing something?


From what I remember, GCP Cloud SQL needs an additional socket path. We use knex.js underneath to connect to SQL databases. If you can figure out the knex.js config and get it to work, the same config can be added under ‘Edit connection json’.
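For reference, a knex.js-style connection object for Cloud SQL over its Unix socket generally looks like the sketch below; all the values, including the `<project>:<region>:<instance>` connection name, are placeholders, not details from this thread:

```json
{
  "client": "pg",
  "connection": {
    "user": "<user>",
    "password": "<password>",
    "database": "<dbname>",
    "host": "/cloudsql/<project>:<region>:<instance>"
  }
}
```

With the `pg` driver, pointing `host` at a `/cloudsql/...` socket directory makes it connect over the Unix socket instead of TCP, which is how the Cloud SQL proxy exposes instances.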

Thanks for getting back to me.

I’m not using js in my project, but in GCP I have a public IP enabled, and I’m able to connect to the database locally using psql. SSL mode is disabled, so I’m connecting successfully with just host_ip/port/user/password/dbname, but in NocoDB I still get the same issue.
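For anyone else debugging this, the equivalent psql check (with placeholder values, not the actual credentials from this thread) is:

```shell
psql "host=<public-ip> port=5432 dbname=<dbname> user=<user> password=<password> sslmode=disable"
```

If this succeeds from your machine but NocoDB still times out, the problem is almost certainly network-level (firewall or IP allowlist) rather than the credentials.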

In my project code I’m connecting using the Python Cloud SQL Connector library (Connect using Cloud SQL language connectors | Cloud SQL for PostgreSQL | Google Cloud), but I don’t expect I would need a similar approach here, since psql connects with just the ordinary credentials.

I would appreciate any other debugging tips you can provide.

Hi @navi I successfully connected after I enabled all incoming connections through GCP’s firewall. Obviously that’s not great and I’d like a long-term solution, so I’m wondering if the IP address shown in the app’s connection setup panel could be wrong?
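As a narrower alternative to opening the firewall completely, Cloud SQL’s authorized networks setting can allow a single address (the instance name and IP below are placeholders):

```shell
gcloud sql instances patch <instance-name> \
  --authorized-networks=203.0.113.10/32
```

Note that `--authorized-networks` replaces the whole list on each `patch`, so include any existing entries as well.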

Good to hear that you managed to connect successfully.

Quick question: for a future solution, do you think you could run a nocodb-agent as a Docker container on GCP such that only NocoDB cloud can connect to it? That way your entire data plane remains with you, and our cloud just connects to that agent. You probably would not even have to provide credentials to our cloud.

Happy to learn more about your use case. I’m available here for a chat: Calendly - Naveen

Thanks for the suggestion. But I actually found the client IP address in the server logs and it was indeed different from what the nocodb config told me to whitelist. I whitelisted it and now the connection works. Do you know if this might be a dynamic IP address and I will need to keep updating the whitelist?

Can you please share the IP address that you are seeing in the logs? A team member is looking at this currently as well.

Sure, it’s

Hi, yes, the IP you mentioned is ephemeral and would keep changing. There was an issue on our end where the displayed IP was not honoured. We have fixed the problem now and it will be static.

Please whitelist “” and it should work and remain static!

Appreciate you reporting it to us!


I’m running into this exact same issue today on my Cloud SQL Postgres instance. I know all my credentials are correct, the IP is whitelisted, and I’m still getting “Network Error.”

Additionally, I connected to the DB successfully via a local version of NocoDB installed via brew. The only thing I can think of is that the whitelisted IP still isn’t working.

Hi jpfm,

I crosschecked and confirmed that our outgoing IP is configured correctly. However, when we dug through the logs for hints about the “Network Error”, we found the error below:
“Timeout acquiring a connection. The pool is probably full.”

This is potentially because the database has run out of allowed connections. Could you please check and confirm that’s not the case?
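To check whether the instance really is out of connections, queries like the following can be run in psql (standard Postgres views, nothing NocoDB-specific):

```sql
-- how many backend connections are currently in use
SELECT count(*) AS in_use FROM pg_stat_activity;

-- the configured per-instance connection limit
SHOW max_connections;
```

If `in_use` is at or near `max_connections`, new clients such as NocoDB will fail to acquire a connection and time out.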

On our side, we are fixing the UI to show a proper error message to avoid this kind of confusion.

Thanks and regards

Hey! I got a DM from @navi this morning about a bug fix, and it now works with SSL off. It did surface another issue when I tried with SSL on.

Not sure if it’s another bug, but it now throws this error: “Hostname/IP does not match certificate’s altnames: Host: localhost. is not in the cert’s altnames: …”. The IP I used is not localhost, of course; it’s the public IP of my Postgres server.
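For context: Node’s TLS layer verifies the hostname it thinks it is connecting to against the certificate’s subject alternative names, and a mismatch produces exactly this error. A common (less strict) workaround in a knex/pg-style connection object is to keep encryption but skip certificate verification. This is a sketch with placeholder values, and whether NocoDB’s ‘Edit connection json’ accepts this exact shape is an assumption:

```json
{
  "connection": {
    "host": "<public-ip>",
    "port": 5432,
    "user": "<user>",
    "password": "<password>",
    "database": "<dbname>",
    "ssl": {
      "rejectUnauthorized": false
    }
  }
}
```

`"rejectUnauthorized": false` disables certificate and hostname checks, so it trades some security for connectivity; the stricter fix is to supply the Cloud SQL server CA certificate and connect via an endpoint that matches what the cert lists.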

Thank you so much for your help. I’m so excited to start building my project.