2

I have a K8s deployment running in GKE, exposed by an internal load balancer service that is assigned an IP address from the VPC subnetwork. When I spin up an individual Compute VM in the subnetwork I can reach the deployment via the ILB IP address, but I cannot reach it from within the cluster itself or from another GKE cluster hitting the same IP address.

I am not sure what I am missing, or if an ILB is not the right tool for this use case. The end goal is to communicate between different GKE clusters on the same subnetwork.

Justin Miller

1 Answer

0

It is strange that you can access it from a VM but not from the cluster. The cluster, the VM, and the ILB must all be in the same region and subnet.

Also, [1] has an example of how to create an internal load balancer for GKE. You can compare the example config against your ILB config.
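For reference, a minimal internal LoadBalancer Service on GKE looks like the sketch below. The names, selector, and ports are illustrative, not taken from the question; on older GKE versions the annotation is `cloud.google.com/load-balancer-type: "Internal"` instead:

```yaml
# Hypothetical Service manifest requesting a GKE internal load balancer.
apiVersion: v1
kind: Service
metadata:
  name: my-ilb-service          # illustrative name
  annotations:
    # Requests an internal (VPC-only) load balancer on GKE 1.17+.
    networking.gke.io/load-balancer-type: "Internal"
spec:
  type: LoadBalancer
  selector:
    app: my-app                 # must match the Deployment's pod labels
  ports:
    - port: 80                  # ILB port clients connect to
      targetPort: 8080          # container port on the pods
```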

I tested this with a curl to the ILB, and it works from a VM instance, from inside the cluster, and from a different cluster in the same zone.

[1] https://cloud.google.com/kubernetes-engine/docs/how-to/internal-load-balancing

  • 1
    Thank you for the link, it led me to my issue. My load balancer had `loadBalancerSourceRanges: [10.129.0.0/24]`, which was preventing K8s cluster pods from connecting correctly. Removing it produced the expected result – Justin Miller Apr 03 '20 at 16:24
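A note on the fix above: instead of removing `loadBalancerSourceRanges` entirely, the allowlist can be kept and widened to include the client clusters' traffic. The pod CIDR below is a hypothetical value for illustration; only the `10.129.0.0/24` range comes from the question:

```yaml
spec:
  loadBalancerSourceRanges:
    - 10.129.0.0/24   # VM subnet from the question
    - 10.4.0.0/14     # hypothetical pod CIDR of the client GKE cluster;
                      # without it, connections from pods are dropped
```

Traffic from pods can arrive with a pod IP as the source (depending on NAT and `externalTrafficPolicy`), so an allowlist covering only the node subnet will silently block it.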