
I want to run a Python startup script inside a GCP instance in which I want to get the total memory size of the instance.

I have tried free -h and grep MemTotal /proc/meminfo. The problem with these commands is that they report somewhat less RAM than the memory size I selected while creating the instance (probably due to system use). I want to get the exact memory size I selected while creating the instance, e.g. "32 GB" for "e2-standard-8", "16 GB" for "n2-standard-4".
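For reference, this is roughly what I have tried in Python (a minimal sketch using psutil, as in my comment below, plus reading /proc/meminfo directly); both report a value slightly below the size selected at creation:

import psutil

# Total memory as seen by the OS; typically a bit less than the machine type's
# advertised size because of memory reserved by the kernel/firmware.
print(psutil.virtual_memory().total)

# Equivalent to: grep MemTotal /proc/meminfo
with open('/proc/meminfo') as f:
    for line in f:
        if line.startswith('MemTotal'):
            print(line.strip())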

Also, there is no metadata URL available to get the GCP instance memory size.
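(The closest thing the metadata server seems to expose is the machine type name, not its memory size; a rough sketch of reading it from inside the VM via the documented machine-type metadata path:)

import urllib.request

# Returns something like "projects/<project-number>/machineTypes/e2-standard-8";
# the memory size itself is not available from the metadata server.
req = urllib.request.Request(
    'http://metadata.google.internal/computeMetadata/v1/instance/machine-type',
    headers={'Metadata-Flavor': 'Google'})
print(urllib.request.urlopen(req).read().decode())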

Is there any way to do that?

Joseph N
  • Please share your Python script so we can see what you have tried. Also, where are you executing the script, in the VM or outside GCP? – Puteri Oct 22 '20 at 18:27
  • I am executing the script inside the VM. I am using the following in Python to get the memory size: import psutil; psutil.virtual_memory().total. I have also tried the free -h command. – Joseph N Oct 22 '20 at 18:44
  • I found this [stack post](https://stackoverflow.com/a/2468983/11198184) about getting the current CPU and RAM usage in Python. This might be helpful. – Hasanul Murad Oct 22 '20 at 23:00

1 Answer


If you want to get the RAM as configured when you created the VM in GCP, I think using the Compute Engine API (via the Google API Python client) would be useful.

First install the needed library:

pip install google-api-python-client

If you need it for just one instance, this can be helpful:

import googleapiclient.discovery

PROJECT_ID = "project_id"
ZONE = "zone_where_the_instance_is"

compute = googleapiclient.discovery.build('compute', 'v1')

# List the instances in the zone, then look up each instance's machine type
# to read the configured memory (memoryMb), independent of what the OS reports.
instances = compute.instances().list(project=PROJECT_ID, zone=ZONE).execute()
for instance in instances['items']:
    machine_type = instance['machineType'].rsplit('/', 1)[-1]  # e.g. "e2-standard-8"
    ram = compute.machineTypes().get(project=PROJECT_ID, zone=ZONE, machineType=machine_type).execute()['memoryMb']
    print("RAM -> {:.1f} GB".format(round(ram / 1024, 1)))

And you'll get:

RAM -> 1.8 GB

A more general approach to get this info from all the instances in a project could be as follows:

First add this additional lib:

pip install tabulate

Then:

import googleapiclient.discovery
from tabulate import tabulate

PROJECT_ID = "project_id"

compute = googleapiclient.discovery.build('compute', 'v1')

# Iterate over every zone in the project and collect the configured RAM
# (memoryMb from each instance's machine type).
zones = compute.zones().list(project=PROJECT_ID).execute()
configs = []
for zone in zones['items']:
    instances = compute.instances().list(project=PROJECT_ID, zone=zone['name']).execute()
    if 'items' in instances:  # zones with no instances have no 'items' key
        for instance in instances['items']:
            machine_type = instance['machineType'].rsplit('/', 1)[-1]
            ram = compute.machineTypes().get(project=PROJECT_ID, zone=zone['name'], machineType=machine_type).execute()['memoryMb']
            configs.append([instance['name'], str(round(ram / 1024, 1)) + " GB"])

print(tabulate(configs, headers=["Instance name", "RAM"]))

Which will print something like this:

Instance name    RAM
---------------  ------
satellite        1.8 GB

These scripts assume you have previously configured the credentials for the Cloud SDK.
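If you prefer not to rely on the gcloud-configured Application Default Credentials, here is a minimal sketch of passing explicit credentials instead (assuming a service-account key file at the hypothetical path key.json with read access to Compute Engine):

import googleapiclient.discovery
from google.oauth2 import service_account

# Hypothetical key file; any identity with compute.readonly access works.
credentials = service_account.Credentials.from_service_account_file(
    'key.json',
    scopes=['https://www.googleapis.com/auth/compute.readonly'])

compute = googleapiclient.discovery.build('compute', 'v1', credentials=credentials)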

Puteri