I recently learned about Python's vars() and its ability to interpret a string as a variable name. I want to be able to programmatically look up variables/dictionaries by building and manipulating strings. Here is the code I wrote to test how it works:
var1 = 3
var2 = 5
var3 = 7
#....
var_string = "var1"
x = vars()[var_string] - 1
print(x)
If I run this, as expected it interprets the string "var1" as the already-defined global variable var1 = 3: x gets var1's value, subtracts one, and the script prints 2 as desired. However, the moment I take the same code and put it into a function, it stops working and raises a KeyError.
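As far as I can tell, at module level vars() with no argument hands back the very same dict as globals(), which would explain why the lookup works there. A minimal check (this probe is mine, not part of the failing code):

print(vars() is globals())  # True at module level: both names refer to the same namespace dict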
var1 = 3
var2 = 5
var3 = 7
#....
def my_Function():
    var_string = "var1"
    vars()[var_string]          # this lookup is where the KeyError is raised
    x = vars()[var_string] - 1
    print(x)
my_Function()
When I call it, I get KeyError: 'var1'. It seems like, because the lookup is inside a function, vars() is no longer using the global variables and is instead searching the function's local variables. But adding "global var1" to the top of my function doesn't seem to fix it.
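A small probe (the name probe is just for illustration) seems to confirm that: inside a function, vars() only shows the locals, while globals() still sees the module-level names:

var1 = 3

def probe():
    var_string = "var1"
    print(vars())                  # {'var_string': 'var1'} -- locals only, no var1
    print(globals()[var_string])   # 3 -- the module-level namespace still has it

probe()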
I know one way to make it work would be to append my variable/dictionary to a list, ditch vars() entirely, and just index into the list with list[#]. But is it possible to avoid having a second copy of my variable? Pretend I have a dictionary that's gigabytes in size and I'm trying to save resources. Is it possible? Is there maybe an alternative to vars() that I should be using? Thanks.
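For reference, this is roughly what I mean by the list workaround (big_dict here is a stand-in for my real data). If I understand Python's object model correctly, the append only stores a reference rather than copying the dict, but I'd still rather keep the string-based lookup:

big_dict = {i: i for i in range(5)}   # stand-in for the gigabytes-sized dictionary
holder = []
holder.append(big_dict)               # stores a reference, not a second copy

print(holder[0] is big_dict)          # True -- same object, nothing was duplicated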