This code was written in Python, in VS Code. I have a minimum variable in my code and another variable; let's call them X and Xmin. I assign numbers to both Xmin and X. Then, when I compare them with <, my code tells me that the smaller one is larger. Here is my code:
Xmin = 100
print("X")
X = input()
if X < Xmin:
print("X is too small.")
The problem is that when I make X = 500, it tells me that X is greater than Xmin, but when I give X something really big, like 1000000, it tells me that X is too small.
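In case it's relevant, here is a minimal sketch of what I expected the comparison to do. I'm guessing input() might be handing back text rather than a number, so the int() conversion below is my assumption, not something from my original code:

Xmin = 100
X = int(input("X: "))  # my guess: convert the typed text to an integer before comparing
if X < Xmin:
    print("X is too small.")
else:
    print("X is big enough.")

When I add the conversion, 1000000 seems to compare as a number, which is what I expected, but I don't understand why the original version behaves the way it does.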