I'm taking a Computer Architecture course, and our teacher gave us the code below to execute so we can discuss it together in class, but I can't understand why it creates an endless loop. I'd appreciate it if someone more experienced could help. Thank you!
#include <stdio.h>

int main() {
    float x = 0.1;
    while (x != 1.1) {
        printf("x = %f\n", x);
        x = x + 0.1;
    }
    return 1;
}