I'm new to web development, so this might be a question with a simple answer, but I have been stuck on this problem for hours and can't seem to solve it.
A friend of mine is developing a pretty standard serverless web app on AWS, which I have started to contribute to. We are running Python Lambda functions connected to API Gateway. These work fine and pass all of my tests (see the sketch below for roughly what they look like). I am attempting to make requests to the API with JS scripts running in HTML files, which currently live on my local machine.
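For context, our handlers are shaped roughly like this. The logic and payload here are simplified placeholders, and I'm assuming a proxy integration, where the function itself has to return the CORS headers (I'm not 100% sure that's how our API Gateway is set up):

```python
import json

def lambda_handler(event, context):
    # Simplified placeholder for our real handlers.
    # Assuming a Lambda proxy integration, the CORS headers must be
    # returned by the function itself; '*' allows any origin.
    return {
        "statusCode": 200,
        "headers": {
            "Access-Control-Allow-Origin": "*",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"message": "ok"}),
    }
```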
However, when I write my JS code to make API requests using jQuery's $.ajax function and run the HTML/JS in my browser on my localhost, the requests fail with CORS errors, even though our API is configured to allow CORS. I have read several articles about CORS and I grasp its purpose and basic mechanics. I have also read the full jQuery ajax documentation and reviewed how HTTP requests and responses work.
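The failing call looks roughly like this (the endpoint URL is a made-up placeholder, but the structure matches my real code):

```javascript
// Placeholder sketch of the kind of request I'm making; the URL is
// made up for illustration.
$.ajax({
  url: "https://abc123.execute-api.us-east-1.amazonaws.com/dev/items",
  method: "GET",
  dataType: "json",
  success: function (data) {
    console.log("got response:", data);
  },
  error: function (jqXHR, textStatus, errorThrown) {
    // This fires when the request is blocked; the CORS message
    // itself only appears in the browser console.
    console.error("request failed:", textStatus, errorThrown);
  }
});
```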
From what I have read, it seems that I shouldn't be able to make requests from my localhost to an API hosted on AWS, since they are in fact hosted on different 'origins.' However, I am confused because my friend's code does not produce these errors, and after hours of comparing our JS code, neither of us can find any meaningful differences in our scripts. We are both stumped. I found some workarounds that involve disabling various browser security features, but my friend thinks that is risky and unnecessary.
So I have two questions that I haven't been able to find clear answers to:
Should I be getting CORS errors in this situation? Is there a way to avoid them without disabling browser security settings?
More generally, how do people typically develop the front-end of a web app locally while making calls to a remote API? Am I approaching this project in the wrong way?
Thank you!