
We have an Online Admission Counselling process in our country, but the form to check the ranks is inconvenient. Worse, there is no option to sort by rank, which makes it very difficult to find which institute/course had the highest/lowest rank. I wrote a simple Python requests + pandas program that gets the table from that page and saves it as a CSV. Then I can do whatever I want with it.

However, it didn't work the first time; the response had no tables in it. I thought it might be due to missing headers/cookies/form data. So I copied the cURL request from Chrome DevTools, converted it to Python requests, ran that script, and got the tables. That request call has cookies (which might contain my login info), the headers, and the form data (which had the fields that were on the page, but also some very long encoded data). I experimented and found that the request works without the headers and cookies (you don't need to be logged in to check the ranks), but not without that long encoded data. After that it worked; I saved the CSV and sorted it to my requirements.
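For reference, a minimal sketch of what the working script looks like. This assumes the page is a standard ASP.NET form, so the "long encoded data" is most likely the `__VIEWSTATE` / `__EVENTVALIDATION` hidden fields, which the server requires with every POST. The field names and the rank column name below are illustrative placeholders, not the site's actual values:

```python
import requests
import pandas as pd
from io import StringIO

URL = ("https://josaa.admissions.nic.in/applicant/"
       "SeatAllotmentResult/CurrentORCR.aspx")


def fetch_orcr(form_data: dict, url: str = URL) -> pd.DataFrame:
    """POST the form data copied from DevTools and parse the first
    HTML table out of the response with pandas."""
    resp = requests.post(url, data=form_data)
    resp.raise_for_status()
    # pandas parses every <table> element in the HTML; the ORCR
    # table is assumed to be the first one.
    return pd.read_html(StringIO(resp.text))[0]


if __name__ == "__main__":
    # Values copied from the DevTools "Copy as cURL" payload; the
    # long encoded __VIEWSTATE string is abbreviated here.
    form_data = {
        "__VIEWSTATE": "...copied from DevTools...",
        # ...plus the visible form fields (round, institute type, etc.)
    }
    df = fetch_orcr(form_data)
    # Column name is an assumption; sort, then save for later use.
    df = df.sort_values("Closing Rank")
    df.to_csv("orcr.csv", index=False)
```

Note that `pd.read_html` needs an HTML parser such as `lxml` installed alongside pandas.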

I therefore decided to make it a web app so that all students can use it. I designed a simple page for that, but it seems I can't send the POST request from the client's browser due to a CORS security issue:

Access to XMLHttpRequest at 'https://josaa.admissions.nic.in/applicant/SeatAllotmentResult/CurrentORCR.aspx' from origin 'http://127.0.0.1:5500' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
POST https://josaa.admissions.nic.in/applicant/SeatAllotmentResult/CurrentORCR.aspx net::ERR_FAILED 200

Now, some possible solutions:

  • If possible, I want to host it on GitHub Pages directly from the project's repo. That requires doing the fetching in JavaScript/Brython, and I have already tried both with no success.
  • Can GitHub Actions be used for this task? Since it runs in containers, not in the browser, I could fetch and save the CSV files there and serve them on the website.
  • If yes, the most I could find was to use a scheduled cron job in GitHub Actions to automatically download the CSV every day (though it only needs to run June through August).
  • Or, more efficiently: if a user submits the form on the GitHub Pages site, could that alone trigger the action, which fetches the CSV, so the website can then process it?
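For the scheduled-download idea, a rough sketch of what the workflow file could look like. The file name, script name, CSV path, and the June–August cron restriction are all assumptions, not something from the project:

```yaml
# .github/workflows/fetch-orcr.yml  (hypothetical file name)
name: Fetch ORCR CSV
on:
  schedule:
    - cron: "0 6 * 6-8 *"   # 06:00 UTC daily, June through August only
  workflow_dispatch:         # also allow manual runs from the Actions tab
jobs:
  fetch:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install requests pandas lxml
      - run: python fetch_orcr.py    # the scraper script, kept in the repo
      - name: Commit the refreshed CSV
        run: |
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add orcr.csv
          git commit -m "Update ORCR data" || echo "No changes"
          git push
```

Since the committed CSV lives in the same repo that GitHub Pages serves, the static site can then fetch it same-origin, which sidesteps the CORS problem entirely.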
