
I'm scraping the NBA Top Shot website to identify my spot in the queue (you can easily wait hours before it's your turn). I want to make a tool that will email you when it's your turn, so you don't have to spend all day at the computer. The issue is that the value I scrape using urllib.request comes up as "calculating..." even though it shows up as a number on the web page.

Here's my code. It won't work by the time you are reading this, as the queue is over.

from bs4 import BeautifulSoup
from urllib import request

# Fetch the raw HTML of the queue page.
site = request.urlopen("https://queue.nbatopshot.com/?c=dapperlabs&e=f2yggcaz7t&t=https%3A%2F%2Fwww.nbatopshot.com%2Fapi%2Fpurchase%2Fqueue-callback%3FlistingID%3D279f41e1-84b4-41bd-81b2-61334e64c86c&cv=34890450&cid=en-US").read()

soup = BeautifulSoup(site, "html.parser")

# The element that shows how many users are ahead of me in the queue.
line_number = soup.find(id="MainPart_lbUsersInLineAheadOfYou")

print(line_number)

Here is what I see when I inspect the site:

<span id="MainPart_lbUsersInLineAheadOfYou" data-bind="visible: layout.usersInLineAheadOfYouVisible, text: ticket.usersInLineAheadOfYou">27581</span>

Here is what I see when I run the request:

<span data-bind="visible: layout.usersInLineAheadOfYouVisible, text: ticket.usersInLineAheadOfYou" id="MainPart_lbUsersInLineAheadOfYou" style="display: none;">calculating...</span>

Any advice on this?
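
For the email side of the tool (once the number can actually be read), I'm thinking of something like this minimal smtplib sketch; the SMTP server, login, and addresses below are placeholders, not real values:

import smtplib
from email.message import EmailMessage

def send_turn_email(position):
    # Placeholder account details -- swap in a real SMTP server and login.
    msg = EmailMessage()
    msg["Subject"] = f"NBA Top Shot queue: {position} people ahead of you"
    msg["From"] = "me@example.com"
    msg["To"] = "me@example.com"
    msg.set_content("It's almost your turn in the NBA Top Shot queue.")

    with smtplib.SMTP_SSL("smtp.example.com", 465) as server:
        server.login("me@example.com", "app-password")
        server.send_message(msg)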

  • Does this answer your question? [Web-scraping JavaScript page with Python](https://stackoverflow.com/questions/8049520/web-scraping-javascript-page-with-python) – ggorlen Feb 27 '21 at 22:59

1 Answer


Try adding timeout=None at the end of your urlopen call, like so:

from bs4 import BeautifulSoup
from urllib import request

# Same request as before, but with an explicit timeout argument.
site = request.urlopen("https://queue.nbatopshot.com/?c=dapperlabs&e=f2yggcaz7t&t=https%3A%2F%2Fwww.nbatopshot.com%2Fapi%2Fpurchase%2Fqueue-callback%3FlistingID%3D279f41e1-84b4-41bd-81b2-61334e64c86c&cv=34890450&cid=en-US", timeout=None).read()

soup = BeautifulSoup(site, "html.parser")

line_number = soup.find(id="MainPart_lbUsersInLineAheadOfYou")

print(line_number)

You can read more about urlopen timeouts in the urllib.request documentation.

If that doesn't work, you may need to resort to using Selenium.
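
Here is a minimal Selenium sketch along those lines, assuming Chrome and the selenium package are installed (the URL and element id are taken from your question). It waits until the page's JavaScript has replaced the "calculating..." placeholder with an actual number:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait

url = "https://queue.nbatopshot.com/?c=dapperlabs&e=f2yggcaz7t&t=https%3A%2F%2Fwww.nbatopshot.com%2Fapi%2Fpurchase%2Fqueue-callback%3FlistingID%3D279f41e1-84b4-41bd-81b2-61334e64c86c&cv=34890450&cid=en-US"

driver = webdriver.Chrome()
driver.get(url)

# Poll (up to 60 seconds) until the element's text is a plain number.
WebDriverWait(driver, 60).until(
    lambda d: d.find_element(By.ID, "MainPart_lbUsersInLineAheadOfYou").text.strip().isdigit()
)

line_number = driver.find_element(By.ID, "MainPart_lbUsersInLineAheadOfYou").text
print(line_number)  # e.g. 27581

driver.quit()

The data-bind attribute in your snippet suggests the text is filled in client-side after the page loads, which is why a plain urlopen only ever sees the placeholder; a real browser (which Selenium drives) runs that JavaScript for you.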
