Here's how to do it:
Open the page in Chrome, then open the developer tools and click the 'Network' tab. Refresh the page.
This tab shows requests as they're made (you should see around eight items).
Manual inspection gives us the one we want:
https://www.nseindia.com/live_market/dynaContent/live_watch/stock_watch/niftyStockWatch.json
This is the link where the data resides.
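As a quick sanity check, you can hit that endpoint directly from R and confirm it returns JSON rather than an HTML page. This is just a sketch: jsonlite's fromJSON can read straight from a URL, though some sites only respond properly to browser-like requests, in which case the rvest route below still applies.

library(jsonlite)

url <- "https://www.nseindia.com/live_market/dynaContent/live_watch/stock_watch/niftyStockWatch.json"

# Try fetching and parsing the endpoint directly; fromJSON accepts a URL.
raw <- tryCatch(fromJSON(url), error = function(e) NULL)
if (is.null(raw)) {
  message("Direct request failed -- use the rvest route below instead.")
} else {
  str(raw, max.level = 1)  # top-level fields of the JSON
}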
Now, to get the data into a CSV (which can be opened in Excel), use R's rvest and jsonlite packages:
library(rvest)
library(jsonlite)

url <- "https://www.nseindia.com/live_market/dynaContent/live_watch/stock_watch/niftyStockWatch.json"

# read_html wraps the raw JSON response in a minimal HTML document,
# with the JSON text sitting inside a single <p> node.
page_html <- read_html(url)
json_text <- html_text(html_node(page_html, "p"))

# Parse the JSON; the tabular part lives in its "data" field.
data <- fromJSON(json_text)
write.csv(data$data, "scrapedData.csv", row.names = FALSE)
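Before relying on the CSV, it's worth a quick look at what came back: the object returned by fromJSON is a list, and data$data is the tabular part that gets written out above.

str(data, max.level = 1)   # top-level fields of the parsed JSON
head(data$data)            # first few rows of the table
names(data$data)           # column headers that end up in the CSV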
If you want 'live' data, you can re-run the scrape at (say) five-second intervals.
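A minimal sketch of that, using base R's Sys.sleep in a repeat loop. The timestamped filenames are my own addition so each pull is kept rather than overwritten; adjust to taste.

library(rvest)
library(jsonlite)

url <- "https://www.nseindia.com/live_market/dynaContent/live_watch/stock_watch/niftyStockWatch.json"

repeat {
  # Same scrape as above, wrapped in tryCatch so one failed
  # request doesn't kill the loop.
  tryCatch({
    page_html <- read_html(url)
    parsed <- fromJSON(html_text(html_node(page_html, "p")))
    out_file <- paste0("scrapedData_", format(Sys.time(), "%Y%m%d_%H%M%S"), ".csv")
    write.csv(parsed$data, out_file, row.names = FALSE)
  }, error = function(e) message("Scrape failed: ", conditionMessage(e)))

  Sys.sleep(5)  # wait 5 seconds before the next pull
}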