Colloquially, if you just say "Unix Time" or "Unix Timestamp", it's implied that you mean seconds. However, note that the POSIX specification doesn't actually use these terms. Instead, it says "Seconds Since the Epoch", which is defined in section 4.16 and used throughout the spec. That we call it "Unix Time" at all is just a colloquialism.
The ECMAScript specification (section 20.3.1.1 of version 8.0) doesn't give the timestamp a name either; it simply describes it as "milliseconds since 01 January, 1970 UTC".
Therefore, when referring to the timestamps used in JavaScript (and elsewhere), you're free to call them "Milliseconds Since the (Unix) Epoch", "Unix Time in Milliseconds", or "Unix Timestamp in Milliseconds". There is no more concise term that is universally recognized, either colloquially or in any standard.
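To make the relationship concrete, here's a minimal sketch in JavaScript (assuming any standard engine, such as Node.js or a browser) showing how the millisecond-based value relates to the seconds-based "Unix Time":

```js
// Date.now() returns the number of milliseconds since the Unix Epoch
// (1970-01-01T00:00:00Z), per ECMAScript's definition of a time value.
const millisecondsSinceEpoch = Date.now();

// Dividing by 1000 and truncating gives the conventional seconds-based
// value that POSIX calls "Seconds Since the Epoch".
const secondsSinceEpoch = Math.floor(millisecondsSinceEpoch / 1000);

console.log(millisecondsSinceEpoch); // e.g. 1700000000123
console.log(secondsSinceEpoch);      // e.g. 1700000000
```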
A few other points on this subject:
- The text at the W3Schools link you gave makes a crucial mistake: it doesn't specify UTC. Since the Unix Epoch is defined in terms of UTC, all timestamps derived from it are UTC-based.
- Some people refer to these timestamps as "epoch time". This is a misnomer; please avoid that terminology, as it implies the time value itself is an epoch. Only the 0 value can be considered the epoch. One might say "epoch-based time", but even then there's the question of "which epoch?". The Unix epoch is the most common, but there are multiple different epochs used in computing.
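To illustrate those last two points, here's a small JavaScript sketch (again, just an example under the same assumptions as above): a timestamp of 0 is the Unix epoch itself, and every value is interpreted in UTC regardless of the local time zone:

```js
// A timestamp of 0 is the Unix Epoch itself: 1970-01-01T00:00:00.000Z.
console.log(new Date(0).toISOString()); // "1970-01-01T00:00:00.000Z"

// Any other value is an offset from that epoch, in milliseconds, in UTC.
console.log(new Date(86400000).toISOString()); // "1970-01-02T00:00:00.000Z" (one day later)
```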