No, there's no particular logic to it; it's just the rules that Brendan Eich decided on during those fraught ten days in May 1995. The full rules are here. Eich just decided that allowing $ in identifier names would be handy. He also allowed _ (which is more common in programming languages), as well as the usual set of English letters and digits (though a digit can't be the first character).
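For instance (a quick sketch just to illustrate the rule; the variable names are arbitrary):

```js
// All of these are valid identifiers under the original rules:
var $ = "a dollar sign on its own";
var _private = "a leading underscore";
var total$amount = "a dollar sign anywhere in the name";
var item2 = "digits are fine after the first character";

// This, however, is a syntax error, because an identifier
// can't start with a digit:
// var 2items = "nope";
```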
At one stage, an attempt was made in the specification to retcon $ as "intended for use only in mechanically generated code." That language appeared for the first time in ECMAScript 2nd edition (pdf), not being present in the 1st edition (pdf). It remained in the 3rd edition (pdf) and then disappeared in the 5th edition (there was no accepted 4th edition). This question and its answers address it; apparently it was an attempt to adopt a convention that originated in Java. But people glommed onto $ (not least John Resig with jQuery and Sam Stephenson with PrototypeJS), so that ship has long since sailed, hence the language being dropped from the 5th edition spec.
Ultimately, a wide range of Unicode characters were allowed in identifiers (for instance, fairly famously, ಠ_ಠ is a valid JavaScript identifier), but that was much later, in the ECMAScript 3rd edition (implemented by Mozilla in JavaScript 1.5).
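Just to make that concrete (a trivial sketch; the name is only there to show it parses):

```js
// Perfectly valid JavaScript, if not necessarily advisable:
var ಠ_ಠ = "a look of disapproval";
console.log(ಠ_ಠ); // "a look of disapproval"
```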