
I'm looking for an example of an XSS vulnerability that would be stopped just by using the AntiXSS Encoder 4.1 Beta as the runtime encoder (setting in system.web/httpRuntime). I would prefer something that doesn't require any explicit calls to AntiXss functions such as

@AntiXss.JavaScriptEncode(ViewBag.UserName)

I'm thinking something that would get by the ASP.NET blacklist but wouldn't make it through the AntiXSS whitelist, maybe something to do with alternate character sets or encoding?

I've tested UTF-7 vulnerabilities, but don't see any that seem to affect modern browsers.
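For reference, the runtime-encoder setting mentioned above looks roughly like this in web.config (the type and assembly names below are for the AntiXSS 4.1 library; verify them against your installed version):

```xml
<!-- web.config: route ASP.NET's built-in encoding calls through AntiXSS.
     Type/assembly names assume the AntiXSS 4.1 library; adjust to match
     the version you have installed. -->
<configuration>
  <system.web>
    <httpRuntime encoderType="Microsoft.Security.Application.AntiXssEncoder, AntiXssLibrary" />
  </system.web>
</configuration>
```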

Goyuix
Jon Galloway

2 Answers


There aren't any. Well, that's not entirely true: there aren't any that work on modern browsers.

The reason the SDL requires it is that using a safe list is inherently more secure: if someone suddenly discovers a problematic character, it may already be encoded (depending on the safe lists you configure).
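To make the safe-list point concrete, here is a minimal sketch (in Python, not the actual AntiXSS implementation) of why a whitelist is safer by default: everything not explicitly recognized as safe gets encoded, so a newly discovered dangerous character is neutralized without anyone having to add it to a blacklist.

```python
# Hypothetical safe list for illustration; AntiXSS's real lists differ.
SAFE = set("abcdefghijklmnopqrstuvwxyz"
           "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
           "0123456789 .,-_")

def safe_list_encode(text):
    # Whitelist approach: numerically encode EVERY character that is not
    # on the safe list, instead of blacklisting known-bad ones like < and >.
    return "".join(ch if ch in SAFE else "&#{};".format(ord(ch))
                   for ch in text)

# An unanticipated character (e.g. an accented s) is encoded too,
# because it was never on the safe list in the first place.
print(safe_list_encode("<script>"))     # &#60;script&#62;
print(safe_list_encode("\u015bcript"))  # &#347;cript
```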

blowdart
  • There are on modern browsers that specify the encoding to be UTF-7. In that case HTML encoding won't make much of a difference, so there are issues there. If you don't specify your pages as UTF-7, and someone is able to inject an encoding type into your page via another XSS vulnerability, then that's another attack vector. – Adam Tuliper Apr 27 '11 at 19:52
  • Well, on certain older browsers things would go wonky with, for example, accented S characters, where you could use ścript and it would act as a script tag. – blowdart Apr 27 '11 at 21:29
  • Doesn't IE7, though, default to UTF-7 by sniffing if it finds any UTF-7 within the first 4000 bytes? So if you can inject UTF-7 into IE7 in the first 4000, then it's executed. IE8+ blocks this. – Adam Tuliper Apr 28 '11 at 01:53

Hmm... I'm not following - AntiXSS requires explicit calls, unless you are talking about using .NET 4's feature of specifying your own encoder and in turn calling off to it? In that case, there is nothing known at this point that I'm aware of. Since AntiXSS works off a whitelist there should be no issues, as everything but a few characters is encoded.

FYI - locally I can get UTF-7 to work just fine:

<HEAD><META HTTP-EQUIV="CONTENT-TYPE" CONTENT="text/html; charset=UTF-7"> </HEAD>+ADw-SCRIPT+AD4-alert('XSS');+ADw-/SCRIPT+AD4-
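For anyone curious what that payload decodes to: the `+ADw-` and `+AD4-` sequences are UTF-7 encodings of `<` and `>`, which a few lines of Python (a sketch, not part of the original answer) can demonstrate:

```python
# UTF-7 represents non-direct characters as modified base64 between '+' and '-'.
# '+ADw-' decodes to U+003C ('<') and '+AD4-' to U+003E ('>'), so a browser
# that treats the page as UTF-7 sees a live script tag.
payload = "+ADw-SCRIPT+AD4-alert('XSS');+ADw-/SCRIPT+AD4-"
decoded = payload.encode("ascii").decode("utf-7")
print(decoded)  # <SCRIPT>alert('XSS');</SCRIPT>
```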
Adam Tuliper
  • Thanks - I'm talking about using it as the ASP.NET encoder. With 4.1 you don't have to explicitly call into it. The UTF-7 attacks are interesting, but you have to specify it as the charset now, so it's not too much of a threat in modern browsers, right? – Jon Galloway Apr 27 '11 at 16:53
  • I said basically the same thing above - that there is nothing known at this point (three hours before the other answer) - a reason this wasn't chosen as the answer? In addition, you will not be protected from XSS just by encoding. IE7 (modern or not - still more than 5% of all users are on IE7, more than enough for an attack) is vulnerable to the UTF-7 attack, as it doesn't change the encoding when script is detected, as IE8+ does, per: http://msdn.microsoft.com/en-us/library/dd565635%28v=vs.85%29.aspx – Adam Tuliper Apr 27 '11 at 19:50
  • Can you give me a ref on 4.1 automatically doing this now? I'm looking on the net and didn't find anything initially. – Adam Tuliper Apr 27 '11 at 20:27
  • I accepted @blowdart's answer since he's the lead on AntiXSS and I had a separate e-mail thread going with him on this. I upvoted you as well. – Jon Galloway Apr 27 '11 at 21:52
  • Ha - funny. He and I spoke via email last week... small world on here :) It doesn't happen automatically - you still need to override an encoder in your config. I think this was just my misunderstanding in thinking you were talking about .NET 4.1, and not the encoder :) – Adam Tuliper Apr 28 '11 at 01:53