This is because the default implementation of HttpUtility.UrlEncode uses the UTF-8 character encoding. That choice, however, is entirely within your control.
For example, the following code:
// assumes: using System; using System.Text; using System.Web;
string sample = new string((char)0x0DF, 1);                        // U+00DF, "ß" (LATIN SMALL LETTER SHARP S)
string test = HttpUtility.UrlEncode(sample);                        // default UTF-8
Console.WriteLine("UTF8 Encoded: {0}", test);
test = HttpUtility.UrlEncode(sample, Encoding.GetEncoding(1252));   // Windows-1252
Console.WriteLine("1252 Encoded: {0}", test);
outputs the following:
UTF8 Encoded: %c3%9f
1252 Encoded: %df
Of course, the danger of using another encoding in a URI is that some characters cannot be represented at all...
for example, this code:
string sample = new string((char)312, 1);   // U+0138, "ĸ" (LATIN SMALL LETTER KRA), not representable in code page 1252
Encoding encoding = Encoding.GetEncoding(1252);
string test = HttpUtility.UrlEncode(sample);
Console.WriteLine("UTF8 Encoded: {0}, round-trip = {1}", test, sample == HttpUtility.UrlDecode(test));
test = HttpUtility.UrlEncode(sample, encoding);
Console.WriteLine("1252 Encoded: {0}, round-trip = {1}", test, sample == HttpUtility.UrlDecode(test, encoding));
Console.ReadLine();
will output the following:
UTF8 Encoded: %c4%b8, round-trip = True
1252 Encoded: %3f, round-trip = False
You can see in the latter case that the encoded value is "%3f", which, when decoded, is a question mark ("?") rather than the input character U+0138 (decimal 312).
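That question mark comes from the encoder itself: when code page 1252 has no byte for a character, Encoding.GetBytes substitutes a fallback character before any percent-encoding happens, and UrlEncode then simply percent-encodes that 0x3F byte, which is why the round trip fails. A minimal sketch of just that step (illustrative only; it assumes the default encoder fallback, which is consistent with the %3f output above):

// assumes: using System; using System.Text;
// On .NET Core / .NET 5+, code page 1252 also needs:
//   Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
byte[] bytes = Encoding.GetEncoding(1252).GetBytes("\u0138");   // "ĸ" has no 1252 mapping
Console.WriteLine(BitConverter.ToString(bytes));                // prints "3F", i.e. '?'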
In a nutshell, there is nothing wrong with encoding "ß" as "%c3%9f"; on the contrary, it is the correct representation. But if the remote server requires "%df" in order to decode the value correctly, then use the 1252 code page as shown.
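Whichever encoding you choose, the decoding side must use the same one for the value to round-trip. As a rough sketch (the 1252 choice is purely illustrative, standing in for whatever the remote server actually expects):

// assumes: using System; using System.Text; using System.Web;
Encoding enc = Encoding.GetEncoding(1252);
string sharpS = "\u00DF";                                    // "ß"
string encoded = HttpUtility.UrlEncode(sharpS, enc);         // "%df"
string decoded = HttpUtility.UrlDecode(encoded, enc);        // "ß" again
Console.WriteLine("{0}, round-trip = {1}", encoded, sharpS == decoded);   // %df, round-trip = True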