I'd like to use cookies in my ASP.NET project. I set a cookie like this:
HttpCookie cookie = new HttpCookie(name);
cookie.Expires = DateTime.Now.AddYears(1);
cookie.Value = data;
HttpContext.Current.Response.Cookies.Set(cookie);
and read it back like this:
var cookieCollection = HttpContext.Current.Request.Cookies;
var cookie = cookieCollection[name].Value;
When I update the cookie, I basically just do a get, modify the property in question, and then set it again (a rough sketch of that update follows after the debugger output below). What I noticed is that when I update a cookie, .NET adds a new cookie with the same name to the cookieCollection and marks the old one as expired:
cookieCollection[0]
{System.Web.HttpCookie}
Domain: null
Expires: {01-01-0001 00:00:00}
HasKeys: false
HttpOnly: false
Name: "myCookieUc"
Path: "/"
Secure: false
Shareable: false
Value: "{...}"
Values: {...}
cookieCollection[1]
{System.Web.HttpCookie}
Domain: null
Expires: {21-01-2018 11:00:10}
HasKeys: false
HttpOnly: false
Name: "myCookieUc"
Path: "/"
Secure: false
Shareable: false
Value: "{...}"
Values: {...}
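For reference, the update looks roughly like this (a sketch rather than my exact code; the helper name UpdateCookie and the one-year expiry are just illustrative):

private static void UpdateCookie(string name, string data)
{
    // Re-read the incoming cookie, or create a new one if it is missing.
    HttpCookie cookie = HttpContext.Current.Request.Cookies[name] ?? new HttpCookie(name);
    cookie.Value = data;                        // modify whichever property needs changing
    cookie.Expires = DateTime.Now.AddYears(1);  // expiry has to be re-applied on every write
    HttpContext.Current.Response.Cookies.Set(cookie);
}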
Thus, after I have initially set and updated my cookie, I will have two cookies in my cookieCollection. So when I try to do a get:
var cookie = cookieCollection[name].Value;
then for some reason I seem to get the cookie that has already expired.
The cookie update in .NET is triggered from an Ajax call, and the same .NET method may update the cookie multiple times.
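Simplified, the server-side flow is something like this (the method name and parameters are made up; the point is just that UpdateCookie can run more than once per request):

// Invoked from the client via Ajax; names are illustrative only.
public void SaveUserState(string first, string second)
{
    UpdateCookie("myCookieUc", first);    // first write in this request
    // ... other processing ...
    UpdateCookie("myCookieUc", second);   // second write, same request, same cookie name
}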
So my question is, when I do:
var cookie = cookieCollection[name].Value;
Why does it give me the expired cookie? Can I use LINQ instead to get the non-expired cookie?
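By "use LINQ" I mean something along these lines (untested, just to show the idea; it needs using System.Linq, and I index by position because indexing by name is exactly what hands back the wrong cookie):

var cookie = Enumerable.Range(0, cookieCollection.Count)
    .Select(i => cookieCollection[i])
    .Where(c => c.Name == name)
    .OrderByDescending(c => c.Expires)   // hoping the non-expired one sorts first
    .FirstOrDefault();
var value = cookie != null ? cookie.Value : null;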