Under what conditions should a developer consider changing the default isolation level explicitly defined by the framework?
Thanks!
Almost never.
There are many clever RDBMS, .NET, Java, etc. engineers working for Microsoft, Oracle, and the like who know far more about this kind of thing than the average code monkey (you and me). The defaults they chose are sensible for the vast majority of workloads, so only override them with a measured, well-understood reason.
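To make the point concrete, here is a minimal sketch (using Python's stdlib `sqlite3` purely as an illustration; the same principle applies to JDBC, ADO.NET, etc.): inspect the driver's default first, and treat an explicit override as a deliberate, documented decision rather than a reflex.

```python
import sqlite3

# Connect using the library's default transaction handling.
# sqlite3 reports the default isolation_level as an empty string ("deferred").
conn = sqlite3.connect(":memory:")
default_level = conn.isolation_level

# Explicit override -- only do this when you have measured a real problem
# and understand the locking/concurrency trade-off you are taking on.
conn.isolation_level = "IMMEDIATE"
changed_level = conn.isolation_level

print(default_level, changed_level)
```

The takeaway is not the specific API but the habit: know what the default is before you decide the framework got it wrong.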
Some SO questions about dirty reads:
And a recent blog article from the SQL Server support team: "Inappropriate usage of high isolation level isn't just about blocking when it comes to performance"