
Thanks for looking.

Background

In my .NET applications I usually have a Business Logic Layer (BLL) containing my business methods and a Data Access Layer (DAL) which contains my Entity classes and any methods for dealing with atomic entities (i.e. CRUD methods for a single entity). This is a pretty typical design pattern.

Here is a pseudocode example of what I mean:

BLL

public static int CreateProduct(ProductModel product){
    return DAL.SomeClass.CreateProduct(new DAL.Product{
       Name = product.Name,
       Price = product.Price
    });
}

DAL

public int CreateProduct(Product p){
    using(var db = new MyDataContext()){
        db.Products.AddObject(p);
        db.SaveChanges();
        return p.Id;
    }
}

No problems with this simple example.

Ideally, all the business of instantiating a data context and using that data context lives in the DAL. But this becomes a problem if I attempt to deal with slightly more complex objects:

BLL

public static int CreateProduct(ProductModel product){
    return DAL.SomeClass.CreateProduct(new DAL.Product{
       Name = product.Name,
       Price = product.Price,
       ProductType = DAL.SomeClass.GetProductTypeById(product.ProductTypeId) //<--PROBLEM
    });
}

Now, instead of saving the entity, I get the following error:

An entity object cannot be referenced by multiple instances of IEntityChangeTracker

Ok, so the answer to dealing with that is to pass a common data context to both calls:

BLL

public static int CreateProduct(ProductModel product){
    using(var db = new DAL.MyDataContext()){
        return DAL.SomeClass.CreateProduct(new DAL.Product{
           Name = product.Name,
           Price = product.Price,
           ProductType = DAL.SomeClass.GetProductTypeById(product.ProductTypeId, db) //<--CONTEXT
        }, db); //<--CONTEXT
    }
}

Problem

This solves the immediate problem, but now my referential transparency is blown because I have to:

  1. Instantiate the data context in the BLL
  2. Pass the data context to the DAL from the BLL
  3. Create overloaded methods in the DAL that accept a data context as a parameter.
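To make step 3 concrete, here is a minimal sketch of what those DAL overloads look like, reusing the `MyDataContext` and `Product` names from the examples above (the optional-context pattern itself is an assumption, just one common way to avoid duplicating every method body):

```csharp
// DAL: one overload owns its own context; the other accepts
// an externally-owned context so related lookups and the save
// all go through the same change tracker.
public static int CreateProduct(Product p)
{
    using (var db = new MyDataContext())
    {
        return CreateProduct(p, db);
    }
}

public static int CreateProduct(Product p, MyDataContext db)
{
    db.Products.AddObject(p);
    db.SaveChanges(); // the caller owns the context's lifetime
    return p.Id;
}
```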

This may not be a problem for some but for me, since I write my code in a more functional style, it is a big problem. It's all the same database after all, so why the heck can't I deal with unique entities regardless of their data context instance?

Other Notes

I realize that some may be tempted to simply say to create a common data context for all calls. This won't fly as doing so is bad practice for a multitude of reasons and ultimately causes a connection pool overflow. See this great answer for more details.

Any constructive input is appreciated.

Matt Cashatt
  • interested to know how your style has changed or not over the last 4 years? – Alex Gordon Jan 11 '18 at 02:13
  • @l--''''''---------'''''''''''' Honestly, in the past four years I have gotten more senior (dumber) and delegate more (lazier) so I can't say that my view has changed that much. If I come up with more insight I will be sure to post it for you. – Matt Cashatt Jan 12 '18 at 03:51
  • thanks! i guess the main question is how do you write without side effects in a non-functional language, at least how do you keep your side effects secluded – Alex Gordon Jan 12 '18 at 17:04

1 Answer


Personally, I track my unit of work and associate a data context with it via static methods. This works well as long as the operations don't have long lifetimes. In my current project, an ASP.NET application, every request is a (mostly) distinct unit of work, and the start and end of the request coincide with the start and end of the unit. I store the data context in the request context (`HttpContext.Current.Items`), which, if you aren't familiar with it, is basically a dictionary managed by the runtime that provides request-specific storage accessible from static methods. The work is already done for me there, but you can find lots of examples of implementing your own unit of work pattern. One DbContext per web request... why?
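A minimal sketch of that per-request pattern, assuming ASP.NET's `HttpContext.Current.Items` is the request storage described above (the class and key names are illustrative, not from the question):

```csharp
// One data context per HTTP request, stored in HttpContext.Current.Items.
// Disposal is typically wired up in Application_EndRequest in Global.asax.
public static class RequestDataContext
{
    private const string Key = "__dataContext";

    public static MyDataContext Current
    {
        get
        {
            // Lazily create one context for this request and reuse it
            // for every DAL call, so all entities share one tracker.
            var ctx = (MyDataContext)HttpContext.Current.Items[Key];
            if (ctx == null)
            {
                ctx = new MyDataContext();
                HttpContext.Current.Items[Key] = ctx;
            }
            return ctx;
        }
    }

    // Call from Application_EndRequest to close out the unit of work.
    public static void DisposeCurrent()
    {
        var ctx = (MyDataContext)HttpContext.Current.Items[Key];
        if (ctx != null) ctx.Dispose();
    }
}
```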

Another equally workable answer for many situations is dependency injection. Used for this purpose (injecting the data context), it essentially mimics the code you wrote at the end of your question, but shields you from the "non-functional" style you dislike.
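Sketched with constructor injection (container wiring omitted; `ProductService` and the per-request lifetime registration are assumptions, not code from the question):

```csharp
// BLL class that receives its context from a DI container rather
// than instantiating it. With a per-request lifetime registration,
// every consumer in the same request shares one context, so the
// IEntityChangeTracker conflict goes away without passing the
// context through every method signature by hand.
public class ProductService
{
    private readonly MyDataContext _db;

    public ProductService(MyDataContext db) // injected by the container
    {
        _db = db;
    }

    public int CreateProduct(ProductModel product)
    {
        var entity = new Product
        {
            Name = product.Name,
            Price = product.Price,
            // Same context as the save below, so no tracker conflict.
            ProductType = _db.ProductTypes.Single(t => t.Id == product.ProductTypeId)
        };
        _db.Products.AddObject(entity);
        _db.SaveChanges();
        return entity.Id;
    }
}
```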

Yes, you are only accessing one database, but if you look closely you will see the database is not the constraint here. The constraint arises from the cache (the change tracker), which is designed to permit multiple, differing, concurrent copies of the data. If you don't wish to permit that, then you have a whole host of other solutions available.
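One such option is reading reference data without attaching it to any change tracker at all, so the entity can be handed to another context freely. A sketch, assuming the ObjectContext-era (EF4) API implied by `AddObject` above; newer `DbContext` code would use `AsNoTracking()` instead:

```csharp
// Returns a detached ProductType: with MergeOption.NoTracking the
// query result is never registered with this context's change
// tracker, so assigning it to an entity in a different context
// does not trigger the IEntityChangeTracker error.
public ProductType GetProductTypeById(int id)
{
    using (var db = new MyDataContext())
    {
        db.ProductTypes.MergeOption = MergeOption.NoTracking;
        return db.ProductTypes.Single(t => t.Id == id);
    }
}
```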

shannon
  • Thanks for your input. Your last paragraph nails the situation for me. So, would you say that simply turning off tracking is the solution? – Matt Cashatt Jul 19 '14 at 18:16
  • LOL, were there three (3) paragraphs when my last paragraph was of most interest to you? :) – shannon Jul 19 '14 at 18:30
  • Yes. The paragraph about the data context being cached. That makes sense even if it bugs me how it works ;). – Matt Cashatt Jul 19 '14 at 18:40
  • Other options: Note that your reference to an answer stating you shouldn't hold a connection assumes concurrency. If you don't wish to manage a unit of work because it is all one unit, then you are already making that concession. So the 'badness' arising from holding a single context is mostly not present. You could, just to err on the side of caution, dump the context each SaveChanges. You could detach the objects and reattach them safely, since there can be no externally-driven conflicts in your non-concurrent application operations. You could manage your own cached elements. – shannon Jul 22 '14 at 19:06
  • But, generally speaking, "DataContext is ideally suited for a 'unit of work' approach". http://blogs.msdn.com/b/dinesh.kulkarni/archive/2008/04/27/lifetime-of-a-linq-to-sql-datacontext.aspx Avoiding creating discrete work units is dumping a lot of the features of it, and really limiting the flexibility of your data layer. Since you are concerned about design "correctness", you may wish to concede the effort to manage units of work. This also aligns you with transaction rollbacks, etc. – shannon Jul 22 '14 at 19:09
  • It's probably relevant to note, that the flexibility present in the conflict detection, change tracking, and the other features that seem to make this an implementation burden, are intended to support a consumer-naive data layer by implementing many of the most commonly-used features of data consumers. So, what I meant above is that this means, I suppose, that disabling these features could be considered a design flaw, by creating a very strong dependency in your data layer on the non-concurrent operation of the modules above it. Always tradeoffs, right? – shannon Jul 23 '14 at 03:48