
I'm getting the following error on the second refresh of a page:

DetachedInstanceError: Instance <MetadataRef at 0x107b2a0d0> is not bound to a Session; attribute refresh operation cannot proceed

 - Expression: "result.meta_refs(visible_search_only=True)"
 - Filename:   ... ects/WebApps/PYPanel/pypanel/templates/generic/search.pt
 - Location:   (line 45: col 38)
 - Source:     ... meta_ref result.meta_refs(visible_search_only=True)" tal:omi ...
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 - Arguments:  repeat: {...} (0)
               renderer_name: templates/home.pt
               models: <list - at 0x1069e4d88>
               results: <list - at 0x107c30d40>
               renderer_info: <RendererHelper - at 0x1069b5650>
               active_models: <list - at 0x107b69050>
               query: 
               pagination: <NoneType - at 0x104dd5538>
               req: <Request - at 0x107b4e690>
               title: <NoneType - at 0x104dd5538>
               generic: <NoneType - at 0x104dd5538>
               request: <Request - at 0x107b4e690>
               context: <RootFactory - at 0x107b12090>
               page: 1
               view: <Page - at 0x107b128d0>

The issue seems to be cached data being shared between requests. The thing is, it's only supposed to be cached locally, i.e. everything should be re-queried for the next request.

The relevant section of the template is:

        <div tal:repeat="meta_ref result.meta_refs(visible_search_only=True)" tal:omit-tag="True">
            <div tal:define="meta result.meta(meta_ref.key, None)" tal:condition="meta is not None">
                <div>${meta_ref.name} = ${meta}</div>
            </div>
        </div>

My DBSession is only declared once, in models.py (if that makes a difference):

DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))

If I stop caching, the error goes away, so I just need to keep the cache from persisting between requests, but I don't know how to do that.

This is my meta_refs function:

def meta_refs(self, visible_only=False, visible_search_only=False):
    model = self.__class__.__name__
    # Base._meta_refs is a class-level attribute, so it is shared by every
    # model and, it turns out, by every request served by the same process.
    if Base._meta_refs is None:
        Base._meta_refs = {}
        try:
            for result in DBSession.query(MetadataRef):
                if result.model not in Base._meta_refs:
                    Base._meta_refs[result.model] = []
                Base._meta_refs[result.model].append(result)
        except DBAPIError:
            pass
    if model not in Base._meta_refs:
        return []
    results = []
    for result in Base._meta_refs[model]:
        #@TODO: Remove temporary workaround
        if inspect(result).detached:
            Base._meta_refs = None
            return self.meta_refs(visible_only, visible_search_only)
        #END of workaround
        if visible_only and result.visible is False:
            continue
        if visible_search_only and result.visible_search is False:
            continue
        results.append(result)
    return results

It's also worth noting that the meta() function also caches and doesn't have this issue -- I think the key difference is that it caches a dict of strings instead of ORM objects.
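That observation suggests a detached-safe variant: snapshot the rows into plain dicts before caching, the same way meta() apparently caches plain strings. This is my own sketch, not the author's code, and the column names (key, name, visible, visible_search) are guessed from the template and the filters in meta_refs:

```python
from collections import namedtuple

# Hypothetical row shape standing in for MetadataRef; the real column
# names are assumptions based on the template and the filter flags.
Row = namedtuple('Row', 'model key name visible visible_search')

def snapshot_meta_refs(rows):
    """Copy ORM rows into plain dicts keyed by model name, so the cache
    never holds live ORM instances that could become detached when the
    session is removed at the end of the request."""
    cache = {}
    for row in rows:
        cache.setdefault(row.model, []).append({
            'key': row.key,
            'name': row.name,
            'visible': row.visible,
            'visible_search': row.visible_search,
        })
    return cache
```

The filtering loop in meta_refs would then read the dicts instead of ORM attributes, and the inspect() workaround would no longer be needed.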

I'm using pserve to serve it while I'm developing it (also if that makes a difference)

The temporary workaround in my code, using sqlalchemy.inspect, does work, but I really want the data not to persist at all (i.e. Base._meta_refs should be None the first time I access it, 100% of the time).

Anyone have any ideas? If this is being cached between requests, I'm sure there is other stuff that is as well, and that's too much potential for unexpected behavior.

Chelsea Urquhart

1 Answer


Assuming Base is a class, you are using its _meta_refs attribute to store MetadataRef instances, which effectively keeps them alive between requests.

If the SQLAlchemy Session's identity map, which in many cases works like a cache, is not enough, you could store those objects on the request object, where they will only persist for the lifetime of the request.
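A minimal per-request cache might look like the sketch below. FakeRequest and load_meta_refs are stand-ins for Pyramid's real request object and the DBSession.query(MetadataRef) call, so treat the names as assumptions:

```python
class FakeRequest(object):
    """Stand-in for pyramid.request.Request."""

def load_meta_refs(model):
    """Placeholder for DBSession.query(MetadataRef).filter(...)."""
    return ['ref-for-%s' % model]

def get_meta_refs(request, model):
    # Lazily attach a cache dict to the request; it is discarded along
    # with the request, so nothing leaks into the next one.
    cache = getattr(request, '_meta_refs_cache', None)
    if cache is None:
        cache = {}
        request._meta_refs_cache = cache
    if model not in cache:
        cache[model] = load_meta_refs(model)
    return cache[model]
```

In a real Pyramid view you would pass the actual request in; each new request starts with no _meta_refs_cache attribute, so the data is queried at most once per request and never shared across them.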

And I'd simplify the meta_refs method as follows:

@classmethod
def meta_refs(cls, visible_only=False, visible_search_only=False):
    q = DBSession.query(MetadataRef).filter(MetadataRef.model==cls.__name__)
    if visible_only:
        q = q.filter(MetadataRef.visible==True)
    if visible_search_only:
        q = q.filter(MetadataRef.visible_search==True)

    # It might be worth returning q rather than q.all()
    return q.all()

Alex K
  • While I'm all for simplification, I need all the data via 1 sql query (& don't want them to persist between requests)... It takes 4.3ms avg to retrieve all the rows or 3.8ms avg to retrieve them for a single model, so inefficient if I'm retrieving more than one model which is necessary for most requests. The rows are tiny btw which makes a difference in this. Anyway how would I go about storing data in the request object? That sounds feasible, but I can't seem to find documentation on doing something like that. – Chelsea Urquhart Jul 19 '14 at 02:52
  • I did not know you need MetadataRef for more than one model during serving single request but my point was that your code is way too complicated for what it seems to be doing. SQLAlchemy's Query is great tool and there is no good reason to avoid using it in favour of loops and "manual" filtering. – Alex K Jul 19 '14 at 14:07
  • Anyway, the answer to your question is in very first sentence of my original post. Please feel free to disregard the rest of it. – Alex K Jul 19 '14 at 14:14