3

I have a table named Warehouse in my database; it has Warehouse_idWarehouse and Warehouse_name as its primary key. What I want to do is efficiently store a maximum of N recent changes that have been made to each warehouse stored in the table. I have considered creating a "helper" table (e.g. warehouse_changes) and handling the updates through my application, but honestly it feels like there is a smarter way around this.

Is there a way to store a specific number of entries per warehouse and automatically update the right element through MySQL Workbench? Thanks in advance, and keep in mind that I'm not particularly advanced in this field.

Mark
  • What does this mean "Is there a way to store a specific amount of entries per warehouse and automatically manage updating the right element through mysql workbench?"? For instance, can you describe how it would ideally work i.e., walk through examples of how this works over time? (Also, I don't understand what 'element' means in database terminology.) – AgRizzo Nov 04 '13 at 12:48
  • possible duplicate of [Is this the best approach to creating an audit trail?](http://stackoverflow.com/questions/711597/is-this-the-best-approach-to-creating-an-audit-trail) – ModulusJoe Nov 04 '13 at 12:59
  • Thanks for the quick answer! So, consider that you have the warehouse table, into which entries accumulate over time (new entries are inserted through the application). Ideally, for each entry inside the warehouse table (which is what I mean by element), I would like a way to store, say, 15 records of changes that have been made to this warehouse and its capacity to store products. I would like a maximum of 15 records with data like "added of to this warehouse" or "moved of to ". – Mark Nov 04 '13 at 13:01

1 Answer

1

There is a very detailed article on O'Reilly Answers that describes how to do exactly what you want using triggers.

In short: you create a helper table, plus one trigger for each operation type that you want to log. For example, here is what a trigger for updates looks like according to that article:

-- MySQL needs a custom statement delimiter here, so that the
-- semicolons inside the trigger body don't end the CREATE TRIGGER
-- statement prematurely
DELIMITER $$

-- Create a trigger that runs after each UPDATE,
-- once for each affected row
CREATE TRIGGER au_warehouse AFTER UPDATE ON Warehouse
FOR EACH ROW
BEGIN
  -- Insert the new values into a log table; you can log the
  -- old values instead by using the OLD row reference
  INSERT INTO warehouse_log (action, id, ts, name)
      VALUES ('update', NEW.id, NOW(), NEW.name);
END$$

DELIMITER ;
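Since the question asks for at most N entries per warehouse, the same trigger body could also prune older rows right after the INSERT. A rough sketch, assuming warehouse_log has the columns used above (the names and the limit of 20 are illustrative, adapt them to your schema); the derived table works around MySQL's restriction on deleting from a table that the same statement selects from:

  -- Keep only the 20 newest log rows for this warehouse:
  -- find the 20th-newest timestamp and delete anything older.
  -- If there are fewer than 20 rows, the subquery returns NULL
  -- and nothing is deleted.
  DELETE FROM warehouse_log
   WHERE id = NEW.id
     AND ts < (SELECT cutoff
                 FROM (SELECT ts AS cutoff
                         FROM warehouse_log
                        WHERE id = NEW.id
                        ORDER BY ts DESC
                        LIMIT 1 OFFSET 19) AS keep_newest);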

After that you can get the latest 1000 changes using a simple SQL query:

SELECT * FROM warehouse_log ORDER BY ts DESC LIMIT 1000;
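And if you only want the recent history of a single warehouse rather than the whole log, you can filter by the warehouse's id (the value 42 here is just an example):

SELECT * FROM warehouse_log WHERE id = 42 ORDER BY ts DESC LIMIT 20;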
Igor Zinov'yev
  • Thanks for the answer. One question, though: considering that I want to store a relatively small number of "logs" for each warehouse, 20 at most, wouldn't this solution use more space than necessary? I'm trying to go for the most efficient solution. – Mark Nov 04 '13 at 13:05
  • As far as I can see, this is the most efficient way to track changes. You could also do it from the application and check whether the log has grown longer than necessary, but that would be less efficient. If you're worried about the space the log takes up, you can set up an automatic truncation script that runs a query once every 24 hours to clean up unneeded entries. But that's a different question :) – Igor Zinov'yev Nov 04 '13 at 13:11
  • Certainly. From a point onwards this discussion is somewhat academic, seeing as I'm making more of a "simulation" than a real db-driven application. But if in the future I want to use this for something more concrete, I'll obviously have to take these issues into consideration. I think I'll go with the trigger solution; it seems the most feasible and relatively simple. Thanks for your help! – Mark Nov 04 '13 at 13:17