Systems Engineering and RDBMS

Doing Deletes in an OLTP application

Posted by decipherinfosys on March 27, 2007

Last week, while working at a client site, a question arose about handling deletes in an OLTP application.  This client application had over 700 tables and was close to a third normal form implementation, and it also had a large concurrent-user load requirement.  The questions that arose were: should the application itself perform the deletes, or should that task be handled by a scheduled job?  Should cascading DRI be implemented in the system?  And finally, how should data be archived while still allowing reporting over the transactional data plus the archived data?  That last one is a very vast topic in itself, and we will blog about the approach we took for it at a later stage.

Regarding the deletes: since some of the tables had circular references, implementing cascading DRI for those was not possible.  Cascading DRI was implemented for some of the tables, though.  For the others, we opted to update a stat_code column with a value indicating that the particular record is “Marked for Deletion”.  We chose this technique for a couple of obvious reasons.  During peak load, when there are a lot of concurrent users on the system, firing the deletes would have meant validating the foreign keys in the child tables, and that could have resulted in a performance bottleneck.  In addition, internal fragmentation issues could arise as well, since in this application stale data needed to be deleted very often.  Using partitioning schemes we could mitigate that issue, but that cannot be done in the short term due to the application changes involved.
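A minimal sketch of the “mark for deletion” idea, using SQLite via Python for illustration — the table name, the status values, and the single-table layout are all assumptions; only the stat_code column comes from the approach described above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id  INTEGER PRIMARY KEY,
        stat_code TEXT NOT NULL DEFAULT 'ACTIVE'  -- 'MFD' = marked for deletion (assumed codes)
    )
""")
conn.executemany("INSERT INTO orders (order_id) VALUES (?)", [(1,), (2,), (3,)])

# Instead of issuing a DELETE (which would force foreign-key validation
# against every child table), the OLTP code flips the status column --
# a cheap single-row update that holds locks only briefly.
def mark_for_deletion(conn, order_id):
    conn.execute("UPDATE orders SET stat_code = 'MFD' WHERE order_id = ?", (order_id,))

mark_for_deletion(conn, 2)

# Regular application queries simply filter out the marked rows.
live = conn.execute("SELECT order_id FROM orders WHERE stat_code <> 'MFD'").fetchall()
print([r[0] for r in live])  # [1, 3]
```

One design note: every application query now has to carry the `stat_code <> 'MFD'` predicate (or read through a view that applies it), which is the usual trade-off of the soft-delete pattern.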

So, the code now marks the record for deletion using the stat_code column in the tables, and the nightly scheduled task does the purges and the archiving logic.  If the internal fragmentation is above a certain threshold, it takes care of that as well.  This client had a window of 4 hours every night in which this work could be done, so the implementation was pretty smooth.  Only the actual transactional data plus four weeks’ worth of old data was persisted in the OLTP database – the rest was purged and archived.  And currently, we are building a data mart for this client to facilitate BI reporting.  A pretty interesting and challenging project.
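The nightly task can be sketched as an archive-then-delete step inside one transaction.  Again the schema, the status codes, and the exact retention rule (purging only rows that were marked and are past the four-week window) are my assumptions about the setup described above, not the client's actual code:

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        stat_code  TEXT NOT NULL DEFAULT 'ACTIVE',
        updated_at TEXT NOT NULL                    -- ISO-8601 sorts chronologically
    );
    CREATE TABLE orders_archive (
        order_id   INTEGER PRIMARY KEY,
        stat_code  TEXT NOT NULL,
        updated_at TEXT NOT NULL
    );
""")

now = datetime(2007, 3, 27)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 'MFD',    (now - timedelta(weeks=6)).isoformat()),  # marked and stale -> purge
    (2, 'MFD',    (now - timedelta(days=3)).isoformat()),   # marked but recent -> keep
    (3, 'ACTIVE', (now - timedelta(weeks=8)).isoformat()),  # old but still live -> keep
])

def nightly_purge(conn, as_of, retention=timedelta(weeks=4)):
    cutoff = (as_of - retention).isoformat()
    with conn:  # archive and delete atomically, inside the maintenance window
        conn.execute("""
            INSERT INTO orders_archive
            SELECT * FROM orders
            WHERE stat_code = 'MFD' AND updated_at < ?
        """, (cutoff,))
        conn.execute(
            "DELETE FROM orders WHERE stat_code = 'MFD' AND updated_at < ?",
            (cutoff,))

nightly_purge(conn, now)
print(sorted(r[0] for r in conn.execute("SELECT order_id FROM orders")))          # [2, 3]
print(sorted(r[0] for r in conn.execute("SELECT order_id FROM orders_archive")))  # [1]
```

In a production job the deletes would typically be batched and ordered child-tables-first, and the fragmentation check (e.g. an index reorganize/rebuild decision) would run after the purge.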

