I am interested in exploring the use of XML docs to store audit trail records
from a relational database application. At a conceptual level, an OS file
would be opened each time the application 'started'. A record of the file
name and initialization timestamp would be created in the DB.

Each DB table would have a 'write trigger'. Whenever the 'write trigger'
fires, a new XML doc would be created. Its content would represent the 'after
image' of the DB record. This process would continue until the application
is 'closed', at which time a SHA-1 (or comparable) 'hash' would be computed
on the XML file. The hash would be stored as part of the audit trail record
in the DB. Finally, the XML file would be 'closed' and the application terminated.

The 'integrity' of a particular XML file could then be validated at any point
by recomputing its 'hash' and comparing it to the 'hash' value stored in
the DB record for the file.

Theoretically, the size of each of these 'daily XML audit logs' could be
extremely large. The collection of daily logs over a period of weeks and
years would represent a massive number of individual XML documents.

Another capability that must be supported is the ability to occasionally
'query' the set of XML docs. Response time here is not a particular concern
- minutes as opposed to seconds would be acceptable.
Based on the performance characteristics of Web search engines such as Google,
this would not appear to be an unreasonable expectation.

I would be interested in any thoughts/opinions/pros and cons regarding
this approach. While I am aware that XML DBs and other repositories exist,
my preference would be to utilize simple OS files. For example, simple
OS files fit naturally with the concept of using 'hashes' to confirm
their integrity. Also, it is important that the data be 'immune' from technological
changes over 10 - 20 years or more.