Just to clarify: the following points are all targeted at checks on, and changes to, DATA, to ensure it's consistent with a new schema.

As of 1.5, we have a great database schema maintenance tool in place (sql/changes/) which remembers which changes have already been applied to the schema and which still have to be applied. This new tool has gotten rid of the truckload of warnings and errors that the repeated loading of Fixes.sql gave us on 1.3 and 1.4.

Preparing to merge my MC branch to master for 1.6, I'm thinking that the sql/changes/ infrastructure works perfectly to handle schema changes.
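As a rough sketch of how that kind of change tracking can work (this is an illustration in Python, not the actual sql/changes/ implementation; the file names and SQL contents are made up), each change script is identified by a checksum of its content, and only scripts whose checksums haven't been recorded yet get applied:

```python
import hashlib

def change_id(sql_text):
    """Identify a change script by the SHA-256 of its content, so an
    edited script counts as a different (new) change."""
    return hashlib.sha256(sql_text.encode("utf-8")).hexdigest()

def pending_changes(changes, applied_ids):
    """Return the change scripts whose ids are not yet recorded as applied."""
    return [c for c in changes if change_id(c["sql"]) not in applied_ids]

# Hypothetical example: one change already applied, one still pending.
changes = [
    {"name": "0001-add-due-date.sql",
     "sql": "ALTER TABLE invoice ADD COLUMN due_date date;"},
    {"name": "0002-require-due-date.sql",
     "sql": "ALTER TABLE invoice ALTER COLUMN due_date SET NOT NULL;"},
]
applied = {change_id(changes[0]["sql"])}
print([c["name"] for c in pending_changes(changes, applied)])
# -> ['0002-require-due-date.sql']
```

Re-running the loader is then harmless: already-applied scripts are skipped rather than re-executed, which is what eliminates the repeated-Fixes.sql noise.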
However, if such a schema change were accompanied by the need to run a data migration, the sql/changes/ infrastructure suddenly won't work so well anymore, because I'm expecting that we'll want to improve on data migrations as we gain knowledge of the various data sets out there.
Regardless of the technical limitations of the sql/changes/ infrastructure, I'm also envisaging that we'll want to do "pre-migration checks" and allow users to correct data before the data migration (to a changed schema) is executed, if they have data that's non-compliant. While the pre-migration checks are definitely an easy extension to the sql/changes/ infrastructure, it's hard to provide feedback from the checks as it is. The other requirement (fixing data) can't be resolved at all.
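A pre-migration check of this kind could be as simple as flagging the rows that would violate a constraint the new schema wants to add. A minimal sketch in Python (the column name and row shape are hypothetical, purely for illustration):

```python
def precheck_missing_dates(rows, column="transdate"):
    """Return the ids of rows whose date column is unset; these would
    violate a planned NOT NULL constraint and must be corrected first."""
    return [r["id"] for r in rows if r.get(column) is None]

# Hypothetical data set with one non-compliant row.
rows = [
    {"id": 1, "transdate": "2016-01-15"},
    {"id": 2, "transdate": None},        # non-compliant: date missing
    {"id": 3, "transdate": "2016-03-02"},
]
print(precheck_missing_dates(rows))  # -> [2]
```

The point is that the check's output has to reach the user as actionable feedback (which rows, which problem), which is exactly what the current infrastructure makes hard.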
My current thinking is that schema improvement becomes a continuous, incremental process instead of the one-off huge jumps we had on 1.3 and 1.4, *if* we find a way to address the requirements below. As an example of the problem: there could be a date that should always be set, but which historically sometimes hasn't been. A new schema may now have a constraint requiring it to be set, so the dataset needs to have the missing dates added first. The requirements:
1. Run pre-checks on schema changes
1a. Generate a "pre-check" report
2. Allow users to adjust / add / delete data in order for the pre-checks to be satisfied
2a. Generate a "post-corrections" report
3. Run post-schema-change data migrations
4. Allow the data migration for the same change to be tweaked (i.e. different data migrations == different sha-sums)
5. Allow additional data migrations to be run *iff* (if-and-only-if) a specific data migration has been run in the past (my idea here is to allow the data to be "upgraded" according to progressive insight)
6. Ask the user to save the two reports and contact the team on #ledgersmb to discuss what needed manual adjustment.
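Tying steps 1 through 3 (and the two reports from 1a and 2a) together, the flow might look roughly like this. A minimal sketch in Python, with all names and the stand-in data invented for illustration; in reality the checks would be SQL and the corrections would come from the user:

```python
def missing_dates(rows):
    """Hypothetical pre-check: rows whose transdate is unset."""
    return [r["id"] for r in rows if r["transdate"] is None]

def run_change(rows, checks, fix_data, migrate):
    """Sketch of the proposed flow: pre-check, report, let the user
    correct data, re-check, then migrate; return the two reports."""
    precheck_report = {name: check(rows) for name, check in checks.items()}
    if any(precheck_report.values()):
        rows = fix_data(rows, precheck_report)          # step 2: corrections
    post_report = {name: check(rows) for name, check in checks.items()}
    if any(post_report.values()):
        raise RuntimeError("pre-checks still failing after corrections")
    migrate(rows)                                       # step 3
    return precheck_report, post_report                 # steps 1a and 2a

checks = {"missing transdate": missing_dates}
rows = [{"id": 1, "transdate": None}, {"id": 2, "transdate": "2016-05-01"}]
# Stand-in for the user supplying the missing dates interactively.
fix = lambda rs, report: [dict(r, transdate=r["transdate"] or "1970-01-01")
                          for r in rs]
pre, post = run_change(rows, checks, fix, migrate=lambda rs: None)
print(pre)   # -> {'missing transdate': [1]}
print(post)  # -> {'missing transdate': []}
```

The two returned reports are exactly what step 6 asks the user to save and bring to #ledgersmb.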
The logs can be provided *IF* required.
It may be possible to come up with a way to anonymise the logs and provide an optional step to do so.
Other than my above insertions, I think it's a sound plan that not only improves our ease of data migration, but should also drastically simplify imports from other software such as SQL-Ledger (from which we have had quite a few migrations in the last 12 months) and hopefully others such as QuickBooks and MYOB.

I would really love to hear your thoughts on the topic!
_______________________________________________ Ledger-smb-devel mailing list ..hidden.. https://lists.sourceforge.net/lists/listinfo/ledger-smb-devel